Thursday, July 14, 2016

Adding a user to a SQL Azure Database

This week I had to create a few new users for a SQL Azure database. I have to do this often, but I can never remember the exact steps that need to be run.

If you've connected to a SQL Azure instance, you'll quickly see that some of the useful UI Options don't exist or work differently. For example, if you want to add a new user, you get a SQL script that just creates the user, but doesn't give them any access to the database:


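The generated script is just a plain CREATE USER statement, something along these lines (a sketch; the exact script you get may vary):

CREATE USER [UserName]
FOR LOGIN [UserName]
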
In order to actually give the user access to a database, you need to switch your connection to the database you want the user to access and run the following commands:

CREATE USER [UserName]
FOR LOGIN [UserName]
WITH DEFAULT_SCHEMA = [dbo]

-- Add user to the database owner role
EXEC sp_addrolemember N'db_owner', N'[UserName]'


You can switch out the default schema and role to whatever fits your needs.
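
For example, if the user only needs read and write access rather than full ownership, the standard fixed database roles work (same [UserName] placeholder as above):

-- Read/write access instead of db_owner
EXEC sp_addrolemember N'db_datareader', N'[UserName]'
EXEC sp_addrolemember N'db_datawriter', N'[UserName]'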

After all of that, the user should work, but if you try to connect from SQL Management Studio, you'll get an error that the user does not have access to the master database. If this wasn't SQL Azure, we could set the user's default database and we'd be set, but SQL Azure doesn't support that (yet), so we have two options:

Option 1: Give the user access to the master database:
This can be done by running the "Create User" script from above, but against the master database.
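
In other words, while connected to the master database, run something like this (no role membership is needed there, just the user):

-- Run against the master database
CREATE USER [UserName]
FOR LOGIN [UserName]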

Option 2: Pick the specific database from SQL Management Studio.
For this option, when you open a new connection to the SQL server, click the "Options" button in the lower right of the Connect dialog, then type the name of the database under "Connect to database" and click Connect.


Tuesday, July 12, 2016

Restart-AzureWebsite: Your Azure credentials have not been set up

I recently ran into an issue when trying to restart an Azure App Service from PowerShell. The command to do it is Restart-AzureWebsite, but when I attempted to run it, I got the following error:

“Restart-AzureWebsite : Your Azure credentials have not been set up or have expired, please run Add-AzureAccount to set up your Azure credentials”

I ran the Add-AzureAccount command but I continued to get the same error.

After some digging, I realized that I had more than one Azure subscription and the command was using the wrong one when attempting to restart the website.

I ran the following command to retrieve the subscriptions that are cached on my PC:

Get-AzureSubscription

That returns all of the subscriptions for all of the accounts that you've added. I found the subscription that was correct for the website I was attempting to restart and then ran this command:

Select-AzureSubscription -SubscriptionName "MySubscription"

That set the subscription for the current session to the one that contained the website I needed to restart. From there, everything went smoothly.


Friday, January 2, 2015

Quick and easy table Audit with code first EF6 and Migrations

For one of my current projects, we had a requirement to track changes made to two tables in the database for auditing purposes. Initially I thought we could just use the built-in SQL change tracking features (here); however, we were deploying this application to Azure, and as of this writing that feature is not available in SQL Azure.
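
For reference, this is roughly what turning on the built-in change tracking looks like on a full SQL Server (database and table names are illustrative); it just wasn't an option here:

ALTER DATABASE MyDatabase
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON)

ALTER TABLE dbo.Claims
ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON)
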
I did some investigating into this and there were a few options:

Option 1: Create my own triggers on these tables and write data to the audit tables when the rows were updated.

This was somewhat straightforward, but I'm not a fan of triggers, and I wanted to keep all of my code in C#/Entity Framework for ease of code management down the road. Also, I didn't want to have to update the triggers whenever the table was altered.
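
For reference, a trigger-based version would have looked something like this minimal sketch (table and column names are illustrative, loosely matching the Claim example later in this post):

-- Hypothetical audit trigger (the approach I decided against)
CREATE TRIGGER trg_Claims_Audit ON dbo.Claims
AFTER INSERT, UPDATE
AS
BEGIN
    INSERT INTO dbo.ClaimAudits (ClaimId, Title, TimeStamp)
    SELECT ClaimId, Title, GETUTCDATE()
    FROM inserted -- "inserted" holds the new row values
END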

Option 2: Implement this workaround leveraging a local data cache

While this seems like an easier approach, I wouldn't be fully in control of what was happening during updates. Also, this approach only tells you which rows have changed, not what the before values were. I needed to be able to tell what any change to the row was and see its historical values.

Option 3: Build my own using EF Inheritance and Migrations

Building your own is always the most fun option anyway, isn't it?
I decided that I would use Migrations and the built in inheritance features of EF and override the SaveChanges function to write to my audit tables when needed.
Note: For my project I was using EF Code first with migrations, so if you are not using code first or migrations, this may take some massaging.
There are 4 steps to this:
Step 1: Setting up the tables
Before I set up the auditing, I had a single object/table called “Claim” and it was defined as shown:
    public class Claim
    {
        public long ClaimId { get; set; }

        [Required]
        public string Title { get; set; }

        public DateTime? ClaimFiledDate { get; set; }
        public DateTime CreateDateTime { get; set; }
        public DateTime? UpdatedDateTime { get; set; }
    }
The class was pretty straightforward, leveraging Data Annotations to enforce rules in the MVC view and in the database when creating migrations. For my auditing table, I wanted all of the same columns, including the primary key, along with a surrogate primary key for query optimization. I could have just created another table definition with the same columns, but then every time I wanted to make a change to the Claim object, I would have to remember to make the same change to the audit table. By using inheritance and leveraging the "magic" of migrations, we can get there with minimal code.
What I ended up doing was creating an abstract base class with most of the data columns in it and then creating two derived classes, Claim and ClaimAudit. Making it abstract means that EF won’t try to create a table for it when you create migrations.
Here is what they ended up looking like:
    public abstract class ClaimBase
    {
        // DatabaseGeneratedOption.None on the shared key avoids issues with multiple primary keys (see notes below)
        [DatabaseGenerated(DatabaseGeneratedOption.None)]
        public virtual long ClaimId { get; set; }
        [Required]
        public string Title { get; set; }
    }

    public class Claim : ClaimBase
    {
        // Overridden to Identity because the Claims table's key is an identity column
        [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
        public override long ClaimId { get; set; }
        public DateTime CreateDateTime { get; set; }
        public DateTime? UpdatedDateTime { get; set; }
        public bool? IsDeleted { get; set; }
    }

Claim Audit:
public class ClaimAudit : ClaimBase
{
    public long ClaimAuditId { get; set; }
    [Required, DatabaseGenerated(DatabaseGeneratedOption.Computed)]
    public DateTime TimeStamp { get; set; }
}
A few things to mention:
  • You will have to mark up the ClaimId primary key in the base class as DatabaseGeneratedOption.None so that you don’t get issues with multiple primary keys
  • You override that markup in the derived class to set it to Identity for the Claims table, since that key is an identity column, but you do not override it in the audit table since you will be storing multiple versions of that key there
  • You create a new surrogate primary key on the audit table for query purposes
Step 2: Update “OnModelCreating”
Even though we've set up the inheritance in the code, we need to call out a few explicit things in the OnModelCreating function to enforce our primary keys and inheritance. The MapInheritedProperties and ToTable calls are what allow us to define the columns in the base class and have them created in the derived class's table. As for the HasKey calls, I don't remember exactly why I had to use them, but essentially they reinforce the primary keys that are defined in the markup.
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<ClaimAudit>().HasKey(k => k.ClaimAuditId).Map(m =>
    {
        m.MapInheritedProperties();   // pull the ClaimBase columns into the audit table
        m.ToTable("ClaimAudits");     // table name is illustrative
    });

    modelBuilder.Entity<Claim>().HasKey(k => k.ClaimId);
}

Step 3: Run Up Migration

Now that we've updated the classes, when we create the migration script that creates the audit table, it automatically adds the columns from the ClaimBase class:

CreateTable(
    "dbo.ClaimAudits",   // table name is illustrative
    c => new
        {
            ClaimAuditId = c.Long(nullable: false, identity: true),
            TimeStamp = c.DateTime(nullable: false, defaultValueSql: "GETUTCDATE()"),
            ClaimId = c.Long(nullable: false),
            Title = c.String(nullable: false),
        })
    .PrimaryKey(t => t.ClaimAuditId);

Step 4: Override SaveChanges

Ok, the tables are set up and created. The last step is to actually write the data to the audit table when things change. To do this, I used AutoMapper to map the Claim and ClaimAudit objects and added a row to the ClaimAudit table every time a Claim object changed. Note that for new Claim objects, I had to save them in a list and only add them to the ClaimAudit table once they were successfully added to the DB, so that the ClaimId value was properly populated.

public override int SaveChanges()
{
    Mapper.CreateMap<Claim, ClaimAudit>();
    var newClaims = new List<Claim>();

    // For each Claim, create an audit record if it's an update; if it's new, wait until
    // it has been saved to the DB so we can get the generated ClaimId
    foreach (var dbEntityEntry in ChangeTracker.Entries<Claim>().Where(e => e.State != EntityState.Unchanged))
    {
        if (dbEntityEntry.State == EntityState.Added)
            newClaims.Add(dbEntityEntry.Entity);
        else if (dbEntityEntry.State == EntityState.Modified)
        {
            var audit = Mapper.Map<ClaimAudit>(dbEntityEntry.Entity);
            ClaimAudits.Add(audit); // ClaimAudits is the DbSet<ClaimAudit> on this context
        }
    }

    var result = base.SaveChanges();

    // Audit the newly added Claims now that their ClaimId values have been generated
    foreach (var newclaim in newClaims)
    {
        var audit = Mapper.Map<ClaimAudit>(newclaim);
        ClaimAudits.Add(audit);
    }
    if (newClaims.Count > 0)
        base.SaveChanges();

    return result;
}

That's it! Simple and relatively easy database auditing in Azure with EF 6 and Migrations. Enjoy!

Friday, April 18, 2014

Sitefinity Blank Login Screen


On one of my current projects, we're leveraging the Sitefinity CMS for our front end. Initially the setup went really smoothly and I was off to the races. There is a ton to learn, and a lot of the functionality is fairly intuitive. I was super excited about it until I tried to deploy some code updates to our Development/Integration server.

We had already created a site on the development server directly and I didn't need any of the content that I had created locally, so I figured it would be as easy as moving my code out to the DEV server. When I did this and went to fire up the site, I got a blank login screen.


We went back and reviewed the Windows roles/features that need to be turned on and everything seemed to look fine. I recopied the code over and over with no success.

Finally, we did a folder compare between the initially deployed site on the DEV server and the code that I had moved. The only noticeable difference was that the GUIDs in some of the configuration files were different. That's when I had a "Eureka" moment: the database that I had stored locally had to be moved as well, because the GUIDs in the config files would have to match some data in the database.

I did a backup/restore from my local PC to the dev server and it worked!

Lesson learned: the website and associated databases can't be swapped out ad hoc. There needs to be planning and strategy involved.

Hopefully this saves someone some time.


Monday, February 10, 2014

Build/Deploy Windows service with TFS

Building and deploying a web service or website via TFS is pretty straightforward, but automatically deploying a Windows service or console application takes a bit of setup. I've outlined the steps below to get your service deployed.

Step 1: Set up Automated Build and MSDeploy on the Target server.

If you are doing any sort of automated build/deploy I recommend using MSDeploy. You can follow steps 1-3 from a previous post here.

Step 2: Edit the .csproj file

The project file for the app you are trying to deploy needs to be edited so that TFS will create a directory for the output of the project. Otherwise the project will be built, but the assemblies will be placed in the code drop location with no organization to it.

To edit the file, go to the open file dialog and locate the .csproj file. Select it but don't click "Open". Instead, click the dropdown arrow next to the Open button, select "Open With", and choose the XML (Text) editor.


Once the file is open, add a "PropertyGroup" like the following above the first ItemGroup section. The folder name inside OutDir is whatever you want created under the build's drop location (the name here is just an example):

  <PropertyGroup Condition="$(TeamBuildOutDir) != '' ">
    <OutDir>$(TeamBuildOutDir)MyWindowsService\</OutDir>
  </PropertyGroup>

Step 3: Run the automated Build

At this point you want to run the automated build to verify that everything builds properly and that the output directory is created properly.

Step 4: Create a batch script to deploy the code

This approach is identical to steps 5-7 outlined in the previous post here (same as above). However, the script will be different. We will still take the drop location as the first parameter, but instead of using the setparameters.xml file, we'll set the parameters directly in the MSDeploy call. We'll also leverage the preSync and postSync abilities to uninstall the service before deploying and reinstall it after deploying. This way we don't run into issues with the assemblies being in use while we are trying to deploy:

rem Strip the surrounding quotes from the drop location passed in as %1
SET _location=###%1%###
SET _location=%_location:"###=%
SET _location=%_location:###"=%
SET _location=%_location:###=%
rem set _logfile=%_location%DeployWindowsServices.txt
set _msdeploy="C:\Program Files\IIS\Microsoft Web Deploy\msdeploy.exe"

rem _destServer, _user and _password must be set (or passed in) before the msdeploy calls below
set _serviceName=ACMEMonitorService
set _exeName=ACME.FASMonitor
set _sourceDir=%_location%\ACME.FASMonitor
set _destDir=ACME_SchedulerJobs\ACME.FASMonitor\

echo Deploying %_serviceName% to %_destDir%

%_msdeploy% -verb:sync -source:dirPath="%_sourceDir%",includeAcls='False' -dest:dirpath="\\%_destServer%\c$\%_destDir%",computerName=%_destServer%,userName=%_user%,password=%_password%,authtype=NTLM -useCheckSum -preSync:runCommand="C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe c:\%_destDir%\%_exeName%.exe /u",waitInterval="15000",waitAttempts=1,dontUseCommandExe="false" -postSync:runCommand="C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe c:\%_destDir%\%_exeName%.exe",waitInterval="15000",waitAttempts=1,dontUseCommandExe="false"
%_msdeploy% -verb:sync -source:runCommand="net start %_serviceName%",waitInterval="25000",waitAttempts=1 -dest:auto,computerName=%_destServer%,userName=%_user%,password=%_password%,authtype=NTLM

echo Done Deploying %_serviceName% to %_destDir%

Step 5: Run the deployment.

That's it! With the setup above, the Windows service will be built, copied to the code drop location, and then deployed and installed on the destination server.

Other uses

The example above shows how to deploy a Windows service, but if you have a console app, it can be done the same way without the preSync/postSync actions. You can even take it a step further and run schtasks to disable/enable a task if the Windows service/console app is associated with a scheduled task.


Wednesday, December 18, 2013

Checking Server to DB connectivity easily

We've all been there before. We set up our new web or application server with the latest code, update the connection strings with the proper SQL Server instance and password, and fire up the application only to see all sorts of SQL connection/login issues. These issues can be very frustrating to troubleshoot because you are not sure if the login is incorrect, the database name is misspelled, or there are network issues preventing you from reaching the SQL Server. On Windows machines, there is a simple way to validate that the SQL Server is accessible from the server.

  • Remote desktop into the server having connection issues
  • Click on "Start" and search for Folder Options. Click on Folder Options, go to the View tab, and validate that "Hide extensions for known file types" is unchecked

  • Go to the desktop, right click and select New –> Text Document


  • Rename the file connection.udl and accept the rename warning
  • Double click on the udl file and it will open the Data Link window.
  • In the Connection tab, enter the information for the SQL Server and click the "Test Connection" button


  • Once you've successfully connected to the server, close the window, right-click the connection.udl file, and choose "Open with" > "Notepad" to view the proper connection string.

It’s a simple and effective way to troubleshoot SQL connection issues between servers and the SQL DB.


Friday, May 24, 2013

MSDEPLOYAGENTSERVICE 401 unauthorized–Resolution

We recently migrated a production environment for a client to new servers. I had previously been using MSDeploy to deploy the websites/services to the servers, so I figured all I had to do was install MSDeploy, update my deploy scripts to point to the new servers, and deploy! I was using MSDeploy 2 on the previous servers, so I figured it would work on the new ones. Unfortunately, it didn't turn out to be that easy.
When I ran the updated scripts I got the following error:
Fatal: Request to remote agent URL 'http://myserver/MSDEPLOYAGENTSERVICE' failed.
Fatal: The remote server returned an error: (401) Unauthorized. Fatal count: 1
I was using an admin account and I could hit that URL above in a browser so I knew it wasn’t an authorization issue.
Here are the things I tried that DIDN’T work:
  1. Uninstall/Reinstall MSDeploy 2
  2. Install MSDeploy 3
  3. Create the fake user group on the server per these instructions:
  4. Check the file permissions on the Target server
  5. Bang my head against the wall and start crying…
After many hours of searching, I finally came across this brilliant post that solved my issue:
Here are the scripts from the IIS forum post for easy access:
dism /online /enable-feature /featurename:IIS-WebServerRole 
dism /online /enable-feature /featurename:IIS-WebServerManagementTools
dism /online /enable-feature /featurename:IIS-ManagementService
Reg Add HKLM\Software\Microsoft\WebManagement\Server /V EnableRemoteManagement /T REG_DWORD /D 1
net start wmsvc
sc config wmsvc start= auto

It turns out that the MSDeploy installers don't tell you that you need to have certain Windows features installed in order for MSDeploy to work properly. The installers just move forward, and then you get the dreaded 401 Unauthorized error. I uninstalled MSDeploy, ran the scripts above from the command prompt, and reinstalled. IT WORKED!

After that, I did run into one more issue regarding file permissions on the redirection.config file, but that was easily fixed via this post.

I hope this saves someone as much time as I wasted over the past 2 days.