Sunday, 15 October 2017

Visual Studio 2017 - New npm package won't install...

One of my "how I got burned today" blogs. Spent some time on it so thought to share.

I started writing a simple Node.js application today using Visual Studio 2017, and tried to add a package using the "Install New npm Package" option.



The "Install New npm Package" dialog opened. I typed in the name of the package and clicked "Install Package".


Absolutely nothing happened. No errors or messages were shown, and my package wasn't installed. It turned out that there was a syntax error in my package.json file, where I had missed a comma. It would have been nice if Visual Studio had caught this error and shown some sort of message. I am using Visual Studio 2017 Update 3.
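A quick way to catch this yourself is to run the file through any JSON parser before reaching for the dialog. A minimal PowerShell sketch — the file content below is a made-up fragment with the comma deliberately missing:

```powershell
# A package.json fragment with a missing comma after "name" - parsing fails.
$badPackageJson = '{ "name": "my-app" "version": "1.0.0" }'

try {
    ConvertFrom-Json -InputObject $badPackageJson | Out-Null
    'package.json is valid'
}
catch {
    # ConvertFrom-Json throws on malformed JSON, pointing at the problem
    # that Visual Studio silently swallowed.
    'package.json is NOT valid JSON'
}
```

Running `npm install` from a command prompt would have surfaced the same parse error.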

Friday, 13 October 2017

PowerShell - The curious case of @ in converted Json strings

PowerShell is great when it comes to working with JSON. Being a scripting language, you can pretty much de-serialize your JSON without declaring types for it, do your work on the de-serialized objects, and then serialize them back for storage or transport.

ConvertFrom-Json and ConvertTo-Json are powerful cmdlets. However, there are a few nuggets that you need to be aware of. I got caught out by one of them, so thought to blog about it.

When working with ConvertTo-Json, be mindful of the -Depth parameter. The parameter specifies how deep the cmdlet should go into your object while converting it to a JSON string. The default value is 2. What this means is that if you have a complex object that goes more than two levels deep and you haven't specified the -Depth parameter, your nested objects will be treated as hashtables and serialized as strings.

As an example, let's assign a JSON string to a variable:

 $programmersJson = '[{ 
      "Name" : "Hamid",
      "Gender" : "Male",
      "Expertise": [  
        {
          "Skill": "PowerShell",
          "Level": "5"
        },
        {
          "Skill": "C#",
          "Level": "8"
        }
      ]},
      {
      "Name" : "Adnan", 
      "Gender" : "Male" ,
      "Expertise": [
        {
          "Skill": "PowerShell",
          "Level": "7"
        },
        {
          "Skill": "C#",
          "Level": "6"
        }
      ]}]'

The JSON string contains an object that is three levels deep. The top-level object is a collection; each item in the collection is an object with a "Name" and "Gender" property, as well as a collection of objects, each of which has a "Skill" and "Level" property.

Now, let's call ConvertFrom-Json:

$programmers = ConvertFrom-Json -InputObject $programmersJson

Write-Output $programmers

The result is an array of objects, as expected

Name  Gender Expertise
----  ------ ---------
Hamid Male   {@{Skill=PowerShell; Level=5}, @{Skill=C#; Level=8}}
Adnan Male   {@{Skill=PowerShell; Level=7}, @{Skill=C#; Level=6}}

Now, let's try to convert it back to JSON. So, when you call

 ConvertTo-Json -InputObject $programmers

You would expect the JSON string to be the same as $programmersJson. Wrong!! The string you get back is


[
  {
    "Name": "Hamid",
    "Gender": "Male",
    "Expertise": [
      "@{Skill=PowerShell; Level=5}",
      "@{Skill=C#; Level=8}"
    ]
  },
  {
    "Name": "Adnan",
    "Gender": "Male",
    "Expertise": [
      "@{Skill=PowerShell; Level=7}",
      "@{Skill=C#; Level=6}"
    ]
  }
]

Notice the @ sign for each of the items in the Expertise collection. It means that the cmdlet has treated each item as a hashtable rather than an object.

Now execute the following

 ConvertTo-Json -InputObject $programmers -Depth 3

The -Depth parameter makes it treat each item of the Expertise collection as an object as well, and the resulting JSON is as you would expect.

[
  {
    "Name": "Hamid",
    "Gender": "Male",
    "Expertise": [
      {
        "Skill": "PowerShell",
        "Level": "5"
      },
      {
        "Skill": "C#",
        "Level": "8"
      }
    ]
  },
  {
    "Name": "Adnan",
    "Gender": "Male",
    "Expertise": [
      {
        "Skill": "PowerShell",
        "Level": "7"
      },
      {
        "Skill": "C#",
        "Level": "6"
      }
    ]
  }
]


This is how we expected the output JSON string to look. So, next time you are working with JSON in PowerShell, make sure to be mindful of the -Depth parameter.
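A defensive habit that follows from all this: pass a generous -Depth whenever you round-trip JSON, so nested objects survive serialization. A small self-contained sketch:

```powershell
# Round-trip a three-level object with an explicit depth.
$json = '{"a":{"b":{"c":1}}}'
$obj  = ConvertFrom-Json -InputObject $json

# With -Depth 10 the nested objects are serialized properly;
# with the default depth of 2, "c" would degrade to a "@{c=1}" string.
$out = ConvertTo-Json -InputObject $obj -Depth 10 -Compress
$out
```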

Monday, 9 October 2017

Migrating ASP.NET MVC website to ASP .NET Core - Migrating Model

I maintain an ASP.NET MVC website that I had been meaning to move to ASP.NET Core, but I found the .NET Core 1.1 library rather limited. With the release of .NET Core 2.0 and ASP.NET Core 2.0, we decided to migrate the website to the new framework. The site has been operational since October 2010 and was built using ASP.NET MVC 2.0. It has gone through various bouts of upgrades and is currently using ASP.NET MVC 5.2.0, which forms the baseline of this conversion. I made several discoveries along the way, so thought to blog about them.

In this post, I am going to write about prep and moving our "Model" to Entity Framework Core 2.0.


Background

The model of our website was built using Entity Framework code first. All database operations were performed using the repository pattern. Our repository interface looks as follows:

    public interface IRepository<TEntity> : IDisposable where TEntity : class
    {
        IQueryable<TEntity> GetQuery();
        IEnumerable<TEntity> GetAll();
        IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate);
        TEntity Single(Expression<Func<TEntity, bool>> predicate);
        TEntity First(Expression<Func<TEntity, bool>> predicate);
        void Add(TEntity entity);
        void Delete(TEntity entity);
        void Attach(TEntity entity);
        void SaveChanges();
        DbContext DataContext { get; }
    }

We use interface inheritance to create a repository for each of our model objects, so for an object "Token", the repository looks like the following:


    public interface ITokenRepository : IRepository<Token>
    {
    }

With the interface inheritance in place, our single generic repository class can hold the logic for database operations, as shown below:

public class Repository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private DbContext _context;

    private IDbSet<TEntity> _dbSet;

    private static string _connectionString = string.Empty;

    public Repository(IDataContextFactory dbContextFactory)
    {
        if (string.IsNullOrWhiteSpace(dbContextFactory.ConnectionString))
        {
            _context = dbContextFactory.Create(ConnectionString);
        }
        else
        {
            _context = dbContextFactory.Create();
        }

        _dbSet = _context.Set<TEntity>();
    }

    public Repository(DbContext context)
    {
        _context = context;
        _dbSet = _context.Set<TEntity>();
    }

    public DbContext DataContext
    {
        get
        {
            return _context;
        }
    }

    public IQueryable<TEntity> GetQuery()
    {
        return _dbSet;
    }

    public IEnumerable<TEntity> GetAll()
    {
        return GetQuery().AsEnumerable();
    }

    public IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate)
    {
        return _dbSet.Where(predicate);
    }

    public TEntity Single(Expression<Func<TEntity, bool>> predicate)
    {
        return _dbSet.SingleOrDefault(predicate);
    }

    public TEntity First(Expression<Func<TEntity, bool>> predicate)
    {
        return _dbSet.FirstOrDefault(predicate);
    }

    public void Delete(TEntity entity)
    {
        if (entity == null)
        {
            throw new ArgumentNullException("entity");
        }

        _dbSet.Remove(entity);
    }

    public void Add(TEntity entity)
    {
        if (entity == null)
        {
            throw new ArgumentNullException("entity");
        }

        _dbSet.Add(entity);
    }

    public void Attach(TEntity entity)
    {
        _dbSet.Attach(entity);
    }

    public void SaveChanges()
    {
        _context.SaveChanges();
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            if (_context != null)
            {
                _context.Dispose();
                _context = null;
            }
        }
    }

    public static string ConnectionString
    {
        get
        {
            if (string.IsNullOrWhiteSpace(_connectionString))
            {
                _connectionString = ConfigurationManager.ConnectionStrings["Rewards"].ConnectionString;
            }

            return _connectionString;
        }
    }
}

The class above does all the heavy lifting for us. We just need to define classes that implement each of our models' repository interfaces. For our model Token, it would be:


public class TokenRepository : Repository<Token>, ITokenRepository
{
    public TokenRepository(IDataContextFactory dbContextFactory)
        : base(dbContextFactory)
    {
    }

    public TokenRepository(DbContext dataContext)
        : base(dataContext)
    {
    }
}
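To show how the pieces fit together, here is a hypothetical consumer of the Token repository. The Expiry property on Token and the dataContextFactory instance are assumptions for the sake of the example, not part of our actual model:

```csharp
// Hypothetical usage of the Token repository defined above.
using (ITokenRepository tokenRepository = new TokenRepository(dataContextFactory))
{
    // Find all expired tokens; Expiry is an assumed DateTime property on Token.
    IEnumerable<Token> expired = tokenRepository.Find(t => t.Expiry < DateTime.UtcNow);

    foreach (Token token in expired)
    {
        tokenRepository.Delete(token);
    }

    // Persist the deletions in one call.
    tokenRepository.SaveChanges();
}
```

Because the consumer only ever sees ITokenRepository, the switch from Entity Framework 6 to EF Core stays contained inside the repository layer.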


Entity Framework Core 2.0 limitations

1. No Many-To-Many Relationship

The biggest issue we encountered while migrating to .NET Core 2.0 is the lack of support for many-to-many relationships. This is an open issue, which hasn't been resolved yet. For us, it meant a lot of re-work.

With the POCO way of working, you start by writing your domain model and then write your business logic against those models, without really thinking about relational database details. We have a lot of code where our LINQ queries were based on domain model relationships. Now, we need to re-work all of those.

This, in my mind, is a major issue; even though there are ways to work around it, it prevents Entity Framework Core from being a true ORM tool.

As an example, consider two entities, Parent and Student, in your model, where a student can have multiple parents and a parent can have multiple students. With Entity Framework 6, the model definition was sufficient to imply the correct type of relationship. If you had to do it explicitly, you could do it at model-creation time, like below:
modelBuilder.Entity<Student>()
    .HasMany(p => p.Parents)
    .WithMany(r => r.Students)
    .Map(m =>
    {
        m.ToTable("ParentStudents");
        m.MapLeftKey("Student_ID");
        m.MapRightKey("Parent_ID");
    });
You can then go on to define a collection of Parents in the Student class and a collection of Students in the Parent class. The .WithMany() method is not there in Entity Framework Core.

The lack of a many-to-many feature in EF Core is hard to justify. POCO came out as a good model for domain-driven development, and not supporting many-to-many in a domain-driven world is hard to justify. We didn't want to "dilute" the model with resolving entities, so we decided to "implement" the many-to-many resolution in our code. This series of posts describes a good way of keeping the domain relationships in our objects, so that there is no change to business logic in other parts of the application.
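For reference, the usual EF Core workaround is to surface the join table as an entity of its own. A minimal sketch using the Parent/Student example above — the class and property names are illustrative, and the composite key is configured in OnModelCreating:

```csharp
// Explicit join entity standing in for the missing many-to-many support.
public class ParentStudent
{
    public int StudentId { get; set; }
    public Student Student { get; set; }

    public int ParentId { get; set; }
    public Parent Parent { get; set; }
}

// Each side now holds a collection of join rows instead of the other side.
public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<ParentStudent> ParentStudents { get; set; }
}

public class Parent
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<ParentStudent> ParentStudents { get; set; }
}

// In the DbContext:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Composite primary key over both foreign keys.
    modelBuilder.Entity<ParentStudent>()
        .HasKey(ps => new { ps.StudentId, ps.ParentId });
}
```

A student's parents are then reached through the join entity, e.g. student.ParentStudents.Select(ps => ps.Parent) — which is exactly the "dilution" of the domain model that the workaround in our code tries to hide.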


2. IDbSet Interface 

The IDbSet interface was deprecated in Entity Framework 6.0 because the team wanted to add new operations without defining a new set of interfaces. This is pretty well documented in the EF 6.0 design decisions. I do not agree with this decision, as it breaks the whole promise of an interface being an immutable contract: the EF team wanted to avoid creating interfaces like IDbSet2, etc. for additional functionality, so they decided to do away with it. However, the interface is still present in the EntityFramework 6.0 library, so our code still worked. Now we had to replace every use of IDbSet with the DbSet class. It also meant our test code had to be re-written, as we had mocked IDbSet to fake results from the database.


3. No Lazy Loading

Entity Framework Core does not support lazy loading as of yet. There is an open issue for it on GitHub. The feature request is in the EF team's backlog, but there is no date for it yet. Lazy loading is the default behaviour of Entity Framework 6 and is there for you as long as the navigation property is defined as virtual. This is another big way in which Entity Framework Core breaks backward compatibility.

The way around it is eager loading, i.e. ensuring that you use the .Include("") and .ThenInclude("") methods in all the places where you were relying on lazy loading. This is not as simple as it sounds: it's easy to miss a spot, and the error only manifests at run time. One way of going about it is to find references to all virtual properties and add .Include("") wherever the object is "hydrated".
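As a sketch of what that looks like — the context and the School/Address navigation properties here are illustrative, not from our model:

```csharp
// In EF6, accessing a virtual navigation property triggered a lazy load.
// In EF Core 2.0, the related data must be requested up front.
var students = context.Students
    .Include(s => s.School)              // first-level navigation
        .ThenInclude(sch => sch.Address) // second-level navigation
    .ToList();

// Without the Include, s.School would simply be null at run time -
// no exception is thrown, which is what makes these omissions easy to miss.
```

The silent-null behaviour is why a sweep over all virtual properties is worth the effort: the compiler gives you no help here.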


4. No GroupBy Translation

Entity Framework Core 2.0 doesn't translate GroupBy() to SQL; the grouping is performed in memory after the whole result set has been fetched. So, if your application is using the GroupBy() method, you might need to look for alternatives. Fortunately, more support for GroupBy is being added in EF Core 2.1.

The only way to resolve this issue without a punitive performance impact is to move the logic into stored procedures. We were using GroupBy mostly in our reports, which were already candidates for stored procedures. So, although there was some work involved, the result was much better performance.
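To illustrate both halves — the Tokens set, the Type property, the TokenCounts set and the stored procedure name are all assumptions for the example:

```csharp
// EF Core 2.0 evaluates this GroupBy on the client: every Token row is
// fetched from the database and the grouping happens in memory.
var tokensPerType = context.Tokens
    .GroupBy(t => t.Type)
    .Select(g => new { Type = g.Key, Count = g.Count() })
    .ToList();

// Workaround: aggregate in the database through a stored procedure,
// mapped here to a hypothetical TokenCount result entity.
var counts = context.TokenCounts
    .FromSql("EXEC dbo.CountTokensByType")
    .ToList();
```

The first query looks harmless but scales with the size of the table, which is why it bites hardest in reporting code.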


Final Words...

My experience of migrating code from Entity Framework 6.0 to Entity Framework Core 2.0 will not have uncovered every issue in the migration process, but this post might help someone who is looking to take the plunge.

In my view, Entity Framework Core 2.0 is still a bit undercooked, but if you are willing to put in the extra effort, it has enough functionality for you to move your model / data libraries to it.

Saturday, 23 September 2017

Extending Team Explorer in Visual Studio 2017

Extensibility has always been a great feature of Visual Studio and enhances the entire development experience. With Visual Studio 2017, a bunch of very substantial changes were made with respect to extensibility. Most of these changes come from the fact that Visual Studio now supports a lighter installation, with a bare-minimum feature set installed by default. There is also the option to have multiple installations on the same machine. So, what does it mean for extensions?

VS2017 extensions now follow the VSIX v3 file format. If you have an extension for earlier Visual Studio versions and you want to port it to VS2017, that means a whole bunch of changes. Here, I am going to write an extension that demonstrates extending Team Explorer. We will create a very simple extension that adds a button to Team Explorer, which will open Notepad.

Project Creation & Dependencies

Let's start by creating a new extensibility VSIX project. You will only see the option if you selected the VS SDK option while installing Visual Studio. Let's call our project TeamExplorerExtSample. Visual Studio 2017 uses .NET Framework 4.6.1, so we select this version.



Once the project is created, you will see a couple of web files and a file called source.extension.vsixmanifest, which contains the extension information. We will come back to this file later.

Now let's add references to the assemblies we need to extend Team Explorer. Note that with Visual Studio 2017, assemblies are not added to the GAC, so we need to make sure that all required assemblies are included in the VSIX. To display a navigation button in Team Explorer, we need to implement the interface ITeamExplorerNavigationItem2, so we add references to the following assemblies:
  •     Microsoft.TeamFoundation.Controls
  •     System
  •     System.ComponentModel.Composition
  •     System.Drawing

VSIX Manifest file:

The manifest file contains information about the extension, its dependencies, assets and prerequisites. Double-click on source.extension.vsixmanifest to see the details. To extend Team Explorer, the key thing to remember is to add the assembly containing the classes that implement the Team Explorer interfaces as a MEF component. This ensures that Visual Studio loads it when loading Team Explorer.

Our VSIX manifest file looks like this


Extending ITeamExplorerNavigationItem2

Our extension will create a button in Team Explorer that opens up the Notepad application. To do this, we need to implement the ITeamExplorerNavigationItem2 interface. The interface is found in the Microsoft.TeamFoundation.Controls assembly that we have already referenced. We also need to add the TeamExplorerNavigationItem attribute. Our very simple class looks as below.


namespace TeamExplorerExtSample
{
    using System;
    using System.ComponentModel;
    using System.Diagnostics;
    using System.Drawing;

    using Microsoft.TeamFoundation.Controls;

    [TeamExplorerNavigationItem("C9B2CF74-0C87-4CEA-ACA9-8CC1C816D7F3", 1800)]
    public class NotepadNavigationItem : ITeamExplorerNavigationItem2
    {
        public bool IsEnabled => true;

        public int ArgbColor => 0;

        public object Icon => null;

        public string Text => "Open Notepad";

        public Image Image => null;

        public bool IsVisible => true;

        public event PropertyChangedEventHandler PropertyChanged;

        public void Dispose()
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
        }

        protected virtual void Dispose(bool disposing)
        {
        }

        public void Execute()
        {
            Process.Start("notepad.exe");
        }

        public void Invalidate()
        {
        }
    }
}

As you can see, the only real logic in the class is a call to Process.Start to launch Notepad. The navigation item appears as below.


Click on the button and a new instance of notepad opens up.

Conclusion:

Admittedly, this is a very simplistic extension, but it contains all the steps you need to extend Team Explorer. You can add classes to add pages, sections and links in Team Explorer, and add icons / images and menu items. The code sample from this post is here.

Friday, 22 September 2017

Shelveset Comparer now supports Visual Studio 2017

The popular Shelveset Comparer extension that I created a few years ago now supports Visual Studio 2017 as well.

It took me some time to create a compatible version due to a load of things happening in my personal life. There is also the added reason that I am now using Git for almost all the projects I am working on, so the need for shelveset comparisons wasn't felt as much as it would have been.

While working on the new version, I had to learn about the very substantial changes in Visual Studio extensibility. I will write a blog about them. Please feel free to download the extension and give me your feedback.


<ciao />

Monday, 19 December 2016

ALM Rangers blog - Sending Email Notification from VSTS/TFS Build

Please read my post on the Microsoft ALM Rangers blog regarding sending notification emails from Build vNext.

Blog Post 

Thursday, 27 October 2016

Running a VSO Build Agent on a Windows Container

In my previous blog post, I wrote about running a VSO build agent in a Docker container. So, it was only logical to try it out in a Windows container. I hit a few pitfalls in my quest to do that and found an issue along the way; however, it all worked in the end.

Note: 

At the time of writing this blog, running the VSTS agent is not supported on Windows Nano Server. I could only manage to run it on the microsoft/windowsservercore image.


Window Server 2016 and Windows Containers

Windows Server 2016 comes with full container support powered by built-in operating system features. There is a great session on the internals of Windows containers on Channel 9. There are two mechanisms for setting up containers on Windows: Hyper-V containers, which are effectively lightweight virtual machines, and Windows containers. I am going to use Windows containers.


1. Windows 2016 Virtual Machine

To host my containers, I got a Windows 2016 virtual machine going. At the time of writing this blog, Windows Server 2016 is still at the Technical Preview stage and new updates are coming frequently. They can be downloaded from the Microsoft website.

The minimum build you will require is Windows 2016 Server build 14393.

Once you have installed Windows and are on the virtual machine, open a command prompt and type winver. You will see a dialog like the one below. Make sure the build number is at least the required version.



2. Install Containers Feature

Type the following command in a PowerShell console window:
Install-WindowsFeature Containers
The feature needs a restart, so type the following:
Restart-Computer -Force
Once the machine has restarted, continue with the following steps.


3. Install Docker

The Docker version deployed from the MSI isn't supported on Windows Server 2016 yet. Downloading the Docker MSI from the Docker website didn't work for me, and I got the following error:


However, I got it working by downloading the following zip file
https://download.docker.com/components/engine/windows-server/cs-1.12/docker.zip
and extracting it to Program Files. I did this by running the following in my PowerShell console:
Invoke-WebRequest "https://download.docker.com/components/engine/windows-server/cs-1.12/docker.zip" -OutFile "$env:TEMP\docker.zip" -UseBasicParsing
Expand-Archive -Path "$env:TEMP\docker.zip" -DestinationPath "$env:ProgramFiles"
You will have two executable files in the extracted directory as shown



Add the directory to your PATH variable.

Now register the dockerd service by typing the following
dockerd.exe --register-service

Alternate Installation Option

After installing Docker, I found that the following is a better and easier way of installing Docker on Windows Server 2016. Type the following in your PowerShell console:

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force 
Install-Package -Name docker -ProviderName DockerMsftProvider
Restart-Computer -Force

Once installed, verify that Docker is running fine on your machine by typing the following:



docker run microsoft/sample-dotnet

You should see a message like "Welcome to .NET Core!" in your console window. This means that your Docker instance is working fine.



4. Pulling the microsoft/windowsservercore image

So, this is where I got stuck a bit. I was trying to use microsoft/nanoserver, which is a fraction of the size of the full Windows image and supports .NET Core. In the end, I found out that running the VSO agent on Nano Server is not supported yet.

So, I pulled the full server core image. You can do it by running



docker pull microsoft/windowsservercore

The image is about 8 GB and takes some time to download. Once pulled, run the image by typing


docker run microsoft/windowsservercore

At this stage, we are inside a Windows Docker container. I checked the Windows version by typing [System.Environment]::OSVersion.Version and got the following:


Major  Minor  Build  Revision

-----  -----  -----  --------

10     0      14393  0


5. Running VSO Agent


Now that we have a running container, the steps to run the VSO agent are as simple as running it on any Windows machine.


The only complication is the lack of a GUI, so I used PowerShell to download the zip file and extract it, as follows:



Invoke-WebRequest https://github.com/Microsoft/vsts-agent/releases/download/v2.108.0/vsts-agent-win7-x64-2.108.0.zip -outfile vsts-agent-win7-x64-2.108.0.zip
Expand-Archive -Path .\vsts-agent-win7-x64-2.108.0.zip -DestinationPath C:\vsts-agent 

You will see the usual VSTS agent files in the destination directory. Simply type .\config.cmd and follow the instructions.
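Since the container has no GUI, it can also be handy to configure the agent unattended with command-line switches. A sketch — the account URL, PAT, pool and agent names below are all placeholders for your own values:

```powershell
# Unattended configuration of the VSTS agent inside the container.
# Replace the URL, token, pool and agent name with your own values.
.\config.cmd --unattended `
    --url https://myaccount.visualstudio.com `
    --auth PAT `
    --token <your-pat> `
    --pool default `
    --agent win-container-01
```

Once configured, .\run.cmd starts the agent listening for builds.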