First Hours with Visual Studio Code on Mac and Windows

Today is one of those awesome days if you build stuff on the .NET platform. Microsoft announced a bunch of things during the Build 2015 keynote, and one of them is Visual Studio Code, a free and stripped-down version of Visual Studio which works on Mac OS X, Linux and Windows. Let me give you my highlights in this short blog post :)
2015-04-29 21:15
Tugberk Ugurlu


Today is one of those awesome days if you are building stuff on the .NET platform. Microsoft announced a bunch of things at the Build 2015 keynote a few hours ago, and one of them is Visual Studio Code, a free and stripped-down version of Visual Studio which works on Mac OS X, Linux and Windows. It leverages a bunch of existing open source software like OmniSharp and Electron. Most of all, this was my #bldwin wish :)

First of all, you should definitely install Visual Studio Code and start checking out the documentation, which is very extensive. I followed those steps and, as I am very excited about this new tool, I wanted to share my experience thus far, which is not much but very promising.

The first thing I noticed was the top-notch support for ASP.NET 5. The documentation for ASP.NET 5 support is pretty good but some features are not highlighted there. For example, you get IntelliSense for dependencies:

Screenshot 2015-04-29 19.26.46

When you add a dependency, you get a nice notification telling you that you should restore:

Screenshot 2015-04-29 19.59.55

Pretty nice! So, how would you restore? Hit ⇧⌘P to bring up the command palette and you can see the restore command there:

Screenshot 2015-04-29 21.45.20

It will run the restore inside the terminal:

Screenshot 2015-04-29 21.48.30

You can also invoke the commands defined inside your project.json:

Screenshot 2015-04-29 20.09.09

Screenshot 2015-04-29 20.10.14
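For reference, these commands come from the "commands" section of project.json. Here is a minimal sketch of what that section looked like in the beta builds at the time; the command name and package versions below are illustrative rather than taken from the project in the screenshots:

{
    "version": "1.0.0-*",

    "dependencies": {
        "Microsoft.AspNet.Hosting": "1.0.0-beta4",
        "Microsoft.AspNet.Server.WebListener": "1.0.0-beta4"
    },

    "commands": {
        "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener --server.urls http://localhost:5000"
    }
}

Each entry under "commands" shows up in the command palette and can also be run from the terminal (dnx . web in the DNX builds of that era, if I remember the syntax correctly).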

Obviously, you can change the theme.

Screenshot 2015-04-29 20.18.35

Writing C# code is also very slick! You currently don't have all the nice refactoring features you have in the full-fledged Visual Studio, but it's still impressive:

Screenshot 2015-04-29 20.19.57

Screenshot 2015-04-29 20.22.43

We even have some advanced stuff like Peek Definition:

Screenshot 2015-04-29 20.02.29

Check out the documentation for all of the code editing features.

As mentioned, Windows is also fully supported, as you might guess :)

Screenshot 2015-04-29 20.35.16

Screenshot 2015-04-29 21.18.29

Screenshot 2015-04-29 21.23.36

I want to touch on the Git integration as well. I generally use Git Bash and this won't change for me, but having the diff view inside the editor, presented in a very nice way, is priceless!

image

How about old/current .NET applications? I managed to get one up and running easily and got the build working by defining a task for it:

{
	"version": "0.1.0",
	
	// The command is msbuild.
	"command": "msbuild",

	// Show the output window only if unrecognized errors occur. 
	"showOutput": "silent",
	
	// Under Windows, use the full path to msbuild.exe. This ensures we don't need a shell.
	"windows": {
		"command": "C:\\Program Files (x86)\\MSBuild\\14.0\\Bin\\msbuild.exe"
	},
	
	// No extra arguments; msbuild picks up the solution/project in the workspace folder.
	"args": []
}

image

I was expecting this to work without any further configuration, but it could just be me not being able to get it working.
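If the part that doesn't work is build errors showing up inside the editor, one thing worth trying is pointing the task at a problem matcher, which parses the compiler output into the problems list. Here is a sketch of the same task with the built-in $msCompile matcher added, assuming that matcher is available in your version of VS Code:

{
	"version": "0.1.0",

	// Run MSBuild; on Windows, use the full path so we don't need a shell.
	"command": "msbuild",
	"showOutput": "silent",
	"windows": {
		"command": "C:\\Program Files (x86)\\MSBuild\\14.0\\Bin\\msbuild.exe"
	},
	"args": [],

	// Parse MSBuild/C# compiler output into the editor's problems list.
	"problemMatcher": "$msCompile"
}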

As I said, it's very early days, but I am sold on this editor! Also, this is a fantastic time to build products on the .NET platform. I would like to thank all the people at Microsoft and in the open source community who are making our lives easier and more enjoyable. I will leave you all now and go enjoy my new toy! :O

Exciting Things About ASP.NET 5 Series: Build Only Dependencies

In this very exciting post, I would like to talk about build-only dependencies, whose code is compiled into the target project without being exposed as a dependency of that project.
2015-04-28 07:48
Tugberk Ugurlu


The web development experience with .NET has never seen a change this drastic since its birth. Yes, I'm talking about ASP.NET 5 :) I have been putting my toes into this water for a while now and, a few days ago, I started a new blog post series about ASP.NET 5 (with hopes that I will continue this time :)). To be more specific, I'm planning on writing about the things I am actually excited about in this new cloud-optimized (TM) runtime. Those things could be anything coming from the ASP.NET GitHub account: things I like about the development process, the Visual Studio tooling experience for ASP.NET 5, the bowels of the .NET Execution Runtime, tiny little things about frameworks like MVC, Identity and Entity Framework.

In this very exciting post, I would like to talk about build-only dependencies, whose code is compiled into the target project.

BIG ASS CAUTION! At the time of this writing, I am using DNX version 1.0.0-beta5-11611. As things are moving really fast in this new world, it's very likely that the things explained here will have changed by the time you read this post. So, be aware of this and try to explore what has changed to figure out the corresponding new behavior.

Also, inside this post I am referencing a lot of things from the ASP.NET GitHub repositories. In order to be sure that the links won't break in the future, I'm referring to them through permanent links to the files on GitHub. So, these links point to the files as of the latest commit at the time of this writing and they have the potential to become outdated, too. Read the "Getting permanent links to files" post to see what this actually is.

The Problem

Since the early days of NuGet, it has been a real pain to have source file dependencies. There are some examples of this, like TaskHelpers.Sources. When you install this package, it ends up inside your codebase.

image

The nice thing about this type of source dependency is that you don't need to fight DLL hell. You can have one version of this package and your consumer can have another version of it. As the source files you pull down from NuGet have no public members, there will be no problems whatsoever, since the code is compiled into each assembly separately. However, there are several problems with the way we pull them in:

  • I am committing this code into my source control system, which is weird.
  • How about updates? What happens if I make a change to that file?

So, it wasn’t that good of an approach we had there but ASP.NET 5 has a top notch solution this problem: build only dependencies.

Consuming Build Only Dependencies

These are dependencies that you can pull in and have compiled directly into your own assembly. As you can guess, they won't be shown as dependencies of your package. Let's see an example!

One of the packages that supports this concept is the Microsoft.Framework.CommandLineUtils package. You can pull it down as a build-only dependency by declaring it inside your project.json file as below:

{
    "version": "1.0.0-*",

    "dependencies": {
        "Microsoft.Framework.CommandLineUtils": { 
            "version": "1.0.0-beta5-11611", "type": "build" 
        }
    },

    // ...
}

Notice the type field there. It indicates the type of the dependency. Let's stop here and, without doing anything further, run dnu pack to get a NuGet package out. When we look at the manifest of the generated NuGet package, we won't see any sign of the build dependency there:

image

Makes sense. Let's peek inside the assembly now.

image

That's what I expected to see. All the code distributed with that package is compiled into my target assembly. As you can guess, I can use it inside my project without any problems:

using Microsoft.Framework.Runtime.Common.CommandLine;

namespace AspNet5CommandLineSample
{
    public class Program
    {
        public void Main(string[] args)
        {
            var app = new CommandLineApplication();
        }
    }
}

You may point out that ASP.NET 5 applications can run without assemblies on disk. That's true, and in that case the shared code ends up being compiled into the target assembly in memory.

If you look at what I committed to my source control system, it's barely anything, which solves one of the biggest pains of source packages.

Generating Build Only Dependencies

Generating libraries which can be consumed as build-only dependencies is also fairly simple, but there are some little things which don't make sense. Assume I have a library called AspNet5Utils and it has the following internal type:

namespace AspNet5Utils
{
    internal static class StringExtensions
    {
        internal static string Suffix(this string value, string suffix)
        {
            return $"{value}-{suffix}";
        }
    }
}

If you want this type to end up as a build dependency, you need to declare it as shared inside the project.json file:

{
    "version": "1.0.0-*",

    "shared": "**/*.cs",

    "dependencies": {
    },

    // ...
}

Doing this gives a hint to the dnu pack command to pack these types into the shared folder inside the NuGet package.

image

Notice that there is also an assembly generated there. Maybe there is a reason why it's there, but as I don't have any type which ends up inside an assembly, I would expect the package not to have one at all. Indeed, if you decompile the assembly, you will see that there is nothing in it:

image

In order to consume this package, you don't actually need to distribute it through NuGet if you only want to consume it inside the same solution. As dependency consumption is unified in ASP.NET 5, this can easily be a project dependency, as you would expect:

{
    "version": "1.0.0-*",

    "dependencies": {
        "AspNet5Utils": { "version": "", "type": "build" }
    },

    // ..
}

In my opinion, this is one of the many powerful and yet simple concepts that ASP.NET 5 has brought to us. Enjoy!

How Azure Web Apps Hosts an ASP.NET 5 Application

An ASP.NET 5 application has a totally different directory structure when you publish it, and it wasn't clear to me how Azure Web Apps is actually able to host one. If you are confused about this as well, the answer is here.
2015-04-12 10:13
Tugberk Ugurlu


I want to write this quick post because figuring out how an ASP.NET 5 application is hosted under Azure Web Apps was a big question for me. Some information on this topic is already out there, but the concept wasn't crystal clear, because when you look at the packed version of an ASP.NET 5 web application, it has the following structure on disk:

image

It gets even more interesting when you look inside the wwwroot folder:

image

We have the static files, a bin folder which only contains AspNet.Loader.dll, and a web.config file. The most interesting bit here is the information inside the web.config file, which will be read by Helios:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <add key="bootstrapper-version" value="1.0.0-beta4-11526" />
    <add key="runtime-path" value="..\approot\packages" />
    <add key="dnx-version" value="" />
    <add key="dnx-clr" value="" />
    <add key="dnx-app-base" value="..\approot\src\ConfyConf.Client.Web" />
  </appSettings>
</configuration>

The web.config file gives us enough evidence that wwwroot is the directory we need to point IIS to; Helios will then read these application settings to figure out where the application actually is, where the dependencies and packages are, and so on. Let's deploy the application using the Visual Studio publish feature. I created a brand new Azure Web App on the fly and hit Publish:

image

When the deployment is completed, the web site is immediately up:

image

Let's look at what the directory structure looks like after the deployment:

image

The two interesting bits here are the approot and wwwroot folders. The question is how the Azure Web App knew to look into the wwwroot folder. It was actually dead simple, but it wasn't obvious at first glance. Before showing the answer, let's have a look at what IIS Express does to host an ASP.NET 5 application, which will give us a hint.

I fired up the application through Visual Studio to have IIS Express host it. After the application was up, I dug into Task Manager to get the command line arguments for IIS Express:

iisexpress.exe    10736    Running    Tugberk    00     33,804 K    33    "C:\Program Files (x86)\IIS Express\iisexpress.exe"  /config:"C:\Users\Tugberk\Documents\IISExpress\config\applicationhost.config"  /site:"WebApplication10" /apppool:"Clr4IntegratedAppPool"    IIS Express Worker Process

This points us to the applicationhost.config file and the WebApplication10 site inside it. When you look at the site node for WebApplication10, you will see that some of the magic is actually happening there:

<site name="WebApplication10" id="77">
    <application path="/" applicationPool="Clr4IntegratedAppPool">
        <virtualDirectory path="/" physicalPath="D:\apps\WebApplication10\src\WebApplication10\wwwroot" />
    </application>
    <bindings>
        <binding protocol="http" bindingInformation="*:47112:localhost" />
    </bindings>
</site>

wwwroot is mapped as a virtual directory for the root path of the web application here. So, does this information help us see how the Azure Web App is hosting our app? Absolutely! It tells us that something similar should be configured on the Azure Web Apps side so that it sees the wwwroot folder as the root. If you navigate to the Configure section for your Azure Web App and scroll down to the bottom, you will see the virtual directory configuration there.

azure-aspnet5-virtual-directory

Clever! My guess is that this configuration was put in place when I published the web application through Web Deploy inside Visual Studio. Digging into the Web Publish Activity output could give us more information about when exactly this configuration is set.

How to Use Octopus Deploy Step Templates for SQL Release

In this post, I will show how the SQL Release step templates make the Octopus Deploy integration of SQL Release easier, by walking through one of the deployment flows.
2015-04-06 21:39
Tugberk Ugurlu


SQL Release, a set of PowerShell cmdlets from Redgate which automate deploying changes to your production databases, went out of beta and became part of the DLM Automation Suite a few days ago. Octopus Deploy step templates for SQL Release are also included inside the suite as part of this release, and in this post I will show how these step templates make the Octopus Deploy integration of SQL Release easier, by walking through one of the deployment flows (the recommended one).

If you are trying to integrate your SQL Server databases into your deployment pipeline, I strongly encourage you to try the DLM Automation Suite out. At the time of this writing, it has a 28-day free trial option. You can also check out the documentation page for the DLM Automation Suite and information about the included products.

image

Installing SQL Release Step Templates

If you are not familiar with step templates, they are a plugin mechanism in Octopus Deploy which allows you to collect input from the user through a nice UI and run a specific PowerShell script based on the input passed in. A step template is nothing but structured JSON text, and they are hosted inside the Octopus Deploy step templates library. They don't actually need to be hosted there for you to use them, but it makes it very convenient to find them in a central place.
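To give an idea of what that JSON looks like, below is a heavily trimmed, illustrative sketch of the general shape of a step template; the field values and the DatabaseServer parameter are made up for illustration and are not taken from the actual SQL Release templates:

{
    "Name": "Redgate - Create Database Release",
    "Description": "Creates the resources needed to deploy the database changes.",
    "ActionType": "Octopus.Script",
    "Properties": {
        "Octopus.Action.Script.ScriptBody": "# PowerShell calling the SQL Release cmdlets goes here"
    },
    "Parameters": [
        {
            "Name": "DatabaseServer",
            "Label": "Target SQL Server instance",
            "HelpText": "The SQL Server instance to deploy to.",
            "DefaultValue": null,
            "DisplaySettings": { "Octopus.ControlType": "SingleLineText" }
        }
    ]
}

The Parameters section is what drives the input UI you see when you add the step to a process, and the script body is what runs on the Tentacle with those values passed in.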

SQL Release has four step templates to satisfy different flows and use cases.

image

The first thing to do in my demo blog post here is to install these step templates onto your Octopus server. The way to install a step template is a little different than you might expect.

  1. Go to a step template page on the Octopus Deploy Library web site.
  2. Hit the "Copy to clipboard" button on the right-hand side.
  3. Go to your Octopus server and navigate to Library (on the top menu).
  4. Click "Step templates" in the pane on the left side. This will open up the step templates page for your Octopus server.
  5. On that page, hit "Import", paste the step template inside the text area and click "Import".

You will end up with a look similar to the following one:

image

After importing all the step templates for SQL Release, it is time to actually create the deployment process. We will be using only two of them in our example here.

In order to use the SQL Release step templates, you need to have SQL Release installed on the Octopus Tentacle machine. If you install SQL Release while the Tentacle is running, you need to restart the Tentacle service (through the Tentacle Manager, for example).

Setting up the Octopus Project

Before going through each step of the deployment process, I first want to show what the Octopus project looks like at the end.

image

At the end, we will have four steps to complete our deployment for this example. A few more things to point out:

  • We will only be deploying the database for the purpose of this demo, but you can imagine having your application deployment here as well.
  • We will deploy the database schema changes in two steps in order to allow review and approval of the script.

Download and Extract NuGet Package Step

The first step is to download and extract the NuGet package which contains the scripts folder for the database schema state. This NuGet package is produced by another DLM Automation Suite tool named SQL CI, a plugin for your CI tool that enables continuous integration for SQL Server databases. The scripts folder I mention here can be produced by a few Redgate tools such as SQL Source Control. The scripts folder I am using for this sample is hosted in my GitHub repository.

In order to create a package, I ran the following SQL CI command:

sqlci Build --scriptsFolder="D:\github\Geveze\db" --outputFolder="D:\github\Geveze" --packageId="Geveze" --packageVersion="1.0.0"

image

Once I had the package created, I pushed it to the Octopus Deploy NuGet feed using the NuGet command line tool:

nuget push Geveze.1.0.0.nupkg -ApiKey API-CMGMYZ1GM95FHJNLWVRQQGQRAPK -Source http://localhost:4000/nuget/packages

Typically, these steps would be performed by your CI server, but I didn't want to bring CI integration in and complicate this demo further. Also, check out the SQL CI documentation for more information about the command line options and other related details. We won't go into the details of this tool in this post.

Once I had pushed the package to the Octopus Deploy NuGet feed, I was able to see it while configuring the step:

image

Create Database Release Step

This is probably the most important step in our process. Here, SQL Release will create the actual change script and a bunch of other artifacts that can be used later. These will be generated based on the package obtained in the previous step and the target database, which will be used for the comparison.

In order to add a "Create Database Release Step", you need to hit "Add step" on the Project > Process page. From the "Choose step type" window, choose "Redgate - Create Database Release" option.

image

The step configuration will look something like below:

image

As you can see, I am using Octopus Deploy variables here. The ones that I have are as shown below:

image

Also note that I configured this step to run only for the Staging environment, which basically allows you to reuse the generated change script when deploying to the Production environment. This also means that the reviewed script will be used in all deployments. As a final note: SQL Release will fail if the state of the target database has drifted from the state it was compared against, which makes the whole process safer.

The next step is "Review Database Changes", which is a standard Octopus Deploy manual intervention and approval step. I will skip that step here as the documentation on it is pretty straightforward.

Deploy from Database Release Step

The last step actually deploys the changes. Once sign-off is given, the changes can be deployed to the database. In order to start configuring this step, choose the "Redgate - Deploy from Database Release" step type from the "Choose step type" window. The configuration will look something like below:

image

One big difference here is that this step will be executed in both the Staging and Production environments. In fact, this is the only step that will run against the Production environment in our example here.

Create a Release and Deploy

We are now ready to create a release (alternatively, you can take advantage of the "Automatic Release Creation" feature of Octopus Deploy) and deploy to Staging. When you start the deployment, it will pause on the manual intervention step as expected and we will see a few artifacts created on the right-hand side.

image

You can see the update script, warnings and an HTML report for the changes. If you click on the Changes.html file, it will be downloaded and you can open it with the web browser of your choice. It gives you a nice diff report.

image

Once you approve, the deployment will go on and the last step will run to actually deploy the changes. When you are happy with the Staging environment, you can deploy the changes to Production by clicking the Promote button on the Octopus server.

image

As you can see, the only step that runs here is the last one. Remember that if either of the databases drifts between the time the release is created and the deployment, SQL Release will fail the deployment to keep the process safe.

Obviously, SQL Release and its Octopus Deploy step templates make it easier to deploy database schema changes through Octopus Deploy in a safe and reliable way. If you feel that you are struggling with making your database part of your continuous delivery story, definitely try the DLM Automation Suite and SQL Release out. You can also give feedback on SQL Release on its UserVoice page.

Compiling C# Code Into Memory and Executing It with Roslyn

Let me show you how to compile a piece of C# code into memory and execute it with Roslyn. It is super easy, believe it or not :)
2015-03-31 20:39
Tugberk Ugurlu


For the last couple of days, I have been looking into how to get the Razor view engine running outside ASP.NET 5 MVC. It was fairly straightforward, but there are a few bits and pieces that you need to stitch together, which can be challenging. I will get to the Razor part in a later post; in this post, I would like to show how to compile a piece of C# code into memory and execute it with Roslyn, which was one of the parts of getting Razor to work outside ASP.NET MVC.

The first thing to do is to install the C# code analysis library into your project through NuGet. In other words, install Roslyn :)

Install-Package Microsoft.CodeAnalysis.CSharp -pre

This will pull down a bunch of stuff like Microsoft.CodeAnalysis.Analyzers, System.Collections.Immutable, etc. as its dependencies, which is OK. In order to compile the code, we first want to create a SyntaxTree instance. We can do this pretty easily by parsing a code block using the CSharpSyntaxTree.ParseText static method.

SyntaxTree syntaxTree = CSharpSyntaxTree.ParseText(@"
    using System;

    namespace RoslynCompileSample
    {
        public class Writer
        {
            public void Write(string message)
            {
                Console.WriteLine(message);
            }
        }
    }");

The next step is to create a Compilation object. In case you are wondering, the compilation object is an immutable representation of a single invocation of the compiler (code comments to the rescue). It is the actual bit which carries the information about syntax trees, referenced assemblies and other important stuff that you would usually give to the compiler. We can create an instance of a Compilation object through another static method: CSharpCompilation.Create.

string assemblyName = Path.GetRandomFileName();
MetadataReference[] references = new MetadataReference[]
{
    MetadataReference.CreateFromFile(typeof(object).Assembly.Location),
    MetadataReference.CreateFromFile(typeof(Enumerable).Assembly.Location)
};

CSharpCompilation compilation = CSharpCompilation.Create(
    assemblyName,
    syntaxTrees: new[] { syntaxTree },
    references: references,
    options: new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

The hard part is now done. The final bit is actually running the compilation and getting the output (in our case, a dynamically linked library). To run the actual compilation, we will use the Emit method on the Compilation object. There are a few overloads of this method, but we will use the one where we can pass in a Stream object and have the Emit method write the assembly bytes into it. The Emit method gives us an instance of an EmitResult object and we can pull the status of the compilation, warnings, failures, etc. from it. Here is the actual code:

using (var ms = new MemoryStream())
{
    EmitResult result = compilation.Emit(ms);

    if (!result.Success)
    {
        IEnumerable<Diagnostic> failures = result.Diagnostics.Where(diagnostic => 
            diagnostic.IsWarningAsError || 
            diagnostic.Severity == DiagnosticSeverity.Error);

        foreach (Diagnostic diagnostic in failures)
        {
            Console.Error.WriteLine("{0}: {1}", diagnostic.Id, diagnostic.GetMessage());
        }
    }
    else
    {
        ms.Seek(0, SeekOrigin.Begin);
        Assembly assembly = Assembly.Load(ms.ToArray());
    }
}

As mentioned before, here we get the EmitResult out and look at its status. If it's not a success, we get the errors out and write them to the console. If it's a success, we load the bytes into an Assembly object. The Assembly object you have here is no different from the ones you are used to. From this point on, it's all up to your ninja reflection skills to execute the compiled code. For the purpose of this demo, it was as easy as the code below:

Type type = assembly.GetType("RoslynCompileSample.Writer");
object obj = Activator.CreateInstance(type);
type.InvokeMember("Write",
    BindingFlags.Default | BindingFlags.InvokeMethod,
    null,
    obj,
    new object[] { "Hello World" });

This was in a console application and after running the whole thing, I got the expected result:

image

Pretty sweet and easy! This sample is up on GitHub if you are interested.
