Should I await on Task.FromResult Method Calls?

The Task class has a static method called FromResult which returns an already-completed Task object (in the RanToCompletion state). I have seen a few developers "await"ing on Task.FromResult method calls, and this clearly indicates a misunderstanding. I'm hoping to clear the air a bit with this post.
2014-02-24 21:14
Tugberk Ugurlu



What is the use of Task.FromResult method?

Imagine a situation where you are implementing an interface which has the following signature:

public interface IFileManager
{
     Task<IEnumerable<File>> GetFilesAsync();
}

Notice that the method returns a Task, which lets the return expression represent an ongoing operation and also lets the consumer of the method call it in an asynchronous manner without blocking (provided, of course, that the underlying layer supports it). However, depending on the case, your operation may not be asynchronous. For example, you may just have the files inside an in-memory collection and want to return them from there, or you may perform an I/O operation to retrieve the files asynchronously from a particular data store on the first call and cache the results so that you can return them from the in-memory cache on subsequent calls. These are just some scenarios where you need to return a successfully completed Task object. Here is how you can achieve that without the help of the Task.FromResult method:

public class InMemoryFileManager : IFileManager
{
    IEnumerable<File> files = new List<File>
    {
        //...
    };

    public Task<IEnumerable<File>> GetFilesAsync()
    {
        var tcs = new TaskCompletionSource<IEnumerable<File>>();
        tcs.SetResult(files);

        return tcs.Task;
    }
}

Here we used TaskCompletionSource to produce a successfully completed Task object carrying the result, so the caller of the method will have the result immediately. This is what we had been doing until the introduction of .NET 4.5. If you are on .NET 4.5 or above, you can just use Task.FromResult to perform the same operation:

public class InMemoryFileManager : IFileManager
{
    IEnumerable<File> files = new List<File>
    {
        //...
    };

    public Task<IEnumerable<File>> GetFilesAsync()
    {
        return Task.FromResult<IEnumerable<File>>(files);
    }
}

Should I await Task.FromResult method calls?

TL;DR version of the answer: absolutely not! If you find yourself needing to use Task.FromResult, it's clear that you are not performing any asynchronous operation, so just return the Task that Task.FromResult gives you. Is it dangerous to do this? Not exactly, but it's illogical and has a performance cost.
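It is easy to see why the await buys nothing here: the task Task.FromResult hands back is born completed. A standalone sketch (mine, not from the post) to illustrate:

```csharp
using System;
using System.Threading.Tasks;

public static class FromResultDemo
{
    // Produces a task that is already in the RanToCompletion state;
    // nothing is scheduled and nothing runs asynchronously.
    public static Task<int> GetAnswerAsync()
    {
        return Task.FromResult(42);
    }

    public static void Show()
    {
        Task<int> task = GetAnswerAsync();

        Console.WriteLine(task.Status);      // RanToCompletion
        Console.WriteLine(task.IsCompleted); // True

        // Reading Result does not block here; the value is already available.
        Console.WriteLine(task.Result);      // 42
    }
}
```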

The long version of the answer is a bit more in-depth. Let's first see what happens when you "await" a method which matches the awaitable pattern:

IEnumerable<File> files = await fileManager.GetFilesAsync();

The compiler will translate this code roughly as follows:

var $awaiter = fileManager.GetFilesAsync().GetAwaiter();
if(!$awaiter.IsCompleted) 
{
     DO THE AWAIT/RETURN AND RESUME
}

var files = $awaiter.GetResult();

Here we can see that if the awaited Task has already completed, the generated code skips all the await/resume work and gets the result directly. Beyond that, if you put the async keyword on a method, a bunch of code (including the state machine) is generated regardless of whether you actually use the await keyword inside the method. Keeping all these facts in mind, implementing the IFileManager as below is going to cause nothing but overhead:

public class InMemoryFileManager : IFileManager
{
    IEnumerable<File> files = new List<File>
    {
        //...
    };

    public async Task<IEnumerable<File>> GetFilesAsync()
    {
        return await Task.FromResult<IEnumerable<File>>(files);
    }
}
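You can observe the short-circuit from the expansion above directly: for a task created by Task.FromResult, the awaiter reports IsCompleted as true from the start, so the await never actually suspends (a standalone sketch, not from the original post):

```csharp
using System.Runtime.CompilerServices;
using System.Threading.Tasks;

public static class AwaiterDemo
{
    public static bool CompletesSynchronously()
    {
        TaskAwaiter<int> awaiter = Task.FromResult(42).GetAwaiter();

        // Already true, so the compiler-generated code skips the
        // suspend/resume path entirely and calls GetResult() right away.
        return awaiter.IsCompleted;
    }
}
```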

So, don't ever think about "await"ing on Task.FromResult or I'll hunt you down in your sweet dreams :)

How and Where Concurrent Asynchronous I/O with ASP.NET Web API

When we have multiple uncorrelated I/O operations that need to be kicked off, we have quite a few ways to fire them off, and which way you choose can make a great difference for a .NET server-side application. In this post, we will see how we can handle the different approaches in ASP.NET Web API.
2014-02-21 22:06
Tugberk Ugurlu


When we have multiple uncorrelated I/O operations that need to be kicked off, we have quite a few ways to fire them off, and which way you choose can make a great difference for a .NET server-side application. Pablo Cibraro already has a great post on this topic (await, WhenAll, WaitAll, oh my!!) which I recommend you check out. In this article, I would like to touch on a few more points. Let's look at the options one by one. I will use a multiple-HTTP-request scenario here, consumed by an ASP.NET Web API application, but this is applicable to any sort of I/O operation (long-running database calls, file system operations, etc.).

We will have two different endpoints to hit to consume the data:

  • http://localhost:2700/api/cars/cheap
  • http://localhost:2700/api/cars/expensive

As we can infer from the URIs, one of them will get us the cheap cars and the other one will get us the expensive ones. I created a separate ASP.NET Web API application to simulate these endpoints. Each one takes more than 500ms to complete, and in our target ASP.NET Web API application, we will aggregate these two resources together and return the result. It sounds like a very common scenario.

Inside our target API controller, we have the following initial structure:

public class Car 
{
    public int Id { get; set; }
    public string Make { get; set; }
    public string Model { get; set; }
    public int Year { get; set; }
    public float Price { get; set; }
}

public class CarsController : BaseController 
{
    private static readonly string[] PayloadSources = new[] { 
        "http://localhost:2700/api/cars/cheap",
        "http://localhost:2700/api/cars/expensive"
    };

    private async Task<IEnumerable<Car>> GetCarsAsync(string uri) 
    {
        using (HttpClient client = new HttpClient()) 
        {
            var response = await client.GetAsync(uri).ConfigureAwait(false);
            var content = await response.Content
                .ReadAsAsync<IEnumerable<Car>>().ConfigureAwait(false);

            return content;
        }
    }

    private IEnumerable<Car> GetCars(string uri) 
    {
        using (WebClient client = new WebClient()) 
        {    
            string carsJson = client.DownloadString(uri);
            IEnumerable<Car> cars = JsonConvert
                .DeserializeObject<IEnumerable<Car>>(carsJson);
                
            return cars;
        }
    }
}

We have a Car class which represents a car object that we are going to deserialize from the JSON payload. Inside the controller, we have our list of endpoints and two private methods which are responsible for making HTTP GET requests against the specified URI. The GetCarsAsync method uses the System.Net.Http.HttpClient class, introduced with .NET 4.5, to make the HTTP calls asynchronously. With the new C# 5.0 asynchronous language features (a.k.a. the async modifier and await operator), it is pretty straightforward to write the asynchronous code, as you can see. Note that we used the ConfigureAwait method here, passing false for the continueOnCapturedContext parameter. Why we need to do this is quite a long topic, but briefly: one of the samples we are about to dig into would deadlock if we didn't use this method.

To be able to measure the performance, we will use a little utility: the Apache benchmarking tool (a.k.a. ab.exe). It comes with the Apache Web Server installation, but you don't actually need to install Apache: when you download the installation ZIP file and extract it, you will find ab.exe inside. Alternatively, you may use the Web Capacity Analysis Tool (WCAT) from the IIS team. It's a lightweight HTTP load generation tool primarily designed to measure the performance of a web server within a controlled environment. However, WCAT is a bit harder to grasp and set up, which is why we use ab.exe here for simple load tests.
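For reference, ab.exe expresses that load shape with the -n (total requests) and -c (concurrency) switches; a run of "200 requests in blocks of 50" against one of the aggregate endpoints would look something like this (the host, port and route are placeholders for my sample app, not values from the post):

```shell
ab -n 200 -c 50 http://localhost:12345/api/cars/allcarssync
```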

Please note that the comparisons below are rough and don't constitute any real benchmarking. They are just comparisons for demo purposes, and they illustrate the points we are looking for.

Synchronous and not In Parallel

First, we will look at the all-synchronous, not-in-parallel version of the code. This operation will block the running thread for the amount of time it takes to complete the two network I/O operations. The code is very simple thanks to LINQ.

[HttpGet]
public IEnumerable<Car> AllCarsSync() {

    IEnumerable<Car> cars =
        PayloadSources.SelectMany(x => GetCars(x));

    return cars;
}

For a single request, we expect this to complete in about a second.

AllCarsSync

The result is not surprising. However, when you have multiple concurrent requests against this endpoint, you will see that the blocked threads become the bottleneck for your application. The following screenshot shows the result of 200 requests to this endpoint in blocks of 50 concurrent requests.

AllCarsSync_200

The result is now worse, and we are paying the price for blocking threads during long-running I/O operations. You may think that running these in parallel will reduce the single-request time, and you are not wrong, but this has its own caveats, which is the subject of our next section.

Synchronous and In Parallel

This option is almost never good for your application. With it, you will perform the I/O operations in parallel, and the request time will be significantly reduced if you measure only a single request. However, in our sample case you will be consuming two threads instead of one to process the request, and you will block both of them while waiting for the HTTP requests to complete. Although this reduces the overall processing time for a single request, it consumes more resources, and you will see that the overall request time increases as the request count increases. Let's look at the code of the ASP.NET Web API controller action method.

[HttpGet]
public IEnumerable<Car> AllCarsInParallelSync() {

    IEnumerable<Car> cars = PayloadSources.AsParallel()
        .SelectMany(uri => GetCars(uri)).AsEnumerable();

    return cars;
}

We used the Parallel LINQ (PLINQ) feature of the .NET Framework here to process the HTTP requests in parallel. As you can see, it was just too easy; in fact, it was only one line of digestible code. I tend to see a relationship between the above code and tasty donuts: they both look tasty, but they will work as hard as possible to clog our carotid arteries. The same applies to the above code: it looks really sweet but can make our server application miserable. How so? Let's send a request to this endpoint to start seeing how.

AllCarsInParallelSync

As you can see, the overall request time has been cut in half. This must be good, right? Not exactly. As mentioned before, this is going to hurt us when too many requests come to this endpoint. Let's simulate this with ab.exe and send 200 requests to this endpoint in blocks of 50 concurrent requests.

AllCarsInParallelSync_200

The overall performance is now significantly reduced. So, where would this type of implementation make sense? If your server application has a small number of users (for example, an HTTP API consumed by internal applications within your small team), this type of implementation may give benefits. However, as it's now annoyingly simple to write asynchronous code with the built-in language features, I'd suggest you choose our last option here: "Asynchronous and In Parallel (In a Non-Blocking Fashion)".

Asynchronous and not In Parallel

Here, we won't introduce any concurrent operations; we will go through the requests one by one, but in an asynchronous manner so that the processing thread is freed up during the dead waiting periods.

[HttpGet]
public async Task<IEnumerable<Car>> AllCarsAsync() {

    List<Car> carsResult = new List<Car>();
    foreach (var uri in PayloadSources) {

        IEnumerable<Car> cars = await GetCarsAsync(uri);
        carsResult.AddRange(cars);
    }

    return carsResult;
}

What we do here is quite simple: we iterate through the URI array and make the asynchronous HTTP call for each one. Notice that we were able to use the await keyword inside the foreach loop; this is all fine, and the compiler will do the right thing and handle it for us. One thing to keep in mind is that the asynchronous operations won't run in parallel here, so we won't see a difference when we send a single request to this endpoint, as we are going through the requests one by one.
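Since the loop above serializes the awaits, its total time is roughly the sum of the individual call times, whereas Task.WhenAll overlaps them. A small self-contained sketch (mine, with Task.Delay standing in for the HTTP calls) makes the difference measurable:

```csharp
using System.Diagnostics;
using System.Threading.Tasks;

public static class LoopVsWhenAll
{
    // Stands in for GetCarsAsync: ~500ms of pure waiting, no thread is blocked.
    private static async Task<int> FakeIoAsync()
    {
        await Task.Delay(500);
        return 1;
    }

    // Awaiting one by one: total time is roughly the *sum* of the delays.
    public static async Task<long> SequentialAsync()
    {
        Stopwatch sw = Stopwatch.StartNew();
        await FakeIoAsync();
        await FakeIoAsync();
        return sw.ElapsedMilliseconds; // ~1000ms
    }

    // Task.WhenAll: the delays overlap, so total time is roughly the *max*.
    public static async Task<long> ParallelAsync()
    {
        Stopwatch sw = Stopwatch.StartNew();
        await Task.WhenAll(FakeIoAsync(), FakeIoAsync());
        return sw.ElapsedMilliseconds; // ~500ms
    }
}
```

The sequential version should land around a second and the WhenAll version around half that, mirroring the single-request numbers in this post.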

AllCarsAsync

As expected, it took around a second. When we increase the number of requests and the concurrency level, we will see that the average request time still stays around a second.

AllCarsAsync_200

This option is certainly better than the previous ones. However, we can still do better in certain cases where we have a limited number of concurrent I/O operations. The last option looks into this, but before moving on to it, we will look at one other option which should be avoided where possible.

Asynchronous and In Parallel (In a Blocking Fashion)

Among the options shown here, this is the worst one to choose. When we have multiple Task-returning asynchronous methods in hand, we can wait for all of them to finish with the WaitAll static method on the Task class. This introduces several overheads: you will be consuming the asynchronous operations in a blocking fashion, and if these asynchronous methods are not implemented right, you will end up with deadlocks. At the beginning of this article, we pointed out the usage of the ConfigureAwait method; that was to prevent the deadlocks here. You can learn more about this from the following blog post: Asynchronous .NET Client Libraries for Your HTTP API and Awareness of async/await's Bad Effects.

Let’s look at the code:

[HttpGet]
public IEnumerable<Car> AllCarsInParallelBlockingAsync() {
    
    // Materialize the tasks once; re-enumerating the lazy Select here
    // would kick off a second set of HTTP requests.
    Task<IEnumerable<Car>>[] allTasks =
        PayloadSources.Select(uri => GetCarsAsync(uri)).ToArray();

    Task.WaitAll(allTasks);
    return allTasks.SelectMany(task => task.Result);
}

Let's send a request to this endpoint to see how it performs:

AllCarsInParallelBlockingAsync

It performed really badly, and it gets worse as soon as you increase the concurrency rate:

AllCarsInParallelBlockingAsync_200

Never, ever think about implementing this solution. No further discussion is needed here in my opinion.

Asynchronous and In Parallel (In a Non-Blocking Fashion)

Finally, the best solution: asynchronous and in parallel, in a non-blocking fashion. The code snippet below says it all, but to go through it quickly: we bundle the Tasks together and await the Task.WhenAll utility method. This performs the operations asynchronously and in parallel.

[HttpGet]
public async Task<IEnumerable<Car>> AllCarsInParallelNonBlockingAsync() {

    IEnumerable<Task<IEnumerable<Car>>> allTasks =
        PayloadSources.Select(uri => GetCarsAsync(uri));
    IEnumerable<Car>[] allResults = await Task.WhenAll(allTasks);

    return allResults.SelectMany(cars => cars);
}

If we make a request to the endpoint to execute this piece of code, the result will be similar to the previous one:

AllCarsInParallelNonBlockingAsync

However, when we make 50 concurrent requests four times, the results shine and lay out the advantages of asynchronous I/O handling:

AllCarsInParallelNonBlockingAsync_200

Conclusion

At the very basic level, what we can take away from this article is this: perform load tests against your server applications based on your estimated consumption rates if you have any sort of multiple I/O operations. Two of the above options are what you would want in case of multiple I/O operations. One of them is "Asynchronous but not In Parallel", which is the safest option in my personal opinion, and the other is "Asynchronous and In Parallel (In a Non-Blocking Fashion)". The latter significantly reduces the request time, depending on the hardware and the number of I/O operations you have, but as our small benchmarking results showed, cramming many concurrent asynchronous I/O operations into one request just to reduce a single request's time may not pay off; the results will most probably look different under high load.

AspNet.Identity.RavenDB: Fully asynchronous, new and sweet ASP.NET Identity implementation for RavenDB

A while back, the ASP.NET team introduced ASP.NET Identity, a membership system for ASP.NET applications. Today, I'm introducing its RavenDB implementation: AspNet.Identity.RavenDB.
2013-11-29 09:39
Tugberk Ugurlu


A while back, the ASP.NET team introduced ASP.NET Identity, a membership system for ASP.NET applications. If you have Visual Studio 2013 installed on your box, you will see that the ASP.NET Identity Core and ASP.NET Identity Entity Framework libraries are pulled down when you create a new Web Application with the authentication configured as "Individual User Accounts".

image

After creating your MVC project, you will see that you have an AccountController containing completely different code from the previous project templates, as it uses ASP.NET Identity.

You can find tons of information about this new membership system in the ASP.NET Identity section of the official ASP.NET web site. Also, Pranav Rastogi (a.k.a. @rustd) has a great introduction video on ASP.NET Identity which you shouldn't miss.

One of the great features of the ASP.NET Identity system is that it is super extensible. The core layer and the implementation layer (which is Entity Framework by default) are decoupled from each other. This means that you are not bound to Entity Framework and SQL Server for storage; you can implement ASP.NET Identity on top of your choice of storage system. This is exactly what I did when I created the AspNet.Identity.RavenDB project, a fully asynchronous, new and sweet ASP.NET Identity implementation for RavenDB. You can install this library from NuGet:

PM> Install-Package AspNet.Identity.RavenDB
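To give a feel for what implementing ASP.NET Identity on your own storage involves, here is a minimal sketch of the shape of a user store. Note that ISimpleUserStore and SimpleUser below are simplified stand-ins I defined for illustration; the real contract is Microsoft.AspNet.Identity's IUserStore&lt;TUser&gt;, which also includes methods like FindByNameAsync, UpdateAsync and DeleteAsync:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Simplified stand-in for the real IUserStore<TUser> contract.
public interface ISimpleUserStore<TUser>
{
    Task CreateAsync(TUser user);
    Task<TUser> FindByIdAsync(string userId);
}

public class SimpleUser
{
    public string Id { get; set; }
    public string UserName { get; set; }
}

// An in-memory backing store; a real implementation (RavenDB, MongoDB,
// table storage, ...) would talk to its database inside these methods.
public class InMemoryUserStore : ISimpleUserStore<SimpleUser>
{
    private readonly Dictionary<string, SimpleUser> users =
        new Dictionary<string, SimpleUser>();

    public Task CreateAsync(SimpleUser user)
    {
        users[user.Id] = user;

        // No real asynchronous work here, so hand back a completed task.
        return Task.FromResult<object>(null);
    }

    public Task<SimpleUser> FindByIdAsync(string userId)
    {
        SimpleUser user;
        users.TryGetValue(userId, out user);
        return Task.FromResult(user);
    }
}
```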

Getting started with AspNet.Identity.RavenDB is really easy, too. Just create an ASP.NET MVC application from scratch with the authentication configured as "Individual User Accounts". Then, install the AspNet.Identity.RavenDB package. As the default project is set up to work with the ASP.NET Identity Entity Framework implementation, you need to make a few more tweaks here and there to make it work with RavenDB.

First, open the IdentityModels.cs file under the "Models" folder and delete the two classes you see there. The only class you need is the following ApplicationUser class:

public class ApplicationUser : RavenUser
{
}

As the second step, open up the AccountController.cs file under the "Controllers" folder and delete the first constructor you see there. The only constructor you need is the following one:

public AccountController(UserManager<ApplicationUser> userManager)
{
    UserManager = userManager;
}

Now you should be able to build the project successfully, and from that point on, you can uninstall the Microsoft.AspNet.Identity.EntityFramework package, which you don't need anymore. Lastly, we need to provide an instance of UserManager<ApplicationUser> to our AccountController. I'm going to use the Autofac IoC container to inject the dependency into my project; however, you can choose any IoC container you like. After installing the Autofac.Mvc5 package, here is how my Global class looks inside the Global.asax.cs file:

using AspNet.Identity.RavenDB.Stores;
using AspNetIndetityRavenDb.Models;
using Autofac;
using Autofac.Integration.Mvc;
using Microsoft.AspNet.Identity;
using Raven.Client;
using Raven.Client.Document;
using Raven.Client.Extensions;
using System.Reflection;
using System.Web.Mvc;
using System.Web.Optimization;
using System.Web.Routing;

namespace AspNetIndetityRavenDb
{
    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            AreaRegistration.RegisterAllAreas();
            FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
            RouteConfig.RegisterRoutes(RouteTable.Routes);
            BundleConfig.RegisterBundles(BundleTable.Bundles);

            const string RavenDefaultDatabase = "Users";
            ContainerBuilder builder = new ContainerBuilder();
            builder.Register(c =>
            {
                IDocumentStore store = new DocumentStore
                {
                    Url = "http://localhost:8080",
                    DefaultDatabase = RavenDefaultDatabase
                }.Initialize();

                store.DatabaseCommands.EnsureDatabaseExists(RavenDefaultDatabase);

                return store;

            }).As<IDocumentStore>().SingleInstance();

            builder.Register(c => c.Resolve<IDocumentStore>()
                .OpenAsyncSession()).As<IAsyncDocumentSession>().InstancePerHttpRequest();
            builder.Register(c => new RavenUserStore<ApplicationUser>(c.Resolve<IAsyncDocumentSession>(), false))
                .As<IUserStore<ApplicationUser>>().InstancePerHttpRequest();
            builder.RegisterType<UserManager<ApplicationUser>>().InstancePerHttpRequest();

            builder.RegisterControllers(Assembly.GetExecutingAssembly());

            DependencyResolver.SetResolver(new AutofacDependencyResolver(builder.Build()));
        }
    }
}

Before starting up my application, I expose my RavenDB server at http://localhost:8080, and I'm all set to fly. The default project template allows me to register and log in to the application, and we can perform all those actions now.


The same sample application is available inside the repository if you are interested: AspNet.Identity.RavenDB.Sample.Mvc.

The current ASP.NET Identity system doesn't provide many of the features we require in real-world applications, such as account confirmation and password reset, but it gives us a really great infrastructure, and the UserManager<TUser> class saves us from writing a bunch of code. I'm sure we will see other implementations of ASP.NET Identity, such as MongoDB, Windows Azure Table Storage, etc., from the community.

Windows Azure Management Client Libraries for .NET and It Supports Portable Class Library

One of the missing pieces of the Windows Azure story is within our reach now! A few days ago, the Azure folks released the Windows Azure .NET Management Client Libraries.
2013-10-30 06:17
Tugberk Ugurlu


One of the missing pieces of the Windows Azure story is within our reach now! The awesome Azure folks have released the Windows Azure .NET Management Client Libraries, which also support the Portable Class Library (PCL). It is now possible to build custom Windows Azure management applications in the .NET ecosystem with minimal effort. Brady Gaster has a very detailed blog post about the first release of these libraries if you would like to get started really quickly. In this post, I'll look at these libraries from a web developer's point of view.

The management libraries are available as NuGet packages, and they are at the pre-release stage for now (ignore the HdInsight package below):

image

Some packages are there for infrastructural purposes such as tracing, exception handling, etc., and other packages such as Storage and Compute contain .NET clients to manage specific services, which is good and bad at the same time. When we take a first look at the libraries, we can see a few great things:

  • All operations are based on HttpClient. This gives us the opportunity to be very flexible (if the client infrastructure allows us) by injecting our own inner handler to process the requests.
  • All client operations are asynchronous from top to bottom.
  • It supports Portable Class Library.
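
The first bullet deserves a quick illustration: because everything flows through HttpClient, a plain DelegatingHandler (a standard System.Net.Http type) can observe or modify every request a management client makes. The logging handler below is my own sketch, not part of the management libraries; the stub inner handler is there only so the example can run without a network:

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Logs every request/response pair flowing through the pipeline before
// delegating to the inner handler.
public class LoggingHandler : DelegatingHandler
{
    public LoggingHandler()
    {
    }

    public LoggingHandler(HttpMessageHandler innerHandler)
        : base(innerHandler)
    {
    }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        Console.WriteLine("--> " + request.Method + " " + request.RequestUri);

        HttpResponseMessage response =
            await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

        Console.WriteLine("<-- " + (int)response.StatusCode);
        return response;
    }
}

// A stub inner handler so the sketch can run without any network access.
public class StubHandler : HttpMessageHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        return Task.FromResult(
            new HttpResponseMessage(System.Net.HttpStatusCode.OK));
    }
}
```

With the management clients, you would presumably hand an instance of such a handler to the WithHandler method covered later in this post, which wires up the rest of the pipeline itself.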

Getting started is also very easy. The following code is all I needed to do to query the storage accounts I have inside my subscription:

static void Main(string[] args)
{
	const string CertThumbprint = "your-cert-thumbprint";
	const string SubscriptionId = "your-subscription-id";
	X509Certificate2 cert = Utils.FindX509Certificate(CertThumbprint);
	SubscriptionCloudCredentials creds = 
		new CertificateCloudCredentials(SubscriptionId, cert);

	StorageManagementClient storageClient = 
		CloudContext.Clients.CreateStorageManagementClient(creds);

	StorageServiceListResponse response = storageClient.StorageAccounts.List();
	foreach (StorageServiceListResponse.StorageService storageService in response)
	{
		Console.WriteLine(storageService.Uri);
	}

	Console.ReadLine();
}

Client certificate authentication is performed to authenticate our requests. Notice how easy it was to get hold of a storage client: by using the CloudContext class, I was able to create the storage client with the CreateStorageManagementClient extension method, which lives in the Microsoft.WindowsAzure.Management.Storage assembly.

How HttpClient is Used Underneath

If we look under the carpet, we will see that each time we call CreateStorageManagementClient, a different instance of HttpClient is used underneath, which is not what we would want. However, I assume that StorageManagementClient's public members are thread-safe, which would mean you could use one instance of that class throughout your web application, but I'm not sure. Nevertheless, I would prefer an approach similar to the following one:

// single instance that I would hold onto throughout my application lifecycle. I would most
// probably handle this instance through my IoC container.
using (CloudContext cloudContex = new CloudContext(creds))
{
    // These StorageManagementClient instances are either the same reference or 
    // use the same HttpClient underneath.
    StorageManagementClient storageClient = 
        cloudContex.CreateStorageManagementClient();
}

On the other hand, we need to perform a similar operation to create a compute management client to manage our cloud services and virtual machines: call the CreateComputeManagementClient extension method on CloudContext.Clients. Here, we have the same behavior in terms of the HttpClient instance.

A Sample Usage in an ASP.NET MVC Application

I created a very simple ASP.NET MVC application which only lists storage services, hosted services and web spaces. I used an IoC container (Autofac) to inject the clients through the controller constructor. Below is the registration code I have in my application.

protected void Application_Start()
{
    // Lines removed for brevity...

    // Get a hold of the credentials.
    const string CertThumbprint = "your-cert-thumbprint";
    const string SubscriptionId = "your-subscription-id";
    X509Certificate2 cert = FindX509Certificate(CertThumbprint);
    SubscriptionCloudCredentials creds =
        new CertificateCloudCredentials(SubscriptionId, cert);

    // Get the ContainerBuilder ready and register the MVC controllers.
    ContainerBuilder builder = new ContainerBuilder();
    builder.RegisterControllers(Assembly.GetExecutingAssembly());

    // Register the clients
    builder.Register(c => CloudContext.Clients.CreateComputeManagementClient(creds))
        .As<IComputeManagementClient>().InstancePerHttpRequest();
    builder.Register(c => CloudContext.Clients.CreateStorageManagementClient(creds))
        .As<IStorageManagementClient>().InstancePerHttpRequest();
    builder.Register(c => CloudContext.Clients.CreateWebSiteManagementClient(creds))
        .As<IWebSiteManagementClient>().InstancePerHttpRequest();

    // Set the dependency resolver.
    AutofacDependencyResolver dependencyResolver = 
        new AutofacDependencyResolver(builder.Build());

    DependencyResolver.SetResolver(dependencyResolver);
}

Notice that I registered the management client classes as per-request instances. This option will create a separate client instance for each request. As the underlying architecture creates a separate HttpClient instance each time a client is created, I won't be using the same HttpClient throughout my application's lifecycle. I'm also not sure whether the client classes are thread-safe; that's why I went for the per-request option.

My controller is even simpler.

public class HomeController : Controller
{
    private readonly IComputeManagementClient _computeClient;
    private readonly IStorageManagementClient _storageClient;
    private readonly IWebSiteManagementClient _webSiteClient;

    public HomeController(
        IComputeManagementClient computeClient, 
        IStorageManagementClient storageClient, 
        IWebSiteManagementClient webSiteClient)
    {
        _computeClient = computeClient;
        _storageClient = storageClient;
        _webSiteClient = webSiteClient;
    }

    public async Task<ActionResult> Index()
    {
        Task<StorageServiceListResponse> storageServiceResponseTask = 
            _storageClient.StorageAccounts.ListAsync();
        Task<HostedServiceListResponse> hostedServiceResponseTask = 
            _computeClient.HostedServices.ListAsync();
        Task<WebSpacesListResponse> webSpaceResponseTask = 
            _webSiteClient.WebSpaces.ListAsync();

        await Task.WhenAll(storageServiceResponseTask, 
            hostedServiceResponseTask, webSpaceResponseTask);

        return View(new HomeViewModel 
        { 
            StorageServices = storageServiceResponseTask.Result,
            HostedServices = hostedServiceResponseTask.Result,
            WebSpaces = webSpaceResponseTask.Result
        });
    }
}

public class HomeViewModel
{
    public IEnumerable<StorageServiceListResponse.StorageService> StorageServices { get; set; }
    public IEnumerable<HostedServiceListResponse.HostedService> HostedServices { get; set; }
    public IEnumerable<WebSpacesListResponse.WebSpace> WebSpaces { get; set; }
}

Notice that I used Task.WhenAll to fire off all the asynchronous work in parallel, which is the best of both worlds. If you also look at the client classes, you will see that operations are divided into logical groups. For example, here is the IComputeManagementClient interface:

public interface IComputeManagementClient
{
    Uri BaseUri { get; }
    SubscriptionCloudCredentials Credentials { get; }

    IDeploymentOperations Deployments { get; }
    IHostedServiceOperations HostedServices { get; }
    IOperatingSystemOperations OperatingSystems { get; }
    IServiceCertificateOperations ServiceCertificates { get; }
    IVirtualMachineDiskOperations VirtualMachineDisks { get; }
    IVirtualMachineImageOperations VirtualMachineImages { get; }
    IVirtualMachineOperations VirtualMachines { get; }

    Task<Models.ComputeOperationStatusResponse> GetOperationStatusAsync(
        string requestId, 
        CancellationToken cancellationToken);
}

HostedServices, OperatingSystems, VirtualMachines and so on: each logical group of operations is exposed as a class property.

Getting Inside the HTTP Pipeline

The current implementation of the libraries also allows us to get into the HTTP pipeline really easily. The way we do it today is a little ugly, but that's down to the HttpClient usage mentioned above. Nevertheless, this extensibility is really promising. Every management client class inherits from the ServiceClient<T> class, and that class has a method called WithHandler. Every available client exposes a method with the same name by calling the underlying WithHandler method from ServiceClient<T>. By calling this method with a DelegatingHandler instance, we can get into the HTTP pipeline and have a chance to manipulate the request processing. For example, we can inject a handler which retries requests when certain error cases are met. The Windows Azure Management Library provides an abstract class for a retry handler with the basic functionality: LinearRetryHandler. As this is an abstract class, we can inherit from it to create our own retry handler. If we need to, we can always override the ShouldRetry method, but for our case we don't need to.

public class CustomRetryHandler : LinearRetryHandler
{
}

Now, we can use our retry handler to create the management clients:

CustomRetryHandler retryHandler = new CustomRetryHandler();
IComputeManagementClient client = 
    CloudContext.Clients.CreateComputeManagementClient(creds).WithHandler(retryHandler);

Now, if a request to a Windows Azure Management API fails with certain status codes, the handler will retry the request.

There are More…

There are more features that are worth mentioning, but covering them would make for a very lengthy blog post :) Unit testing and tracing in particular are topics I'm looking forward to blogging about.

Short Introduction Video for OWIN and Project Katana

I've recorded a short video which covers a brief introduction to OWIN and Project Katana and will give you an idea of what they are about.
2013-10-21 13:29
Tugberk Ugurlu


I've recorded a short video which covers a brief introduction to OWIN and Project Katana. It will give you an idea about OWIN and Project Katana and why we should care about them.

An Introduction to OWIN and Project Katana from Tugberk Ugurlu on Vimeo.

Let me know whether this recording is of any help to you. Your feedback is always more than welcome. Enjoy!