Asynchronism and Parallelism in Business Applications

Last week I gave a talk at the BASTA! Spring conference in Darmstadt (Germany) about asynchronous and parallel programming. My goal was to show that asynchronous and parallel programming is not only relevant for calculating primes but also for many business scenarios. After covering some bad practices and best practices around async/await, I showed different scenarios, ranging from the desktop level (WPF, MVVM, Entity Framework 6) to the cloud (Traffic Manager, Service Bus Queue).

Here are the (German) slides:

Scenario 1: Use async features of Entity Framework 6 in a WPF application

Most business applications operate on lots of data. Executing time-consuming statements and displaying huge amounts of data while still remaining responsive for the user: that's where asynchronous programming comes into play.

Since EF 6, many methods are also available in an asynchronous variant, e.g. ToListAsync(), FirstOrDefaultAsync(), etc. It's really easy to turn a blocking, synchronous database call into an asynchronous one: only a few words have to be changed. With a little more effort you can add a cancellation mechanism for your users.

using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.Core;
using System.Threading;
using System.Threading.Tasks;
 
public static List<Person> GetPeople()
{
    using (var db = new AdventureWorks2008R2Entities())
    {
        return db.Person.ToList();
    }
}
 
public static async Task<List<Person>> GetPeopleAsync(CancellationToken cancellationToken)
{
    using (var db = new AdventureWorks2008R2Entities())
    {
        try
        { 
            return await db.Person.ToListAsync(cancellationToken);
        }
        catch (EntityCommandExecutionException e)
        {
            if (e.InnerException is TaskCanceledException)
            {
                // The user cancelled the query - return an empty result.
                return new List<Person>();
            }

            // Rethrow without resetting the stack trace (instead of "throw e;").
            throw;
        }
    }
}

Scenario 2: Why async/await also matters for Web Applications

MVC 4 (in combination with .NET 4.5) enables you to write async controller actions. That's a good thing, because more and more libraries are ready for the asynchronous world and you can use them in ASP.NET as well. But wait, there's one more reason: using async/await can sometimes improve the scalability of your website enormously. Imagine a controller action that calls a web service. It might take a while until the response arrives and the view can be rendered. During that time, a thread of the web server is exclusively reserved for the request - and IIS has a default maximum of 5,000 threads that can run in parallel. With an asynchronous action, the thread does not sit waiting for the response but can be utilized for other requests in that period of time. Once the web service response arrives, the next available thread picks up the result and starts rendering the view.

using System.Threading;
using System.Threading.Tasks;
using System.Web.Mvc;
 
public class HomeController : Controller
{
    public ActionResult Sync()
    {
        var people = DataManager.GetPeople();
 
        return View("People", people);
    }
 
    // Cancel the action if it runs longer than 1,000 milliseconds.
    [AsyncTimeout(1000)]
    public async Task<ActionResult> Async(CancellationToken cancellationToken)
    {
        var people = await DataManager.GetPeopleAsync(cancellationToken);
 
        return View("People", people);
    }
}

Scenario 3: Working with Service Bus Queues in Windows Azure

In the third scenario I moved away from async/await and showed an architecture that is optimized for parallelization and scalability. Windows Azure Service Bus Queues are a great way to build a data exchange mechanism. Common approaches (e.g. providing a web service) have some disadvantages: scaling is hard, and if one partner stops working, the try/retry logic is not easy to build.

Windows Azure Service Bus Queues can be used to decouple the components: one partner inserts data packages into the queue, the other partner takes them and starts importing. If one partner drops out, the queue still exists (as long as Windows Azure exists…). Try/retry is a first-class feature: if a package has not been processed within the specified interval, it is returned to the queue and its dequeue count is incremented. And if you use a Worker Role for processing the queue items, you can easily increase the number of instances to gain more processing power - or use the auto-scale feature of Windows Azure to automate that.
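As a minimal sketch of that producer/consumer pattern, using the Microsoft.ServiceBus.Messaging client: the connection string, the queue name and the DataPackage class are placeholders, and the receiving side would typically run in the Worker Role mentioned above.

```csharp
using Microsoft.ServiceBus.Messaging;

// Hypothetical payload type - serialized by BrokeredMessage's default serializer.
public class DataPackage
{
    public int Id { get; set; }
    public string Content { get; set; }
}

public static class QueueExample
{
    // Placeholder values - in a real application taken from the Azure portal.
    private const string ConnectionString = "Endpoint=sb://...";
    private const string QueueName = "import-queue";

    // Partner A: insert a data package into the queue.
    public static void SendPackage(DataPackage package)
    {
        var client = QueueClient.CreateFromConnectionString(ConnectionString, QueueName);
        client.Send(new BrokeredMessage(package));
    }

    // Partner B (e.g. inside a Worker Role): process the next queue item.
    public static void ProcessNextPackage()
    {
        var client = QueueClient.CreateFromConnectionString(ConnectionString, QueueName);

        var message = client.Receive();
        if (message == null) return; // queue is currently empty

        try
        {
            var package = message.GetBody<DataPackage>();
            // ... import the package ...
            message.Complete(); // done - remove the message from the queue
        }
        catch
        {
            // Abandon: the message goes back to the queue,
            // its DeliveryCount (dequeue count) is incremented.
            message.Abandon();
        }
    }
}
```

Complete() and Abandon() implement exactly the try/retry behavior described above: a message that is neither completed nor abandoned within the lock duration also reappears in the queue automatically.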