High Performance ASP.NET application (2) – Caching

Introduction

Last time we took a first bite at writing high performance ASP.NET applications – State Management. Today, we'll continue the topic and explore the ASP.NET caching mechanism. Leveraging the cache effectively and efficiently in your ASP.NET application is essential to performance in terms of response time and resource management. Basically, we'll cover the following four parts:

 

  • Use cache API properly
  • Leverage output caching and fragment caching
  • Caching in Web Farm
  • Useful Caching Guidelines

 

The Cache API is quite straightforward and easy to use, but there are still some circumstances where you should avoid it. For instance, if the data you are going to cache is user-specific, it is better stored in session state. If the data is updated very frequently, it won't stay in the cache very long, so caching it won't benefit performance. An interesting thing I'd like to bring to your attention is AddValidationCallback. This turns out to be very useful when you want to tailor the cached output for different clients. The following code demonstrates how to use AddValidationCallback to customize the request handling behavior based on the query string.

 

<script language="c#" runat="server">

   public void Page_Load()
   {
      // Cache this page's output on the server for 60 seconds
      // (the programmatic equivalent of an @OutputCache directive),
      // and register a callback that runs before a cached copy is served.
      Response.Cache.SetCacheability(HttpCacheability.Server);
      Response.Cache.SetExpires(DateTime.Now.AddSeconds(60));
      Response.Cache.SetValidUntilExpires(true);
      Response.Cache.AddValidationCallback(new HttpCacheValidateHandler(ValidateCache), null);

      Response.Write("Page generated at: " + DateTime.Now);
   }

   public static void ValidateCache(HttpContext context, Object data, ref HttpValidationStatus status)
   {
      if (context.Request.QueryString["Customer"] != "Bob")
      {
         // Not Bob: ignore the cached copy and execute the page again for this request.
         status = HttpValidationStatus.IgnoreThisRequest;
      }
      else
      {
         // Bob: respond with the cached output.
         status = HttpValidationStatus.Valid;
      }
   }

</script>

There are three cache expiration policies you can leverage to make cached data more efficient. Absolute expiration specifies an absolute date and time at which an item expires and is removed from the cache. Sliding expiration specifies an interval over which the item must go unused before it expires and is removed from the cache. Dependency expiration links the cached item to a resource and expires it only when that resource changes; for instance, a cached item linked to a file on disk remains in the cache until the file changes. One thing to note: you cannot use absolute expiration and sliding expiration together, while you can combine dependency expiration with either absolute or sliding expiration, although it's not recommended.
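
Here is a minimal sketch of the three policies using Cache.Insert from page code; the keys, the data variables and the settings.xml path are just placeholders for illustration.

// Absolute expiration: the item is removed 10 minutes from now.
Cache.Insert("report", reportData, null,
             DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);

// Sliding expiration: the item is removed once it goes unused for 5 minutes.
Cache.Insert("lookup", lookupData, null,
             Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(5));

// Dependency expiration: the item is removed when the underlying file changes.
Cache.Insert("config", configData,
             new CacheDependency(Server.MapPath("settings.xml")));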

 

Output caching is a powerful technique that increases request/response throughput by caching the content generated by dynamic pages. It enables you to cache the contents of entire pages for a specific duration of time. Consider enabling output caching on dynamically generated pages that do not contain user-specific data, and on pages that are frequently visited and time-consuming to generate, such as reports. If a report takes a minute to generate and has only a few variations, it makes a good candidate for caching. However, avoid output caching in the following scenarios (a minimal example follows the list):

  • 1. You need programmatic access to the data on the page. Use the Cache API instead.
  • 2. The page has a large number of variants.
  • 3. The page contains mixed data: static, dynamic and user-specific. In this case, fragment caching is a better fit.
  • 4. The page content changes with every request.
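
For pages that do qualify, enabling output caching takes a single directive at the top of the page. A minimal sketch, assuming a hypothetical Report.aspx that can safely be cached for 60 seconds:

<%-- Report.aspx: cache the rendered page for 60 seconds, one version for all query strings --%>
<%@ Page Language="C#" %>
<%@ OutputCache Duration="60" VaryByParam="none" %>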

 

Fragment caching is implemented by using user controls in conjunction with the @OutputCache directive. As listed above, when a page contains a mixture of data, partition it into separate logical regions by creating user controls. Common scenarios that make good candidates for fragment caching are navigation menus, which are non-user-specific, and page headers/footers, which are normally static content and don't need to be refreshed on every view.
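
A minimal sketch of fragment caching, assuming a hypothetical Menu.ascx user control: the @OutputCache directive on the control caches its rendered output for 10 minutes, while the hosting page itself stays fully dynamic.

<%-- Menu.ascx: only this control's output is cached --%>
<%@ Control Language="C#" %>
<%@ OutputCache Duration="600" VaryByParam="none" %>
<!-- navigation markup here -->

<%-- Default.aspx: rendered on every request, embeds the cached control --%>
<%@ Page Language="C#" %>
<%@ Register TagPrefix="uc" TagName="Menu" Src="Menu.ascx" %>
<uc:Menu id="SiteMenu" runat="server" />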

 

Just like session management, caching in a web farm scenario is more complicated than on a standalone machine. Basically, you have three options:

  • 1. Synchronize all nodes in the web farm. It's simple, fast and elegant: add application logic so that whenever a cache operation (through the Cache API) occurs, the node walks through the node list and sends a web request to each and every other node to sync up the cache everywhere. If the cached item is not a primitive object, you'll have to take care of serialization as well.
  • 2. Centralized cache location. You can design a centralized cache using a Web service component, as follows (a sketch of such a service follows this list): 1) Determine what kind of information needs to be cached; for example, say your application generates a large DataSet to assemble the same start page for every user. 2) Create a Web service that contains a method for generating the common DataSet. 3) Compile the Web service and install it on a server that is accessible to the Web farm; ideally, this server should be in the same domain and on the same physical network in order to minimize communication time. 4) Add a Web reference to the application code to retrieve the DataSet from the Web service.

 

  • 3. SQL Server caching. SQL Server is powerful in terms of persisting data across process recycles, server reboots, power failures and system crashes. If your application requires that kind of durability, a caching mechanism backed by a persistent data store like SQL Server is a good fit.
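
Below is a minimal sketch of the centralized-cache Web service from option 2. The class name CacheService, the method GetStartPageData, the cache key and the BuildStartPageDataSet helper are all hypothetical placeholders for whatever your application actually needs to share.

// CacheService.asmx.cs (hypothetical) – shared cache exposed to all web farm nodes
using System;
using System.Data;
using System.Web.Caching;
using System.Web.Services;

public class CacheService : WebService
{
    [WebMethod]
    public DataSet GetStartPageData()
    {
        // Serve the shared DataSet from this server's cache when possible.
        DataSet ds = Context.Cache["StartPageData"] as DataSet;
        if (ds == null)
        {
            ds = BuildStartPageDataSet();   // the expensive part, done once per expiration
            Context.Cache.Insert("StartPageData", ds, null,
                                 DateTime.Now.AddMinutes(15),
                                 Cache.NoSlidingExpiration);
        }
        return ds;
    }

    private DataSet BuildStartPageDataSet()
    {
        // Placeholder for the real data access code.
        return new DataSet("StartPage");
    }
}

Each node in the farm then calls GetStartPageData through its Web reference instead of rebuilding the DataSet locally.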

 

There are some useful caching guidelines to keep in mind when programming your ASP.NET application. One of the most important is to choose the VaryBy attributes of the @OutputCache directive carefully. Consider the following directive:

<%@ OutputCache Duration="30" VaryByParam="a" %>

The setting shown in the previous sample would give the following requests (sample URLs for illustration) the same cached version of the page:

http://localhost/cache.aspx?a=1
http://localhost/cache.aspx?a=1&b=1
http://localhost/cache.aspx?a=1&b=2

If you add b to the VaryByParam attribute, you would have three separate cached versions of the page rather than one.
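
For the same sample page, the VaryByParam value would simply list both parameters, separated by a semicolon:

<%@ OutputCache Duration="30" VaryByParam="a;b" %>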

When you decide to use a VaryBy attribute, make sure there is a finite number of variations, because each variation increases memory consumption on the Web server.

 

Well, that's basically what I'd like to share regarding ASP.NET caching. Based on my experience, even experienced .NET developers sometimes make mistakes with ASP.NET caching, like memory pressure from caching too much, or performance issues caused by caching something that shouldn't be cached. The good thing is we have some awesome troubleshooting/debugging tools to help us identify the culprit, but it's far less painful if we are cautious when designing our caching code in the first place. Thank you for your time reading my blog!

 

Regards,

 

Yawei