A Tale of 20 Cookies

As more applications move to leveraging the web, either through desktop integration or complete migration to a web model, maintaining user state on the web becomes critical. For many web sites and applications, this means the use of in-memory and persistent cookies.

Netscape originally defined HTTP cookies in a preliminary specification, and the IETF later standardized them in RFCs 2109 and 2965. Interestingly, the WinINet cookie implementation is still (mostly) modeled after the early Netscape document, which specifies that browsers should:

  • Maintain at least 300 cookies total,
  • Support cookies up to 4KB in size each,
  • Support no fewer than 20 cookies per unique host or domain

User agents with limited capabilities, such as those on mobile phones, are given more relaxed recommendations. Internet Explorer 7.0 and WinINet support up to 20 cookies of 5KB each per domain, with no set limit on the total number of cookies on a system. Even with these limits, some web applications may need more.

A follow-on question is what happens when a web application sends more than 20 cookies for a domain (21 cookies, perhaps). The answer is that we maintain the 20 most recent cookies, where "most recent" is based on the order in which the cookies were received. That means if your web server sends 21 cookies named Cookie1, Cookie2 through Cookie21 to WinINet (via Internet Explorer or otherwise), your web server would only receive Cookie2, Cookie3 through Cookie21 on the next request from WinINet. We simply maintain the 20 most recent and quietly discard the other cookies for that host/domain.
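If you want to see this first-hand, here is a rough sketch (again, no error handling, and the CookieLimitTest page is just an illustrative name) of an ASP.NET code-behind that sets 21 cookies and then reports whether the oldest one survived the round trip:

using System;
using System.Web;

public partial class CookieLimitTest : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Request.Cookies["Cookie21"] == null)
        {
            // First visit: send Cookie1 through Cookie21 for this domain.
            for (int i = 1; i <= 21; i++)
            {
                Response.Cookies.Add(new HttpCookie("Cookie" + i, i.ToString()));
            }
        }
        else
        {
            // Next visit: Cookie1, the oldest, should have been quietly discarded.
            Response.Write("Cookie1 present: " + (Request.Cookies["Cookie1"] != null));
        }
    }
}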

Next, what happens when a web application exceeds the other maximum and sends a cookie larger than 5120 bytes (5KB)? Instead of truncating the values of oversized cookies, we simply discard them. We chose to discard rather than truncate in order to avoid any kind of cookie data corruption. I think you would agree that a cookie with a value of $10000 is very different from a truncated one with a value of $100. In addition, it is easier for a server application to check for the existence of a cookie and react if it is missing than to detect whether its value is correct.
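For example, here is a minimal sketch of the "check for existence and react" pattern; the GetCookieOrDefault helper and its fallback value are purely illustrative:

using System.Web;

public static class CookieHelpers
{
    // A discarded cookie (oversized, expired, or never sent) simply does not
    // arrive with the request, so a null check is all the server needs.
    public static string GetCookieOrDefault(HttpRequest request, string name, string fallback)
    {
        HttpCookie cookie = request.Cookies[name];
        return cookie != null ? cookie.Value : fallback;
    }
}

A page would then call something like CookieHelpers.GetCookieOrDefault(Request, "Plan", "standard") and recover gracefully whenever the cookie was dropped along the way.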

There are ways to work around our current 20x cookie limit. In my opinion, the best work-around is to keep user information in a durable back-end store and use cookies only for user tracking and lightweight values. This keeps each request and response streamlined.
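As a rough sketch of that approach (UserStateStore, the UserKey cookie name, and the in-memory dictionary are all stand-ins for whatever store you actually use), the cookie carries only an opaque key while the bulky data stays on the server:

using System;
using System.Collections.Generic;
using System.Web;

public static class UserStateStore
{
    // In-memory stand-in for a real database or cache; not thread-safe,
    // just enough to show the shape of the approach.
    private static readonly Dictionary<string, Dictionary<string, string>> store =
        new Dictionary<string, Dictionary<string, string>>();

    // Returns the user's key, issuing a single lightweight tracking cookie
    // on the first visit instead of many data-carrying cookies.
    public static string EnsureUserKey(HttpRequest request, HttpResponse response)
    {
        HttpCookie cookie = request.Cookies["UserKey"];
        if (cookie != null)
        {
            return cookie.Value;
        }

        string key = Guid.NewGuid().ToString("N");
        response.Cookies.Add(new HttpCookie("UserKey", key));
        store[key] = new Dictionary<string, string>();
        return key;
    }

    // Heavyweight values (names, preferences, and so on) are read and written
    // here rather than travelling in cookies on every request.
    public static Dictionary<string, string> GetState(string userKey)
    {
        Dictionary<string, string> state;
        if (!store.TryGetValue(userKey, out state))
        {
            state = new Dictionary<string, string>();
            store[userKey] = state;
        }
        return state;
    }
}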

Another work-around, in cases where you absolutely need more data sent back and forth within the 20-cookie limit, is to pack more data into fewer cookies. This means that instead of:

Set-Cookie: FirstName=Billy

Set-Cookie: LastName=Anders

You can combine into something like:

Set-Cookie: FullName=Billy~Anders

Then, in your web application, you split the FullName cookie.  Here is a rough (rough meaning no error handling) ASP.NET example in C#:

// Request.Cookies["FullName"] returns an HttpCookie, so read .Value before splitting.
string[] Names = Request.Cookies["FullName"].Value.Split('~');

if (Names.Length == 2) {
     string FirstName = Names[0];
     string LastName = Names[1];
}

Admittedly, this is not as straightforward as the standard ASP.NET approach of:

string FirstName = Request.Cookies["FirstName"].Value;

string LastName = Request.Cookies["LastName"].Value;

However, this may prove helpful if you ever need to get around the 20x limit that we currently have. 

Update: The sample above demonstrates the concept of packing more data into fewer cookies on any web development platform. ASP.NET natively supports subkeys, which already allow you to store multiple values within a single cookie.

string FirstName = Request.Cookies["Name"]["First"];
string LastName = Request.Cookies["Name"]["Last"];
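For completeness, setting those subkeys on the response side is just as simple:

Response.Cookies["Name"]["First"] = "Billy";
Response.Cookies["Name"]["Last"] = "Anders";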

The current limits seem reasonable, but we would like to hear from you on whether that is the case. Are you writing web applications that would like to use the browser’s cookie store beyond the 20x and 5KB limits that we have? Do you need 25, 50, 100, or 250 cookies for your application?

- Billy Anders