How to disable HTTP compression for specific file types?

The question arose from a customer who had implemented an application to stream PDF files from ASP.NET while also using HTTP compression to save bandwidth and improve download times; with IE 6, Adobe Reader failed to open the file with the following error message:

Adobe Reader could not open ‘<name>.tmp’ because it is either not a supported file type or because the file has been damaged (for example, it was sent as an email attachment that wasn’t correctly decoded)

The customer had no problems with Adobe Reader using IE7, but of course they could not force their customers to upgrade.

I had already worked on a few similar calls in the past, and I had the chance to dig into this issue with our Escalation Engineers (from both the Internet Explorer and IIS teams) and with the Product Group. It turned out that there is a problem in the compression mechanism in IE up to version 6 which affects its ability to successfully decompress the HTTP/HTML stream received from an IIS server when HTTP compression is enabled. At the time there was also a minor issue on the server side of compression, but that has since been fixed. The point is that for Internet Explorer 6, the Product Group decided they could not fix this problem because there was too much code to change, retest and redistribute: the fix involves some core components of IE which are also widely used across the operating system, so a code change there would require a full re-test of the IE and Windows codebases and was considered way too risky in terms of possibly introducing other bugs in core OS components.

The latest information I had at the time was that the issue would be fixed in Internet Explorer 7, where the Product Group did a massive code rewrite; this was confirmed by the fact that the customer was not able to reproduce the problem on IE7.

Well, we have a couple of options: disable HTTP compression for specific file types as described in Customizing the File Types IIS Compresses (IIS 6.0) (but this means you may have to add more file types over time, and you cannot target specific file names; the exclusion applies to all files with the specified extensions), or store all the files you do not want compressed in a dedicated folder on your server and then edit the metabase to tell IIS not to compress the content of that folder.
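As a sketch of the first option: the HcFileExtensions metabase property holds the list of extensions IIS will statically compress, so removing an extension from that list disables compression for it. The property names below are the real IIS 6.0 metabase properties, but the extension list shown is just an example — inspect your current value first and rewrite it without the extensions you want excluded:

```shell
REM Inspect the extensions IIS currently compresses (static compression, gzip scheme)
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs get W3SVC/Filters/Compression/GZIP/HcFileExtensions

REM Rewrite the list, leaving out any extension you do not want compressed
REM (example list only - start from your own current value)
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs set W3SVC/Filters/Compression/GZIP/HcFileExtensions "htm" "html" "txt"
```

Remember to make the same change under the DEFLATE scheme as well if both compression schemes are enabled.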
To disable static compression for only a single directory, first enable global static compression (if it is disabled) and then disable static compression at that directory. For example, to disable static compression for the StyleSheets directory used below, run the following steps:

  1. Enable global static compression by executing the following command at a command prompt:
        adsutil set w3svc/filters/compression/parameters/HcDoStaticCompression true
  2. Disable static compression at this directory by executing the following command at a command prompt:
        adsutil set w3svc/1/root/Home/StyleSheets/DoStaticCompression false
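The two steps above, spelled out in full: adsutil.vbs is a script, so it is run through cscript, and on a default IIS 6.0 installation it lives in the AdminScripts folder (adjust the path if your installation differs):

```shell
REM Default location of adsutil.vbs on IIS 6.0; adjust if installed elsewhere
cd %SystemDrive%\Inetpub\AdminScripts

REM Step 1: turn on static compression globally
cscript adsutil.vbs set w3svc/filters/compression/parameters/HcDoStaticCompression true

REM Step 2: switch it off for the Home/StyleSheets directory of site 1
cscript adsutil.vbs set w3svc/1/root/Home/StyleSheets/DoStaticCompression false
```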

There is another nice post on this matter at IIS HTTP Compression and Streaming PDF’s: Don’t do it.

Anyway, note that this approach can be used if you are loading binary content from disk. If you load the file from disk and then use BinaryWrite or one of the Stream* classes to transfer (force a download of) the binary content from IIS to the client, you should not be affected by the bug, as long as you modify the metabase as explained above to exclude one (or more) specific folders from HTTP compression and your PDF files are loaded from there. Of course, this implies that you control your application and where and how PDF files or other binary content is stored and loaded from.
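A minimal sketch of the disk-based scenario (the page, folder and file names here are hypothetical; what matters is that the file physically lives in the folder you excluded from compression in the metabase):

```csharp
// GetPdf.aspx.cs - streams a PDF stored on disk in a folder that was
// excluded from HTTP compression via the metabase (hypothetical names)
using System;
using System.IO;
using System.Web;

public partial class GetPdf : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // NoCompress is the folder where DoStaticCompression was set to false
        string path = Server.MapPath("~/NoCompress/report.pdf");
        byte[] bytes = File.ReadAllBytes(path);

        Response.Clear();
        Response.ContentType = "application/pdf";
        // Force a download rather than inline rendering in the browser
        Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");
        Response.BinaryWrite(bytes);
        Response.End();
    }
}
```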

If instead you extract some data from a database, create a PDF file on the fly in memory and then stream it to the client (even using the techniques explained above), it is not possible to tell IIS not to apply HTTP compression to content which is created and manipulated entirely in memory. Even though we can exclude file types from HTTP compression, at the application level we are working with a memory stream, which is not a file type; we qualify it only when adding the relevant HTTP header value just before writing the stream to the outgoing output for the client, which is too late…
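To make the point concrete, this is roughly what the in-memory scenario looks like (the page name and the PDF-building helper are hypothetical). IIS sees only an .aspx request, and the application/pdf content type appears only moments before the bytes are written, long after IIS has decided whether to compress the response:

```csharp
// StreamPdf.aspx.cs - the in-memory case: there is no file on disk,
// so there is no extension or folder for IIS compression rules to match
using System;
using System.IO;
using System.Web;

public partial class StreamPdf : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        using (MemoryStream pdf = BuildPdfFromDatabase())
        {
            Response.Clear();
            // IIS has already chosen compression based on the .aspx extension;
            // setting the MIME type here is too late to change that decision
            Response.ContentType = "application/pdf";
            pdf.WriteTo(Response.OutputStream);
            Response.End();
        }
    }

    // Hypothetical helper: a real application would query the database
    // and build the PDF with a reporting or PDF library
    private MemoryStream BuildPdfFromDatabase()
    {
        return new MemoryStream();
    }
}
```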



Quote of the day:

It is a good rule in life never to apologize. The right sort of people do not want apologies, and the wrong sort take a mean advantage of them – P. G. Wodehouse

Comments (3)

  1. Darren Kopp says:

    Heh, we ran into the EXACT same issue like 6 months ago, except it also affected CSV files that we would send (again, only w/ IE6). We were never really able to solve the issue completely w/ adsutil, so we eventually bought HttpZip, but that itself brought up other issues (memory leak would inflate application pool causing it to restart all the time).

    BTW, like the new theme carlo

  2. Thanks Darren 🙂

    Well, there could be another solution actually, but I’m not sure if it is really applicable…

Static compression only applies to files loaded from disk (which is not our case here); the same is true for dynamic compression, where the file extension is used, not the MIME type (the one you specify by adding the HTTP header to the outgoing response). So if you have an .aspx page creating a PDF, compression will be used if .aspx is configured for dynamic compression (and it is).

In theory it is possible to change the metabase to exclude one or more specific files (pages) from compression, but this means that if over time you add pages which need to stream binary content, you will have to change the metabase accordingly… this does not seem feasible in practice: it requires a lot of maintenance and is an error-prone configuration…
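For completeness, the per-file exclusion would look something like this (MyApp/GetPdf.aspx is a hypothetical path; DoDynamicCompression is a real IIS 6.0 metabase property that can be set down to file level) — one such line per page, which is exactly the maintenance burden described above:

```shell
REM Exclude a single page of site 1 from dynamic compression (hypothetical path)
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs set w3svc/1/root/MyApp/GetPdf.aspx/DoDynamicCompression false
```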