User Agent String Documentation


One of the purposes of releasing the IE7 beta is to collect feedback on compatibility with both websites and extensions, and we continue to look at all the reports we receive of sites and applications not working. There are known reasons for some of these, such as the base tag change that we blogged about recently, as well as bugs we are detecting that you’ll see addressed later in the project cycle. However, the most common issue we continue to see with websites that don’t work is that they block access or serve the wrong content as a result of the new User Agent string for IE7. I’ve seen some websites that, when visited using IE7, ask the user to upgrade to the latest version of Internet Explorer or Netscape, and other websites that serve content with very small fonts. Sometimes even some of Microsoft’s own websites have been blocking access! A user can work around an incompatible website by adjusting the User-Agent string as described in a previous blog post. We now also have some additional documentation on MSDN to help:

I want to be clear that we do not advocate blocking access to content based on the User Agent string of the browser, as this practice results in users of unknown and newer versions of browsers being locked out. Rather than blocking unknown user-agents, consider using user-agent sniffing only in isolated cases to work around known limitations of specific user-agents.
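
As a sketch of that guidance (illustrative code, not from the MSDN documentation), a workaround-only sniff matches exactly the legacy versions with the known limitation and treats every unknown or future user-agent as fully capable:

```javascript
// Hypothetical sketch: sniff only to apply a workaround for a specific
// known limitation, never to block. Unknown or future user-agents fall
// through to the modern code path.
function needsLegacyWorkaround(userAgent) {
  // Match only the exact legacy versions with the known limitation.
  var match = /MSIE (\d+)/.exec(userAgent);
  return match !== null && parseInt(match[1], 10) <= 6;
}
```

Anything that does not match the known-bad pattern, including browsers that do not exist yet, gets the uplevel content.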

Thanks,

– Dave

Comments (35)

  1. Anonymous says:

    I agree half-heartedly…

    While I enjoy seeing the recommendations for Firefox or IE7 at the bottom of webpages, some webmasters are going out of their way to block standard-deficient browsers like IE6/7beta1 and Opera.

    When I see a full page message like:

    YOU ARE USING A NON-STANDARDS COMPLIANT BROWSER, GET FIREFOX OR CONTINUE TO SITE AT YOUR OWN RISK.

    It pisses me off.

  2. Anonymous says:

    I’m glad to hear you don’t sanction user-agent sniffing. I think we all know (or should know) by now that such a practice invites maintenance (usually on a schedule roughly equal to the union of the Mozilla and Microsoft browser release schedules). Frankly, however, why not try something really radical and eliminate the user agent string altogether? It seems a lasting tribute to Internet infancy that should be abandoned; an implement of a poor programming practice that should be incapacitated; an explicit barrier to platform and browser agnosticism web standards advocates champion whose time has come and gone. User agent certainly isn’t the only item that meets these criteria, but deleting it would be an admirable (and no doubt highly controversial) first step.

  3. Anonymous says:

    Without the user agent string, browser usage stats become obsolete.

  4. Anonymous says:

    User Agents are necessary for seeing who uses what, especially with Firefox bug tracking. It’s the Javascript that sniffs it that should be outlawed :)

  5. Anonymous says:

    One of the reasons why some sites are having problems with the new User Agent string is because of the version number. The 7.0b designation can cause problems if the developer expects that value to always be numerical. The failure to parse "7.0b" can, of course, cause the web app to return a page intended for users of legacy browsers.
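
A sketch of that pitfall (illustrative code, not from the comment above): parseFloat stops at the first non-numeric character, so it tolerates the "7.0b" token, whereas logic that assumes a purely numeric value yields NaN and falls into the legacy branch.

```javascript
// Extract the MSIE version token from a User-Agent string.
// parseFloat reads "7.0b" as 7.0 instead of failing, which is
// exactly the bug described above when stricter parsing is used.
function msieVersion(userAgent) {
  var match = /MSIE ([^;]+)/.exec(userAgent);
  if (match === null) return null;      // not IE at all
  var version = parseFloat(match[1]);   // tolerates trailing "b"
  return isNaN(version) ? null : version;
}
```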

  6. Anonymous says:

    I use something like this:

    function runningInJScript() {
        var e = new Error(1234, 'TestAMessage');
        if ((e.number != 1234)
            || (e.description === undefined)
            || (e.message != 'TestAMessage')) {
            return false;
        }
        return true;
    }

    If I recall correctly only Microsoft browsers (starting with IE5) will make this function return true.

    More fun with JScript and ECMA-262 non-compliance:

    javascript: var e=Error(1234); alert(e.message)

    javascript: var e=Error('12345678901234567890'); alert(e.message); alert(e.number)

    The compliance issue could be fixed by always setting the message to the string value of the first argument, except when there are two arguments with the first being a number and the second being a string.

  7. Anonymous says:

    Many of the web developers whom I met recently expressed their main concern on Internet Explorer 7 as…

  8. Anonymous says:

    Note that IE pretends to be Mozilla, Opera pretends to be IE, and Safari pretends to have the Gecko engine. All this is because of broken and lame UA sniffers.

    When identifying (and I suggest doing so only for statistical purposes), first check for Opera and Safari ("WebCore") and look for "MSIE" as the last option.
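
The ordering this suggests can be sketched as a chain of substring checks (function name and token list are illustrative; real UA tokens vary by browser version):

```javascript
// Classify a User-Agent string for statistics only: check the
// browsers that impersonate others first, and fall back to "MSIE"
// last, since both Opera and Safari include IE/Gecko-style tokens.
function classifyForStats(userAgent) {
  if (userAgent.indexOf('Opera') !== -1) return 'Opera';
  if (userAgent.indexOf('WebCore') !== -1 ||
      userAgent.indexOf('AppleWebKit') !== -1) return 'Safari';
  if (userAgent.indexOf('Gecko/') !== -1) return 'Gecko';
  if (userAgent.indexOf('MSIE') !== -1) return 'IE';
  return 'Unknown';
}
```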

  9. Anonymous says:

    I think that if the web designer’s done his job properly, there should be no need to employ browser sniffing on a site at all

    In the days of IE4 vs NN it was a total nightmare, and site development would take twice as long as it does now

    Now though I design for Mozilla and IE5.5, but always make sure there’s a text option so at least others can read the content

  10. Anonymous says:

    http://news.zdnet.co.uk/software/applications/0,39020384,39214958,00.htm

    "We have to make this our only priority and put our top people on the job. In addition to our planned Win32/OLE work, we have to get serious about extending and owning HTML as a format, and in the process leverage our existing assets to get ahead."

    Seems you FAILED! HA HA!

  11. Anonymous says:

    We’ve put up some more documentation on MSDN regarding User Agent string and posted on the IE team blog…


  13. Maurits says:

    On

    http://msdn.microsoft.com/library/?url=/workshop/author/dhtml/overview/aboutuseragent.asp

    you quote the following javascript method for showing the user agent:

    javascript:alert(navigator.userAgent)

    While that works, I find the following to be more useful because you can copy the user agent to your clipboard:

    javascript:prompt('Your user agent is', navigator.userAgent); void(0);

  14. Anonymous says:

    batfastad said:

    "I think that if the web designer’s done his job properly, there should be no need to employ browser sniffing on a site at all"

    I see your point, but as things stand today that would mean the web designer had to

    a) limit his own creativity and rely on seriously old technologies to support browsers from last century (or IE6, as it is more commonly known)

    b) use a bunch of CSS hacks to get the desired result in all browsers, and as a result:

    c) use more time and money than he should have to

    …can this really be called doing your job properly? In fact, I would say it’s more or less impossible to do a "proper job" as long as 80% of the browser market belongs to IE6.

  15. Anonymous says:

    Maurits: Good tip. Note that you can hit CTRL+C when a Windows message box is showing to copy its text to the clipboard.

  16. Anonymous says:

    Eric, now that’s a really useful tip. Thanks!

  17. Anonymous says:

    "MSIE 7.0; Windows NT 5.1;"

    Let’s get rid of this "Mozilla/4.0 (compatible" junk, please! There is NO point to it …

    Mozilla 4 does not exist… (they are working on 1.8 now)

    Mozilla 4 was the UA for Netscape 4

    IE isn’t great, but it sure as hell blows away Netscape 4!

    Browsers should identify what they are.

    Browsers should not compare themselves ever, be it comparable, worse, or better browsers.

    It would be nice if the Konqueror team took Gecko out of the Safari string.

    I’d also like to see the ability to confirm a browser through a secondary method should the UA fail. You can turn off a user agent, which REALLY does not help us (web designers/developers), so instead of blocking access completely (as I do on my site for blank UAs) I’d still be willing to run a second, browser-specific test to determine whether the agent in question is IE7 or not. Go a step further and fudge it for non-IE apps trying to pass themselves off as IE when they are not.

    Either way, the MSIE team seems to be going in the right direction! :-)

  18. Maurits says:

    Instead of creating BrowserCaps files it would be nice if the browser was explicit about what it thought it understood.

    How about something like

    User-Agent: Microsoft Internet Explorer 7.0 build 20050902

    Understands: CSS 2.1

    Understands: ECMAScript 1.1

    Understands: HTML 4.0

    Understands: XHTML 1.1

    Understands: …

    Then server-side I could do things like

    if ($Browser->understands("CSS") >= 2.1)

    {

    $Response->Write("<link rel='stylesheet' … >");

    }
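
The "Understands:" headers above are purely hypothetical, but the server-side part could be sketched like this (all names are invented for illustration):

```javascript
// Parse hypothetical "Understands: CSS 2.1"-style header lines into
// a capability map keyed by technology name, so a handler can branch
// on the claimed support level.
function parseUnderstands(headerLines) {
  var caps = {};
  for (var i = 0; i < headerLines.length; i++) {
    var match = /^Understands:\s*(\S+)\s+([\d.]+)/.exec(headerLines[i]);
    if (match) caps[match[1]] = parseFloat(match[2]);
  }
  return caps;
}
```

A handler could then test `caps['CSS'] >= 2.1` before writing the stylesheet link, as in the pseudocode above. The obvious cost, as noted in a later comment, is extra data on every request.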

  19. Anonymous says:

    I just attended an online chat with members of the IE7 team and was allowed to share portions of it. Here are some interesting quotes which I just print here without any further judgement. Some of them are really interesting, some are encouraging, some ar

  20. Anonymous says:

    I don’t agree with Browser sniffing at all.

  21. Anonymous says:

    <<Lets get rid of this "Mozilla/4.0 (compatible" junk please! There is NO point to it … >>

    The point to it is that if you remove this, a HUGE number of sites on the internet would break. You can use Fiddler to strip this out to see for yourself.

    <<it would be nice if the browser was explicit about what it thought it understood>>

    The downside to that is that you’re asking to pass a HUGE amount of redundant data with every request.

  22. Anonymous says:

    <<I’d also like to see the ability to confirm a browser through a secondary method should the UA fail. You can turn off a useragent which REALLY does not help us (web designers/developers)>>

    You can still use either conditional comments or JScript’s Navigator object to perform a user-agent test on the client side.

    << thus instead of blocking access completely (as I have on my site to blank UAs)>>

    As noted above, this is not recommended. Rather than blocking missing or unknown UAs, simply treat them as modern/uplevel. That way, if your content chokes a browser that doesn’t identify itself, it’s not your fault, it’s the browser’s fault. This will simplify your ongoing site maintenance and will ensure users of unknown or future browsers are not inconvenienced.

  23. Anonymous says:

    How would sites break if you removed that portion of the UA string?

    IE conditional comments would be an effective way to test for browsers claiming to or not to be IE.
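
A minimal sketch of that test: only a genuine IE (Windows) engine parses the conditional comment, regardless of what the UA string claims, so the flag below is set only in real IE:

```html
<!--[if IE]>
  <script type="text/javascript">var isRealIE = true;</script>
<![endif]-->
<script type="text/javascript">
  // In any non-IE browser the block above is an ordinary HTML comment,
  // so isRealIE stays undefined even if the UA string says "MSIE".
</script>
```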

  24. Anonymous says:

    [eliminate the user agent string altogether? It seems a lasting tribute to Internet infancy that should be abandoned; an implement of a poor programming practice that should be incapacitated]

    Well written, mschulz.

    Yet eliminating the user agent won’t solve the issue. It would still be possible to "incapacitate" (nice term, I liked it) access by merely checking the presence or absence of a few IE-specific javascript properties or methods.

    The problem would come back to bite us even after eliminating the user agent, you see.

    The problem we are dealing with is not a lack of heed by big corporations like Microsoft or Google (although they bicker, Google is just another company that many purists accuse of writing non-compliant markup, so go figure).

    The problem does not reside with IE.

    The problem does not reside with the User Agent.

    The problem resides with a community of front end developers, an ample minority of whom are simply obtuse.

    Artificial intelligence, you know: no match for natural stupidity.

    If in this day and age somebody writes such markup or such CSS that he/she has to prevent access to a browser, the right deduction is that he/she doesn’t know how to write markup or CSS.

    We ALREADY have so many options at hand for building websites that declaring dearth amidst such wealth, and declaring one is BOUND to prevent some accesses, means somebody has perception issues.

    I would understand preventing IE3, or maybe IE4 at most.

    But when we’re at gen 5, that’s simply ridiculous.

    The internet is ALL about access (it’s a huge DATABASE, NOT a TV set to implement flash tricks on), and if one excludes a platform like IE6 (!) because it cannot parse a silly fixed layer where they say "see, this is muggy, my cute cat" (which of course was absolutely, oh so absolutely, mission critical for them), or because it cannot capitalize that so important first letter of a three line paragraph where he/she shows us that naughty pic taken at the camping – well, then they worry about access without reason or, alternatively, they bring forth the necessity to filter accesses out for the wrong motives.

    In fact, the real reason access to their sites should be prevented, is that there was nothing worth seeing.

    They are worrying without reason.

    In the year 2005, a website that excludes a generation-5 browser does not denounce problems with the browser, but with the OWNER.

  25. Anonymous says:

    In case you want to do browser sniffing without using JavaScript I’ve been keeping Microsoft’s woefully out of date browscap.ini file updated for many years now.

    There is usually an updated download every Sunday.

    My file(s) include the popular browsers and user agents such as search engines, e-mail harvesters, media players and more.

  26. Anonymous says:

    Alberto,

    Great insight. I was partially playing devil’s advocate with my initial post mostly out of exasperation. How many times must we go through this!?

    I couldn’t agree more with you about inappropriate and/or frivolous DHTML. It’s probably also true that eliminating the user agent string would do nothing to prevent it any more than eliminating the user agent string would improve the quality of DHTML code in general.

    I do believe that if more interface designers and developers were familiar with basic web usability concepts, they might resist some of the sizzle in exchange for more readily accessible steak. 😉

  27. Anonymous says:

    That’s true, mschulz. As a matter of fact, our community (and I wish I could mean by it only we front end developers or just webmasters, but unfortunately we draw into our mud also big companies like Microsoft and Google) is producing not only a completely false paradigm about what the internet is and what the standards were meant for, but is also implicitly eroding the real purpose of the standards: that of stressing the PUBLIC, not the CONFIDENTIAL, nature of online resources.

    It is becoming such nonsense that I wonder whether IE7 will truly solve the issue: when we encounter websites that incapacitate IE6 and then, when revealed to an allowed browser, disclose not one single markup or CSS line that would have been problematic in the least for IE6 (just blogs, you know!), we finally understand that the problem has never been the sake of the standards for which they ALLEGE they fight the crusade.

    In this cold war logic, they may WELL end up unsatisfied by IE7 in the same fashion they alleged they were by every IE before it.

    It is going to be an issue: for now they are the ample minority I was previously referring to, but they grow, and if they grow unresisted, they will eventually establish the idea that you either browse a website with whatever the owner wants, or you stay off the net.

    Now, the MOST noncompliant browsers of all are search engine robots: but they do not disable those. Of course.

    I think we should seriously rethink this whole issue, and what we want the internet to be: I mean "we" as in we front end developers, for it seems to me that our problems NEVER truly depended on Microsoft.

    I have actually DIGESTED the whole of the major issues raised in this MSDN blog in a way that is somewhat conclusive, and as such it can be worth sharing regardless of agreements or disagreements:

    http://www.unitedscripters.com/spellbinder/internetexplorer.html

    Our community is blaming Microsoft, and Google too for that matter, for reasons and motives that have nothing to do with either.

    We have increasing hordes of front end CSS DEVELOPERS who spend their days telling Google, Microsoft, and Chris Wilson that they ought to learn how to program properly.

    Are we JOKING?

    No, they aren’t. So it’s gonna be a problem.

    It is our general, underlying web culture that has to be fixed in the FIRST place, not IE bugs.

    Contents and humans come FIRST, not Css and machines.

  28. Anonymous says:

    Where are you people visiting that blocks IE?

    I’ve seen any number of sites that block non-IE browsers, but none that block any version of IE.

    BTW: If you want to detect JScript, it’s much faster and easier to use conditional compilation. Here’s a script that will determine which rendering engine is used:

    <script type="text/jscript">

    /*@cc_on @if(@_win32) document.write(‘Trident’);

    @else document.write(‘Tasman’); @end @*/

    </script>

    <script type="application/x-javascript">

    if(document.layers) document.write(‘Netscape4’);

    else if(!window.clientInformation) document.write(‘Gecko’);

    else if(!document.createProcessingInstruction) document.write(‘Presto’);

    else if(document.compatMode==’CSS1Compat’) document.write(‘Cab’);

    else document.write(‘KHTML’);

    </script>

    <noscript>unknown</noscript>

    From http://webcoder.info/reference/BrowserFiltering.html

  29. Anonymous says:

    [Where are you people visiting that blocks IE?]

    Brianiac – unfortunately there are plenty of such sites.

    An example out of many could be (without any intention to single it out as more characteristic or -god forbid- worse than others):

    http://www.stud.ntnu.no/~shane/

    Yet that instance is particularly curious for one feature at least: because it redirects IE to the Opera download webpage. Yet if you browse it with Opera, it claims you’re running IE and goes to the Opera download page as well.

    I told the owner about this, and of course he said it depended on Opera having the "identify as MSIE" option turned on – our front end developers, the type of community we are breeding, never have anything to blame themselves for: but they always have something to complain to a browser about.

    But you can identify Opera with _absolute_ certainty; and _yet_ though somebody may not figure out how to detect browsers correctly (which I do NOT blame anyone for) yet the same persons produce on their website this type of IE bias (which I "blame" them for: incompetence in detection and redirection shouldn’t stand in the same line).

    As I stressed in a previous comment, we are truly witnessing a community of web developers who is going to grow crammed with prejudices and misunderstood lessons.

    We are indeed dealing with something that has gone rotten, if our front end developers are INFINITELY more concerned with how their blogs look to an online CSS parser, and with what online parsers say about their CSS, than with how they look before the eyes of real humans – humans in the flesh and in command of the browser they prefer, the very humans they send away – and with what education or information __humans__ may get from their contents rather than from their CSS.

    We can see developers producing what they claim being high quality perfect css, abysmally low quality contents (typical case: posts of heated insults against IE with abundance of profanity, but claim to allegedly refined and classy Css), and strutting around with this match as if it would have even been a W3C endorsed practice.

    The W3C sits mum about this – well ok, that’s not its job.

    Shall __WE__ too sit mum and let this dramatically misunderstood culture about what the internet is, go on undisturbed?

  30. Anonymous says:

    The script you linked to on MSDN detects Opera as Internet Explorer. It doesn’t take much to fix it, because Opera contains "Opera" in the UA string.

    I think browser sniffing is a necessary evil when trying to support anything beyond basic HTML and DOM, since it’s necessary to detect Internet Explorer, Safari, and Opera to apply workarounds to bugs they have so it looks like it does on Firefox, the most standards-compliant browser. I don’t agree with doing it to block out other browsers, though. As Dave said, only do it for workarounds.

    It’s too bad Microsoft doesn’t pay more attention to what its IE developers say.

    Let’s look at what Microsoft still does to block out other browsers:

    Many more sites block out other browsers because they use ActiveX for things like embedding Windows Media Player and plugin providers (especially Microsoft) don’t provide non-ActiveX plugins. Not only that, but MSN has blocked browsers in the past, and MSN groups doesn’t allow anything but plain-text comments when you are using non-IE browsers because it requires a rich text editor, even if it would be very easy to allow something like BBCode or to insert it through the JS DOM instead (look at phpBB). MSDN also doesn’t show dynamic content (such as dynamic menus) in browsers other than IE.

    Many sites work their butts off to support IE with a million workarounds, and not just turn off anything but basic HTML, but Microsoft still won’t make their sites work correctly in any other browsers when it’s fully within their power. So if you aren’t for browser sniffing to block access, then make sure Microsoft’s site doesn’t do it.

    As far as it goes for blocking access to IE for personal sites, the same thing was happening to Netscape 4 when IE 5.5 came out, and unfortunately it’s going to continue happening to IE until they fix their standards. It’s not necessarily right or wrong, just the result of blog writers becoming annoyed by having to do continuous workarounds for IE when their page renders correctly in other browsers.

    As for me, I work in an environment where I have to support IE, even if I prefer Firefox. My blog supports IE too (temporarily down because I moved). But some people are not willing to go through the trouble, and I don’t blame them. If you really want to read what they have to say, open up Firefox, otherwise, it’s their right.

    My suggestion to the IE team is to look at what happened when Netscape 6 was released too early. If you don’t do more to support standards before you release IE7, you’ll see a backlash like you couldn’t possibly imagine, because you’ll have two browser versions out there — one with a different set of bugs than the other.

    I don’t really believe you have the channels open enough for good feedback. I’ve found bugs on IE6, but my perception is you don’t really care, so why should I even bother trying to report it? Instead, I just do a workaround and pray for the day that IE is no more.