Updates to the IE Testing Center


One of the major investments we’re making during the IE9 project is support for more web standards. Web developers all around the world consistently gave us feedback that they wanted to use the same pages with the same markup across browsers. By working closely with the W3C and its members on the newest web standards, we can make that dream a reality for web developers.

118 New Test Cases Submitted to the W3C

Today, we released an updated version of the IE9 Platform Preview build. In conjunction with implementing support for several more web standards, we developed more test cases. These new test cases are available as usual on the IE Testing Center. We’re formally submitting these 118 new test cases to the W3C for review, feedback, and inclusion into the official W3C test suites for each of these web standards.

In addition, we have written 1309 JavaScript test cases and are making those available to the web development community as well. These support Ecma-262-5, Ecma International's ECMAScript Fifth Edition specification (also known as ES5). Ecma is currently establishing a test case submission process. Once it is in place, we will submit these cases through it.
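
To give a sense of what these look like, an ES5 conformance test is typically a small, self-contained script that checks one testable statement from the specification. The sketch below is illustrative only; the actual submitted cases use their own harness, and the function name and Pass/Fail reporting here are made up:

    // Illustrative sketch only: a self-contained check in the spirit of an
    // ES5 conformance test. The function name and Pass/Fail strings are
    // invented; the real submitted cases use their own harness.
    function testDefinePropertyNotWritable() {
        var obj = {};
        // ES5 15.2.3.6: Object.defineProperty with writable defaulting to false
        Object.defineProperty(obj, "answer", { value: 42 });
        // The property must exist with the specified value...
        if (obj.answer !== 42) { return "Fail"; }
        // ...and a non-strict assignment to it must silently do nothing.
        obj.answer = 0;
        return obj.answer === 42 ? "Pass" : "Fail";
    }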

The IE Testing Center

One of the questions that we hear each time we write, publish, and submit new test cases to the W3C is, “How should I think about the IE Testing Center?” There are several dimensions to the IE Testing Center, so I’ll go through them one by one.

The site

The IE Testing Center is part of the current IE project. As we did during the IE8 project, we will make these test cases available to the web community immediately and submit them to the W3C working groups for inclusion in the official test suites.

There are two main tables on the IE Testing Center. The first table is merely a results rollup of the second table. The second table has links to each test case we’ve developed during the IE9 project for each standards specification.

The columns (aka Browsers)

The columns represent the most recent, broadly available version of each major browser engine: Gecko, WebKit, Trident, and Presto. Because there are two major WebKit-based browsers and they don’t always ship the same version of the engine, Google Chrome and Apple Safari have separate columns. Based on feedback from other W3C members, we also added IE8 to the table for consistency.

The rows (aka Standards)

The rows of the first table list the core technologies that web developers have told us matter most to them among the modern, still-under-development web standards.

Proposed HTML5 features have received a lot of attention in recent months. In practice, the functionality described in the W3C’s HTML5 specification actually depends heavily on many other W3C specifications. To make sure that HTML5 works correctly, it’s important to test some of these other foundational technologies as well. This is basically the greater metropolitan area of “HTML5”, which includes HTML5 itself plus the suburbs of SVG 1.1 2nd Edition, CSS3, DOM L2 and L3, and ECMAScript 5.
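
As a rough illustration of how these specifications interlock, here is a quick feature-detection sketch. The one-line checks below are illustrative stand-ins for whole test suites, not actual Testing Center cases:

    // Illustrative only: quick feature checks showing how "HTML5" in practice
    // leans on neighboring specifications. Each one-liner stands in for a
    // whole test suite; none of these checks is exhaustive.
    var support = {
        // HTML5 itself: the <canvas> element
        canvas: !!document.createElement("canvas").getContext,
        // SVG 1.1: namespaced element creation
        svg: !!(document.createElementNS && document.createElementNS(
            "http://www.w3.org/2000/svg", "svg").createSVGRect),
        // CSS3: a representative property (border-radius)
        css3: "borderRadius" in document.documentElement.style,
        // DOM Level 2 Events: standard event registration
        domEvents: !!document.addEventListener,
        // ECMAScript 5: representative built-ins
        es5: !!(Object.keys && Array.prototype.indexOf)
    };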

The second table includes links to each test case that Microsoft has submitted to each W3C working group for inclusion into their official test suites. These cases are a proper subset of all the cases that have been officially submitted to the working groups.

The cells (aka the results)

Each cell in the first table is a summary of the pass rate for each specification across each of the major shipping browser versions. These are contrasted with the most recent IE9 Platform Preview. The cell coloring is simply Microsoft Excel 2007’s conditional formatting Green – Yellow – Red color scale, which provides a smooth color gradient from Red to Green as the pass percentage increases.
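
For the curious, the idea behind such a color scale fits in a few lines. The straight red-to-yellow-to-green interpolation below is an assumption for illustration; Excel’s exact color stops differ:

    // A minimal sketch of the color-scale idea. Excel's actual
    // Green - Yellow - Red stops are an assumption not reproduced here;
    // this simple interpolation just shows the principle: greener as
    // the pass percentage rises.
    function passRateColor(percent) {
        var r, g;
        if (percent <= 50) {
            r = 255;                                      // red toward yellow
            g = Math.round(255 * (percent / 50));
        } else {
            r = Math.round(255 * ((100 - percent) / 50)); // yellow toward green
            g = 255;
        }
        return "rgb(" + r + ", " + g + ", 0)";
    }
    // passRateColor(0) === "rgb(255, 0, 0)"; passRateColor(100) === "rgb(0, 255, 0)"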

Each cell in the second table is the test result for a specific test case on a given browser. These are simply listed as Pass/Fail and colored Green/Red, respectively.

Another question that comes up a lot is “Why is the IE9 Platform Preview so green while others aren’t?” When we decide to implement a given web standard, we methodically walk through the specification and start building the test cases for the spec while also building the implementation. It resembles test-driven development, which works well for web standards as long as there is a comprehensive test suite. If we fundamentally change a test case based on feedback, that change usually drops the current IE9 Platform Preview’s pass rate.
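
To give a flavor of that process, here is a hypothetical case in the self-describing Pass/Fail style such suites use. The tiny harness and the case name are invented for illustration:

    // A hypothetical example of the spec-driven pattern described above:
    // each case asserts one testable statement from a specification and
    // reports Pass or Fail. The runCase harness and case name are made up.
    function runCase(name, fn) {
        var pass;
        try { pass = fn() === true; } catch (e) { pass = false; }
        document.write(name + ": " + (pass ? "Pass" : "Fail") + "<br>");
    }

    // One statement from the HTML5 canvas 2D context: fillStyle starts
    // as black, serialized as "#000000".
    runCase("canvas-2d-fillStyle-default", function () {
        var ctx = document.createElement("canvas").getContext("2d");
        return ctx.fillStyle === "#000000";
    });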

Feedback on the test cases

As always, we look forward to working closely with the W3C and its members on the web standards process. If you have actionable feedback on a specific test case, please use the W3C mailing list for the appropriate working group. For JavaScript test cases, please submit your feedback using Connect. We encourage the other browser vendors to help the W3C finish the HTML5 specification by providing additional tests to the official HTML5 testing task force.

Thank you,
Jason Upton
Test Manager, Internet Explorer

Comments (4)

  1. Anonymous says:

    Thank you for supporting HTML5 canvas!

  2. Anonymous says:

    It's kind of sad that everyone got as worked up as they did over your high test-case scores. You developed the suite, so of COURSE you also made the effort to fix those bugs in your implementation.

    At any rate, keep up the good work!

  3. Anonymous says:

    Thanks for Canvas support and for these explanations clarifying the test results (and IE's dark green!). These explanations should appear on the Testing Center page!

    Adding the ES5 tests is a fantastic idea!

  4. Anonymous says:

    @Michael Kozakewich: it is indeed reasonable to assume that the test suite should show a high score for IE 9; however, the tests used look more like regression tests than anything (a benchmark against which the IE team can check that one build doesn't fail where the previous one succeeded); if it were a 'real' test suite, IE 9's score would be lower – and everybody could see what works, what doesn't, and what is intended to be implemented in subsequent builds.

    For that reason, third-party test suites like Acid3 (which is made up not of a hundred but more like a thousand tests, wrapped in 100 units categorized into 6 areas) are a better indication of the overall progress in a browser; in Acid3's case, a successful test more often than not means a rather complete implementation of a feature. The SVG test suite is even better in that regard, since it is proposed by the body that wrote and now maintains the standard.

    The reason some people get worked up is because of a form of misrepresentation: Microsoft pushes its HTML5 test suite as the benchmark it uses to assess how advanced its next browser is; it does so by playing on its strengths, but also fails to indicate its results on other test suites – on which other browsers may show IE up quite badly (current scores on SVG, for example, are rather embarrassing: IE 9pre2 failed 69% of the SVG test suite, Opera failed less than 6%).

    What results do we see currently?

    – MS HTML5 test suite results

    – Acid3 results

    – SunSpider results.

    What do we get from these?

    – IE 9 has a fine HTML 5 implementation, -according-to-MS- (other tests show a different picture)

    – IE 9 has an improved score, but is still the worst of all released and supported browsers, 10 points behind Firefox 3.5 (and 14 behind Minefield, which will stop there, as Mozilla has no intention of committing the two missing features – they're getting deprecated)

    – SunSpider results are good; but, contrary to other tests, it is a speed test: IE 6 passed it, after all. Is speed more important than features? It's different – a train may be faster than a car, but it can't go off-road. Speed gains are good, but, considering how much faster all browsers have gotten now, further improvements are more a bonus than a real advantage.