User Experience Maturity Model – Microsoft Style

Internally at MS we've been talking about how to define and assess an organisation's "user experience capability". This matters because organisations differ widely in their sophistication at defining, creating and assessing user experience, and when I talk to an organisation I need to be able to assess where they are at and what their next steps are in developing their user experience skills and capacity. There's nothing to be gained by overwhelming an organisation with talk of commercial ethnography when they are still deciding whether "useability" contains an "e".

So we've been discussing a number of attributes that identify "basic", "standard", "advanced" and "dynamic" organisations with respect to UX.

To my mind, one simple indicator is "who does the user interface design around here?"

In my experience, this is what you typically see (biased towards software development projects rather than web projects, which would be slightly different - maybe a future post):

| Level    | Who does UI design?                      | When?                                                                                                                                   |
|----------|------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------|
| Basic    | Developer                                | At development time.                                                                                                                     |
| Standard | Architect or business analyst            | Key screens specified at requirements definition time; the rest are left to the developer, sometimes in consultation with the business analyst. |
| Advanced | External or internal UI design resources | At design stage; detailed design done by business analysts or developers, probably using a style guide.                                  |
| Dynamic  | Dedicated team UI design resources       | Bulk of design done at definition and design stages, with detailed design and design refinement at development stage.                    |

Software testers and technical writers often get involved too at the "standard" level. Note that my definition of "basic" is not meant to imply developers CAN'T do user interface design. With the right skills and support they certainly can (and have to). These are just my 'indicators' of UX maturity.

Your mileage may vary - what is your experience?

For other definitions, see also the "Usability Maturity Model" (more info: "Beyond usability testing: user-centred design and organisational maturity" and "Usability Maturity Model"). Does anyone have any experience applying this (aging?) model?

Also, here's Jakob Nielsen's take: Corporate Usability Maturity.

Comments (2)

  1. Todd says:

    I am surprised that product iteration has not warranted a mention in this post!  Regardless of who does it, iteration is a key element.  I have had experience with one organisation that regularly uses external UI design resources ('advanced' according to the above definitions), but because their software lifecycle process is still linear, the UI must be done 'correctly' the first time, without the chance of any formative design evaluations.  Great if it works, but (as we can all attest) it rarely does.

    So while ‘who’ does UX is one indicator, I do think that another indicator is whether the lifecycle typically employed allows for formative feedback.

    Another indicator that I have informally used is when the company in question gets the user involved.  Never, early, middle, late, or throughout?  Do they try to ascertain what the user would actually want from the product up front?  Are users only involved during evaluation sessions?  Or are they totally ignored?

    A few thoughts…

  2. Garry Trinder says:

    You’re right Todd – and internally we’re talking about a bunch of different indicators, including user involvement and process. WHO does design is just one of them that I’ve been thinking about. Perhaps less interesting at the advanced/dynamic level, but more telling at basic/standard.
