
Tell companies what you think: a satisfaction survey sometimes makes a difference

Carol Sarler of The Times in the UK asks, after she “had six questionnaires to fill in last week… does anyone care what my opinion is?”

“It was only a coffee and a glass of indifferent wine, shared with a friend on last Thursday’s soggy afternoon. Nevertheless, in order “to improve quality and service”, we were presented with a card bearing boxes, begging to be ticked, that invited us to grade every aspect of our “visit” that day – thus bringing to six the number of consumer surveys thrust my way during last week alone.

“There had also been one from an airline, one from a hotel, two following online purchases and one from the NHS. Never have our opinions been so fulsomely solicited, as we are flattered into giving time, thought and energy to people and companies who care so very much about us that they will put our efforts directly towards – oyez! – our future “customer satisfaction”.

“Where was the box that said: “Dear Mr Delta, your airline sucks but as long as it is the only one that flies to my most frequent destination I am forced to use it”? And even if there were such a box, if he knew I was trapped by route map, where’s the incentive for him to make the improvements that his questionnaire, by its very existence, infers he might?”

That’s a good question: what do companies do with the information and feedback that customers provide on satisfaction surveys?

We don’t rely only on third-party surveys by analysts and published statistics such as those in the American Customer Satisfaction Index.

At Microsoft, we also solicit feedback from our customers and partners directly, as I’ve noted on this blog previously.  As Mark notes in the UK, we work hard to provide solutions and services that meet the needs of our customers.  To that end, we ask some of our customers and partners to provide feedback in formal satisfaction surveys, research that Mark says will “directly shape not only Microsoft’s future products and services but also how we do business here in the UK.”

A few years ago (back in 2003), we created a small, agile group that helps the company focus on improving the customer experience by analyzing the data we get from our surveys. (I posted this item about the effort and Computerworld’s article.)  Since then, we’ve increased the attention we pay to feedback from our customers and partners and improved the processes we use to gather it, all to improve their experience and increase their satisfaction with our products and services.

But, if you get a survey, does anyone actually pay attention to the data and make needed course corrections?  As Computerworld reported, sometimes it’s difficult to get employees to pay attention. 

“… Microsoft expanded its customer surveying after Bliss arrived. But getting executives to pay attention and use the results to improve their processes was more difficult.

“We did those classic customer satisfaction surveys — I won’t even tell you how much money we spent on them — and then they would land like a brick on people’s desks,” she said.”

That’s not a challenge today with our senior leaders: Microsoft provides incentives to our management and leadership via the SPSA program (see the Microsoft annual proxy report for 2007 for more), “designed to focus our top leaders on shared business goals to guide our long-term growth and address our biggest challenges by rewarding participants based on growth in customer satisfaction, unit volumes of our Windows products, usage of our developer tools, and desktop application deployment over a multi-year performance period.”

We also review satisfaction and customer feedback regularly with product, support and sales & marketing teams, using data we collect in our Response Management systems, direct product feedback via tools such as Microsoft Connect, Dr. Watson and software quality metrics (aka SQM, or ‘squim’) we obtain through our Customer Experience Improvement Program (CEIP).  These are all part of the effort at Microsoft to automate product error reporting and analysis, making it easier for our product teams to understand the error reports that are sent back to Microsoft when you click ‘send’ in the error message box (as noted here on Abhinaba’s blog).

Jensen wrote an overview of SQM in his post on the Office UI blog:

“So much of what we did was based on feel, estimation, and guesswork. How much that was true only became clear with the introduction of a technology called SQM (pronounced “skwim”).

“SQM, which stands for “Service Quality Monitoring,” is our internal name for what became known externally as the Customer Experience Improvement Program. It works like this: Office 2003 users have the opportunity to opt in to the program. From these people, we collect anonymous, non-traceable data points detailing how the software is used and on what kind of hardware. (Of course, no personally identifiable data is collected whatsoever.)

“As designers, we define data points we’re interested in learning about and the software is instrumented to collect that data. All of the incoming data is then aggregated together on a huge server where people like me use it to help drive decisions.”
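To make the idea concrete, here is a minimal sketch of what opt-in, anonymous usage instrumentation of that sort might look like. This is purely illustrative: the class and method names (UsageTelemetry, record, flush) are my own assumptions for the example, not Microsoft’s actual SQM/CEIP implementation.

```python
# Hypothetical sketch of opt-in usage instrumentation, loosely modeled on the
# SQM/CEIP description above. Names and structure are illustrative only.
import json
import uuid
from collections import Counter
from datetime import datetime, timezone

class UsageTelemetry:
    """Collects anonymous, aggregate data points only if the user has opted in."""

    def __init__(self, opted_in: bool):
        self.opted_in = opted_in
        self.session_id = str(uuid.uuid4())   # random per-session ID, not tied to a person
        self.counters = Counter()

    def record(self, datapoint: str) -> None:
        """Increment a named data point, e.g. which command or feature was used."""
        if not self.opted_in:
            return                            # respect the opt-in choice
        self.counters[datapoint] += 1

    def flush(self) -> str:
        """Serialize the aggregated counts for upload to a collection server."""
        payload = {
            "session": self.session_id,
            "sent_at": datetime.now(timezone.utc).isoformat(),
            "datapoints": dict(self.counters),
        }
        return json.dumps(payload)

# Usage: designers decide which data points matter; the client simply counts them.
telemetry = UsageTelemetry(opted_in=True)
telemetry.record("command.paste")
telemetry.record("command.paste")
telemetry.record("dialog.font.opened")
print(telemetry.flush())
```

The point of the sketch is the shape of the data: only named counters and a random session identifier leave the machine, and the interesting analysis happens after aggregation on the server side, as Jensen describes.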

Back in 2004, Chris Pratley wrote about the Watson process, where we use automated tools to help us better understand the issues that happen in the “diverse set of environments and activities that our actual customers have.”

“Everybody has an anecdote about problems, but what are anecdotes worth? What is the true scale of the problem? Is everything random, or are there real problems shared by many people? Watson to the rescue… [to] categorize every crash our users have, and with their permission, collect details of the crash environment and upload those to our servers.”
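The key step Chris describes, turning individual crash reports into categories, is essentially a bucketing problem: reports that share the same failure signature are grouped so the most widespread problems surface first. Here is a minimal sketch of that idea, with the caveat that the field names and the bucketing rule are assumptions for illustration, not the actual Watson implementation.

```python
# Illustrative crash-bucketing sketch: group reports by a failure "signature"
# so the most common problems rise to the top. Field names are assumed.
from collections import Counter
from typing import Dict, List, Tuple

def signature(report: Dict[str, str]) -> str:
    """Build a bucket key from the faulting module, function, and offset."""
    return f"{report['module']}!{report['function']}+{report['offset']}"

def top_buckets(reports: List[Dict[str, str]], n: int = 3) -> List[Tuple[str, int]]:
    """Count how many reports fall into each bucket and return the largest."""
    buckets = Counter(signature(r) for r in reports)
    return buckets.most_common(n)

# Usage: a handful of incoming reports; two share the same signature.
incoming = [
    {"module": "mso.dll", "function": "RenderTable", "offset": "0x1a2b"},
    {"module": "mso.dll", "function": "RenderTable", "offset": "0x0042"},
    {"module": "gdiplus.dll", "function": "DrawImage", "offset": "0x0042"},
]
print(top_buckets(incoming))   # the most frequent crash signatures, with counts
```

Bucketing is what turns anecdotes into scale: once thousands of reports collapse into a ranked list of signatures, a team can see whether failures are random noise or a handful of real problems shared by many people.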

This product information, combined with customer satisfaction data, helps us focus on the right improvements and solutions.

There are also examples where we took customer feedback from our various listening systems and surveys and used it to improve the licensing experience, as we noted last year.  Our licensing team made improvements to our Volume Licensing program (to simplify overall license management) based on customer feedback.  The team made some important changes to license agreements, including fewer pages, greater efficiency, simplicity and consistency.

Teams also work cross-group — across customer support and product teams — to take the feedback we get from the field (the “voice of the customer”) directly to the product groups in order to make impactful improvements.

Kathleen Hogan, our Corporate Vice President of Worldwide Customer Service and Support and one of our corporate sponsors of the Customer and Partner Experience (CPE) effort, often talks about the importance of the customer feedback loop.  Ultimately, if we’re successful at learning from these experiences, we will help our customers avoid or eliminate problems before they happen.  Kathleen noted here some of the proactive improvements we’ve made through analysis of customer and partner support incidents, in this case to improve the configuration process for Exchange Server.

“These issues were difficult for customers to identify within their environments, and CSS responded quickly, working with the Exchange product team to create the Exchange Best Practices Analyzer (ExBPA) tool. Based on further implementation analysis and in partnership with the Exchange Server product team and Premier Field Engineering, the Exchange Risk Assessment Program (ExRAP) was established, which incorporates the ExBPA tool. This combination of service delivery program and tools provides best practices to our enterprise customers around how they should implement and optimize Exchange Server.”

This is an example of the virtuous cycle of how we listen and respond.  So, the next time you are presented with an opportunity to provide feedback, think about how the information will be used by the company.  At Microsoft, we take it seriously: teams around the world look at the data we collect through customer surveys and act on it.  It’s about doing the right thing to improve our products, which results in overall business improvements and success.

And (as noted above in the Computerworld article) it doesn’t hurt that senior staff and executives at Microsoft are rewarded and held accountable against various metrics: not only are satisfaction metrics used by teams, but the customer and partner satisfaction results that feed the SPSA program also affect “a fairly big part of their bonus.”

Tags: Customer satisfaction, CPE, Kathleen Hogan, survey, Carol Sarler, Microsoft.
