Using personas to help develop setup tests


Last week I wrote about two different types of testing.  One was setup testing: the very time-consuming process of ensuring an application installs correctly.  The other was using personas to develop tests that ensure people can use our software in the specific scenarios for which it was designed.  We can tie the two together, and here is how.


Suppose Mort is the name of a system administrator at a medium-sized company.  Those readers with a tester mentality will immediately ask "What do you mean by 'medium-sized' company?"  Fair enough.  We'll say Mort is in charge of 1400 users on about 1000 computers across two remote locations, Dallas and Houston.  His task is to deploy the newest version of OneNote to all the users in his company.  We'll assume there are field associates as well who do not have 24-hour access to the LAN connecting his Houston and Dallas offices.  Due to his previous efforts, everyone in his company has standardized on Windows 2003, so he does not have to cope with Vista or Windows XP considerations.  Almost all users have Office 2003 with OneNote 2003, although the senior-level department heads have already started migrating to Office 2007.  And there are a few "road warriors" who are still using Office 2000.  His company is multilingual, with documents in both English and Spanish.  His company also has one legal consideration that affects OneNote: Mort has to ensure all users set the amount of backup data kept to 7 days instead of using the default of 2.


Already we had to get more detail about what "medium" meant.  To start ensuring OneNote (as part of Office) can be deployed, we need to understand everything facing Mort.  Obviously, he has far more considerations to face than a home user who can simply "run setup and be done."


Here's one way Mort can reach his goal.  Create an administrative setup point for Office.  This step runs through setup and creates a customized install location on a file share from which users can run setup.  He will customize the setup so that policy is applied to the install, ensuring the backup data setting is 7 days.  He will also ensure both Spanish and English support are enabled.  Since he has a Windows domain, each user can have a script file run when logging into the domain.  To this script, he will add the command to launch Office setup.
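A rough sketch of what that logon script might look like.  The share path, install path, and setup switches here are assumptions for illustration, not Mort's actual configuration; `setup.exe /a` is the standard Windows Installer administrative install, and `/qb` requests a basic, mostly unattended UI.

```shell
REM --- Run once by the admin to create the administrative install point: ---
REM   setup.exe /a
REM   (prompts for a target share, e.g. \\server\office\admin, and lets
REM    Mort bake in language support and the 7-day backup policy)

REM --- Hypothetical per-user domain logon script fragment: ---
REM Only launch setup if OneNote is not already installed on this machine.
IF NOT EXIST "%ProgramFiles%\Microsoft Office\OFFICE11\ONENOTE.EXE" (
    \\server\office\admin\setup.exe /qb
)
```

Because the script runs at every logon, the existence check matters: without it, setup would relaunch for users who already upgraded.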


Next, we will need to test his routine.  We need some computers with Office 2000 to verify this upgrade scenario.  Some will have Spanish support installed for Office 2000; some won't.  We'll either simulate low bandwidth via bandwidth-throttling tools, or break out some modems for real-world use.  (True story: I used nothing but a 56K modem for 2 months when testing Outlook 2007 to ensure cached mode was usable in that configuration.  Honestly, except for large attachments, I never even noticed.)  We will also need some Windows 2003 machines with Office 2003 installed.  Two of the machines will have Office 2007 installed instead.
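To see why we need to be deliberate about which machines to image, it helps to enumerate the full configuration matrix implied above.  This is a sketch with illustrative configuration names, not a real lab inventory:

```python
# Enumerate the raw test matrix for Mort's deployment scenario.
from itertools import product

operating_systems = ["Windows 2003", "Windows XP", "Windows Vista"]
office_versions = ["Office 2000", "Office 2003", "Office 2007"]
languages = ["English only", "English + Spanish"]
bandwidth = ["LAN", "56K modem / throttled"]

matrix = list(product(operating_systems, office_versions, languages, bandwidth))
print(len(matrix))  # 3 * 3 * 2 * 2 = 36 raw combinations

# We only have 7-8 machines, so we prioritize the combinations that match
# real users -- e.g. the road warriors on Office 2000 over a modem.
priority = [c for c in matrix
            if c[1] == "Office 2000" and c[3] == "56K modem / throttled"]
print(len(priority))  # 6 configurations worth covering first
```

Even this small scenario produces 36 raw combinations, which is why we pick machines that map to actual personas rather than trying to cover every cell.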


Now, since we can probably assume that at least one user has a Windows XP or Vista machine that no one knows about, we'll try one of each.  If we were to assume that all of our testing was only for Mort (it's not: he's just one persona of many), any bugs found here would be candidates to consider not fixing.  If there is an easy workaround to get Office installed for Mort, we may be able to document the workaround for the unsupported configuration.


It looks like we'll need about 7-8 machines for testing.  Setting them up and creating images is pretty easy - we'll need a day or two for that.  Once that is done, we can start testing.  The easy cases are pretty simple: just log in and wait for setup to complete.  That will take at least a day to run, and probably a second day for verifications.  The more interesting cases are also the fun ones: wait for setup to get about 20% done, then simulate the user accidentally unplugging the power cord.  Turn off the server while setup is running.  If using the network-throttling tools, set the connection to drop 50% of the incoming packets, and/or set latency to 5 seconds or so.  Try with a user who for some reason is running Windows XP or Vista without telling Mort.  And try a few Spanish installs of Windows.


Beyond a seemingly successful install, things to look for are the dual-language support, performance of running setup, and ensuring users can rerun setup (for detect and repair, or to install components unique to that user).  Parsing through log files after setup is one verification; running automated tests on the installations is another.  Figure this will take about 5 days, so this one setup scenario needs about 2 weeks of testing for verifications.  Hopefully, we can eliminate some of the duplicate work, like upgrading the previous version of Office, in order to reduce the time needed for further setup testing.
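The log-parsing step can be partially automated.  Here is a minimal sketch; the sample lines and the exact failure markers are simplified for illustration, though "Return value 3" is the classic Windows Installer marker for a failed action:

```python
# Scan a setup log for lines that look like failures.
import re

def scan_setup_log(lines):
    """Return (line_number, line) pairs that look like setup failures."""
    failure = re.compile(r"error|failed|return value 3", re.IGNORECASE)
    return [(n, line.rstrip()) for n, line in enumerate(lines, 1)
            if failure.search(line)]

# Simplified sample in the style of a Windows Installer verbose log.
sample = [
    "Action start: InstallFiles",
    "Action ended: InstallFiles. Return value 1.",
    "Action ended: RegisterProduct. Return value 3.",
]
print(scan_setup_log(sample))  # flags only the "Return value 3" line
```

A script like this run against every machine's log after a pass turns a tedious manual read-through into a quick triage list.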


If we find and fix a bug in all this, we get to go through the whole process again to verify both that the bug is fixed and that no new bugs were introduced.  As a rough estimate, once all the initial images are created, we will need up to a week to perform a second test pass to verify bugs were fixed correctly.  Each bug regression now has a new cost associated with it: one week of testing, tying up eight machines.  This is in addition to the developer time needed to come up with a fix.


Questions, comments, concerns and criticisms always welcome,


Comments (2)

  1. ppdragon says:

    Is it possible for testers to review the installation scripts and analyze the situations which are most likely to break?

    Or maybe some pair-wise tools could be leveraged in your case?

  2. JohnGuin says:

    It is both possible and just about required to review installation scripts.  Analyzing where they are likely to break is tougher during the development process, but gets easier as more and more bugs get fixed.

    Obviously, automation will pay off greatly here.  Using model based testing to generate setup tests pays off greatly: start the permutations on machines, throw a few machines "in a corner somewhere" and let your automation churn through the matrix.  It’s very possible to get tens of thousands of tests completed, logged and verified without much human intervention at all.
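    To make the "churn through the matrix" idea concrete, here is a minimal sketch of generating that permutation matrix from a model.  The parameter names and values are invented for illustration:

    ```python
    # Generate every permutation of setup parameters from a simple model,
    # in the spirit of model-based setup testing.
    from itertools import product

    model = {
        "os": ["Win2003", "WinXP", "Vista"],
        "prior_office": ["none", "Office 2000", "Office 2003"],
        "language": ["en", "es"],
        "interruption": ["none", "power loss at 20%", "server offline"],
    }

    def generate_cases(model):
        """Yield one dict per combination of model parameter values."""
        keys = list(model)
        for values in product(*model.values()):
            yield dict(zip(keys, values))

    cases = list(generate_cases(model))
    print(len(cases))  # 3 * 3 * 2 * 3 = 54 generated setup tests
    # Each case would be queued to an idle lab machine, executed,
    # logged, and verified without human intervention.
    ```

    Even this toy model yields 54 cases; a real model with more parameters gets into the tens of thousands quickly, which is exactly where the automation pays off.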

