I learned (or was reminded of) several interesting things by posting that street sign. (Yes, Warren, I snapped that in Toronto.)
- What seems complex, complicated, or confusing to one person may seem simple, clear, and obvious to another.
- Context always matters. (Actually, there must be At Least Three cases where context doesn't matter, so I'll reframe that: This is one more situation where context matters.) If I were familiar with Toronto's snow plowing schedule, and/or the parking habits of the area, the signs might have made more sense to me.
- Posting a picture which I state confuses me generates more comments than do my typical posts! <g/> (There I go again, generalizing the system based on a single datapoint…. Reframe: posting this particular picture generated more comments than have most of my other posts.)
My challenge to test a cup was comment-inducing as well. Y'all had some interesting ideas! I like that Stuart came back several times with more test case ideas. Most testers do not think of every last test case the first time they sit down and think about it. The more you learn about a product, the more test cases you are likely to think of, and the better you can gauge which test cases are likely to produce useful information. Any process that expects you to define every test case up front, and then run through exactly those test cases and no others, is likely to miss important issues.
I was especially happy to see Andrew, and Apoorva, and Stuart (eventually) ask what is perhaps the most important question a tester can ask: "What is it meant to do?" As testers our job is to provide information about whatever it is we are testing. We do not need to know anything about what it is supposed to do in order to do that. However, the information we provide tends to be used to make business decisions (like "Does this thing solve a problem sufficiently well that people will purchase it?"), and the people requesting the information tend to expect us to be providing specific types of information on specific aspects of the item. The more we know about what the product is supposed to do, and what problem it is supposed to solve, the better we can tune our information-seeking activities - i.e., the testing techniques we use and the test cases we execute - to provide information that will be useful in making those business decisions.
I find the test-a-cup problem useful because its surface simplicity quickly becomes deep and complex. Consider how many different aspects a cup has: Its color. The material from which it appears to be made. The material from which it is actually made. Whether it is watertight. Whether it glows in the dark. Whether it is insulated. Its mass. Its weight. Its fragility. Its transparency. I could keep going. (Feel free to do so in the comments!)
Each of those aspects has many different possible values, each of which is desirable in one context and undesirable in another. Take fragility, for example. People generally assume the cup should not be easily broken. While this is likely true for a cup meant to be used by a young child, it almost certainly is false for a cup meant to be used during a Jewish wedding ceremony.
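That "same behavior, different verdict" idea can be sketched in code. Everything below is invented for illustration (the `Cup` class, the contexts, the expectations are not from any real test suite); the point is simply that a test's pass/fail criterion lives in the context, not in the cup:

```python
class Cup:
    """A hypothetical cup with one observable property."""
    def __init__(self, shatters_when_dropped):
        self.shatters_when_dropped = shatters_when_dropped

# The expected value of the same property differs by context.
EXPECTATIONS = {
    "toddler sippy cup": False,       # must survive being dropped
    "wedding ceremony glass": True,   # must break when stepped on
}

def fragility_test_passes(cup, context):
    """The verdict depends on the context, not on the cup alone."""
    return cup.shatters_when_dropped == EXPECTATIONS[context]

sturdy = Cup(shatters_when_dropped=False)
fragile = Cup(shatters_when_dropped=True)

print(fragility_test_passes(sturdy, "toddler sippy cup"))      # True
print(fragility_test_passes(fragile, "toddler sippy cup"))     # False
print(fragility_test_passes(fragile, "wedding ceremony glass"))  # True
```

The identical `fragile` cup passes in one context and fails in the other, which is the whole argument in three lines of output.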
Knowing which aspects to investigate, and which test cases to execute, and which information to pay attention to, is a key testing skill. And hard! Training can help. Experience is a good teacher. Your application is one training ground you can use. Life is another. Test a light switch. Test a door. Test a tree. Test the flat panel television in the elevator. Have your teammates test it. Have test case generation competitions. And races. The more you think about testing and exercise your testing muscles the more effective a tester you will become.
Even if you are a developer. <g/>