Through The Glass Darkly


Testing is often divided into two types: black box testing, where you poke at the product from the outside and don’t have a clue how it’s put together or what makes it tick, and white box testing, where you inspect source code and execute the product under a debugger and have complete access to every last detail. Black box testing is certainly important, for this is the closest you can get to what the customer will experience. White box testing is very useful as well, because when you understand the assumptions your developers made you can identify how those assumptions might go awry. With black box testing you don’t know the developer’s model and thus are forced to form your own; it may have zero correspondence with the developer’s, and so you may miss problems that would otherwise be obvious. On the other hand, with white box testing you can become too familiar with the developer’s model and so lose the ability to see what’s wrong with it – again missing problems that would otherwise be obvious.
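To make the distinction concrete, here is a minimal sketch (in Python, with a hypothetical apply_discount function invented purely for illustration, not anything from the product discussed here) of what the two styles of test might look like: the black box test drives only the public behavior, inputs in and outputs out, while the white box test is written with knowledge of the implementation’s internals.

```python
# Illustrative sketch only: apply_discount is a made-up example, not part of any
# product discussed in this post.

def apply_discount(price, percent):
    """Public behavior under test: return the price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_black_box():
    # Black box style: drive the public interface with inputs and check outputs,
    # with no reference to how the function is implemented.
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99

def test_white_box():
    # White box style: written with knowledge of internals, so it deliberately
    # targets the rounding step and the range-validation branch.
    assert apply_discount(10.0, 33) == 6.7
    try:
        apply_discount(10.0, 150)
        assert False, "expected ValueError for an out-of-range percent"
    except ValueError:
        pass

if __name__ == "__main__":
    test_black_box()
    test_white_box()
    print("all tests passed")
```

Grey box testing sits in between: you know enough about the implementation to pick interesting inputs, but you still verify through the public interface.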

I find that grey box testing provides a nice balance between these two extremes. Here you understand the developer’s model well enough that you can push at it and see where it falls down, but at the same time you maintain sufficient distance that you don’t lose your perspective. My favorite starting place is to ask my developers to explain their design. Generally I try to avoid doing this in their office, because a) most conference rooms have way more whiteboard space, b) it’s a lot harder for them to just drone through the code when it’s not right there, and (bonus) c) if they don’t know their design well enough to explain it to me without constantly referring to their code I know it almost certainly isn’t worth testing yet!

This works great for me, and I thought it would work for my team as well. So I asked our architect to use our next several design reviews to take us through Sparkle’s design: what it is, why it is that, how it got there, and where it’s going. It’s been fabulous! We’re all learning lots and testing better as a result.

We are doing this in special meetings, but talking with your developers over lunch or snack breaks works just as well. Eventually you may discover that you aren’t just learning about your application but that you are also building a relationship with your developers. This is a Very Good Thing! Nurture that relationship, and you may find your developers asking you how they can better test their code. And then you’re approaching nirvana!

*** Want a fun job on a great team? I need a tester! Interested? Let’s talk: Michael dot J dot Hunter at microsoft dot com. Great coding skills required.

Comments (4)

  1. phil kirkham says:

    Sounds like nirvana indeed – much better than testers at one end of the office, developers the other and Never The Twain Shall Meet.

    How technical are the design talks that you and your testers have, and what technical background do they have?

  2. micahel says:

    Every tester on my team is a good coder. Design experience varies but everyone understands the basics. The design talks range from even-a-manager-could-understand high level down to very detailed.

  3. Scott Lemke says:

    I’ve always had an issue with the way these three types of testing are defined, and your narrative brings to light the reason why.

    The primary reason is that the type should not be defined by the knowledge of the tester, but instead by the context of the test itself. By your definition, and many others’, a tester who first sees a piece of software would be doing black box testing, as they have no clue what is happening inside. But as they continue to test, and learn about the software through testing, interviewing the developer, or reading through the documentation and code, they lose the ability to do black box testing, and eventually grey box testing, because their knowledge has grown.

    If you remove the requirement of knowledge of the tester and replace it with the context of the test (for example, "This test case is going to throw a range of inputs at this module and verify that the outputs are the expected results" is black box), you now "allow" that tester to continue to black box test even after their knowledge of the software increases. (It is also this context/purpose method that I prefer for defining my tests, instead of names like Black Box, White Box, Performance, or Regression, mostly because I have yet to run across two people who have the same definition for those names.)

    So, I guess a final thought would be this: Do you think, according to your definition, a tester could perform Black Box testing after having the knowledge to perform White Box testing?

  4. micahel says:

    Scott,

    Definitely a tester can move back and forth between black box and white box testing! As you say, it’s all about context: what knowledge are you bringing to your testing? Developers can even black box their own testing, if they can ignore everything they "know" about how their code works and focus on what it’s actually *doing*.