The difference between professional testing and arbitrary guessing or wild speculation

My friend and teammate Alan knows how passionate I get about certain things on occasion, so he threw me a bone the other day regarding a blog post on boundary testing, or should I say, rather, a poor attempt to discredit boundary testing and boundary value analysis as a valuable testing technique. The author of the post attempted to diminish the value of boundary testing by stating that he has "never seen a boundary!" and asking, "As a tester have you ever seen a boundary?" Maybe that is a trick question, but I am positive the answer is YES! Let me give you a clue...32767 is a boundary value, 65535 is another, and there are many more. Also, if I artificially constrain an int in a predicate statement using a relational operator such as if (intValue <= 0), then there is another boundary condition that I would certainly want to analyze. (That is what ISTQB, Myers, et al. mean by testing boundary conditions on, below, and above the edges of input and output equivalence classes.)
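To make that concrete, here is a minimal Python sketch of what I mean. The function, its name, and its values are purely illustrative (nothing here comes from a real product); the point is simply where boundary values come from, both at the edge created by a relational operator and at the physical limits of a fixed-width data type.

```python
# Hypothetical sketch: deriving boundary test values from a predicate such as
# "if (intValue <= 0)" and from the limits of 16-bit integer types.

def classify(int_value: int) -> str:
    """Toy function with an artificial boundary at 0 (intValue <= 0)."""
    return "non-positive" if int_value <= 0 else "positive"

# Values on, immediately below, and immediately above the edge created by <=.
predicate_boundary_tests = [-1, 0, 1]

# Values at the physical limits of common integer widths.
INT16_MAX, UINT16_MAX = 32767, 65535
type_boundary_tests = [INT16_MAX - 1, INT16_MAX, INT16_MAX + 1,
                       UINT16_MAX - 1, UINT16_MAX, UINT16_MAX + 1]

if __name__ == "__main__":
    for value in predicate_boundary_tests:
        print(value, classify(value))
```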

Unfortunately, the author's "conjecture that boundary is not static in software" is simply folly; it demonstrates a lack of understanding of how to identify boundary values or how to adequately analyze them, and it assumes that boundary testing is as simple as testing at the extreme ranges. In the post the author describes his 'experience' boundary testing an audio decoder that will play back MP3 files encoded between 23 kb/s and 196 kb/s. His assumption is that boundary testing would simply entail testing MP3 files encoded at 22 kb/s, 23 kb/s, and 24 kb/s, and at 195 kb/s, 196 kb/s, and 197 kb/s. Not only is this a bad assumption, it is technically impossible.

The bit rates for an MP3 file using MPEG-2 Layer III encoding are 8, 16, 24, 32, 40, 48, 56, 64, 80, 96, 112, 128, 144, and 160 kb/s, and the bit rates using MPEG-1 Layer III are 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, and 320 kb/s. Modern MP3 players are capable of decoding files encoded with variable bit rates; however, with the exception of the LAME encoder, files encoded with variable bit rates adhere to the bit rate encoding standards established by ISO/IEC for the Layer I, II, and III bit rate indexes (and even the LAME encoder does not allow bit rate increments of 1 kb/s).

Therefore, since no encoding exists at 22 kb/s, 23 kb/s, 195 kb/s, 196 kb/s, or 197 kb/s, these tests are simply impossible, and suggesting them indicates a lack of domain knowledge and appears to be simple guessing at what to test. Boundary analysis implies testing on, below, and above the boundary condition using actual values, not something arbitrary or made up. If the requirements indicated support for MP3 files with a minimum bit rate of 24 kb/s, the min - 1 value is 16 kb/s (not 23) and the min + 1 value is 32 kb/s. (Personally, given the limited number of bit rate encodings, I probably would have tested a file encoded at each standard bit rate within each bit rate index (Layer I, II, and III), along with random samples of VBR-encoded files within the ranges specified by the requirements. And yes, that includes analyzing files encoded at bit rates just above and just below the stated requirement boundaries.)
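Here is an illustrative Python sketch of that analysis (not a real decoder test harness). It picks boundary test values from the set of valid, encodable bit rates rather than from naive arithmetic like "minimum minus 1 kb/s"; the bit rate tables are the ISO/IEC Layer III indexes quoted above, and the function name is my own invention.

```python
# Illustrative sketch: selecting boundary test values from the *valid* MP3
# bit rates, not from arbitrary values like 23 or 195 kb/s.

MPEG2_LAYER3_KBPS = [8, 16, 24, 32, 40, 48, 56, 64, 80, 96, 112, 128, 144, 160]
MPEG1_LAYER3_KBPS = [32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320]

def boundary_rates(valid_rates, required_min, required_max):
    """Return (below min, min, above min, below max, max, above max) using
    only encodable bit rates; None where no such encoding exists."""
    rates = sorted(valid_rates)
    def nearest_below(r):
        lower = [v for v in rates if v < r]
        return lower[-1] if lower else None
    def nearest_above(r):
        higher = [v for v in rates if v > r]
        return higher[0] if higher else None
    return (nearest_below(required_min), required_min, nearest_above(required_min),
            nearest_below(required_max), required_max, nearest_above(required_max))

# A requirement of 24 kb/s minimum yields 16 (not 23) below the boundary
# and 32 above it; at the 160 kb/s maximum there is nothing valid above.
print(boundary_rates(MPEG2_LAYER3_KBPS, 24, 160))  # (16, 24, 32, 144, 160, None)
```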

The blogger also attempts to draw a parallel between the definition of planets in our solar system and boundary values in software. But even here the author is misinformed. The analogy to planets illustrates antiquated thinking based mostly on speculation and controversy. If by chance you are interested in the facts regarding planet classification, look here. (BTW...Pluto is no longer classified as a planet, so once again the assumption that Pluto is a "boundary" of some sort is due to inaccurate or imprecise information, or simple guessing.) Fortunately, I don't think too many of us have to deal with developers who bicker over the size of a 32-bit integer value the way astronomers argue about planet classification.

Now, of course, if developers simply changed data types or relational operators randomly throughout the code on a weekly basis, then I would be all bought into the argument that there are no boundary values and that boundary value analysis is not a valuable testing technique. But on this planet (the one in the "reality" universe), where most developers are not morons who constantly change data types, relational operators, or constants, it is quite possible to identify boundary conditions and then carefully analyze the possible values immediately above and below each boundary condition. And that is a good thing, because historical analysis indicates that more than a handful of defects occur at or near boundary values.

But testers must be aware that boundary values don't always exist only at the extreme ranges of data types or other variables. Occasionally there are boundary values or conditions within the minimum and maximum physical ranges of a variable. That is specifically why ISTQB describes boundary testing as testing the boundary conditions on, immediately below, and immediately above the edges of input and output equivalence class subsets. Of course, that assumes the person knows how to decompose data into equivalence class subsets. This is where in-depth technical, system, and professional knowledge and skills separate the professional testers from the amateur guessers.
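To show what I mean by boundaries inside the physical range of a variable, here is a small hypothetical example in Python. The discount rules are entirely made up for illustration; the point is that the equivalence class edges at 10 and 50 sit well inside the range of the input, and each edge still gets tested on, immediately below, and immediately above.

```python
# Hypothetical example: equivalence classes with boundaries *inside* the
# physical range of the input, not just at its extremes.

def discount_percent(quantity: int) -> int:
    """Made-up pricing rule: 0% for 1-9 items, 5% for 10-49, 10% for 50+."""
    if quantity < 1:
        raise ValueError("quantity must be at least 1")
    if quantity < 10:
        return 0
    if quantity < 50:
        return 5
    return 10

# Equivalence classes: invalid (< 1), [1, 9], [10, 49], [50, max].
# Boundary values on, immediately below, and immediately above each edge.
boundary_tests = [0, 1, 2, 8, 9, 10, 11, 48, 49, 50, 51]

if __name__ == "__main__":
    for q in boundary_tests:
        try:
            print(q, discount_percent(q))
        except ValueError as err:
            print(q, "rejected:", err)
```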

This is a good example of why those who are really interested in being professional testers and want to pursue software testing as a career should read a few more books on software testing, and a few less on epistemology, cognitive psychology, and metaphysics. (I am not saying these topics are not interesting or important. But I suspect that most testers can already think for themselves, learn and understand abstract and complex ideas, and apply logic and reason.)