Thinking about fly fishing…

I am an avid fly-fisherman, and I am spending a few of these last winter evenings tying flies in preparation for the new year. The lakes are still too cold so the trout are deep and lethargic, and many of the rivers are closed and too damn cold and swollen anyway. So, now is a good time to restock my fly boxes, reflect on past years, and dream of the upcoming season. While fly fishing is an enjoyable escape from the day-to-day torrent of technology, when I can’t stand in a river and wave a stick I can still relax with a good book that conjures memories of years past or engulfs my mind in an adventurous narrative as if I were there. I don’t really enjoy reading most fiction, but I do enjoy the memoirs of people such as John Gierach. (Perhaps it is the sailor in me that loves a good yarn, or the fact that I can relate personally to the stories.) Anyway, this weekend I acquired a book entitled Fishless Days, Angling Nights by Sparse Grey Hackle that introduced me to a legend in American fly-fishing by the name of Theodore Gordon. I had never heard of Mr. Gordon before this (perhaps because I tend to fish a lot more soft hackle flies rather than dry flies), but I found a quote of his quite interesting. He said, “The great charm of fly fishing is that we are always learning.”

This morning I thought how apropos this statement is to software testing. Of course, as it pertains to the practice of software testing, I would rephrase Mr. Gordon’s statement to read, “The great demand of software testing is that we must always learn.”

There are different types of people who fly fish. There are those who buy a fly rod and a pocket full of store-bought flies and head to the water just to catch fish. These people don’t seem to care much about the techniques of casting, learning how to read water, or understanding the patterns of fish or aquatic entomology; they are simply there to try to catch fish using whatever slipshod mooching approach seems to work. Yes, they still catch fish…usually the small, dumb, farm-raised trout stocked into regional lakes by the state’s department of wildlife that will bite at anything. On the other end of the spectrum are true purists who fish with cane poles, tie their own flies to match the hatch (sometimes right beside the river), and study trout, regional entomological lifecycles, and the geological formations of a river bed to better pinpoint where the big, smart trout are hiding. Then there is the group in between these extremes, with varying degrees of skill and knowledge. How much time a person devotes to both practicing and learning about the sport (and their capacity to learn) often makes a huge difference in both their enjoyment and their effectiveness in catching large, persnickety trout.

From my observations of the testing community over the past few decades, I can see a similar pattern in the spectrum of skills and knowledge of people who participate in the practice. In the past, it was not uncommon for some companies to hire ‘clever’ people who were simply good at finding bugs into testing roles. Some companies hired developers who would (oftentimes begrudgingly) take the job as a stepping stone into a developer position. Unfortunately, some people at both extremes of this spectrum stagnated because they did not learn more about software testing or the technological advances that were happening around them. At one extreme, I suspect that some people thought that as long as they were finding behavioral-type issues they were providing a benefit to the company because they were ‘good at representing the customer.’ At the other end of the spectrum were the developers in testing roles who failed to appreciate the challenges of software testing. At both extremes, complacency and stagnation usually occur. Of course, there are many other testers between these extremes; some who will go on to become professional testers and have significant impact, and others who will simply belly-ache and whine about how unfair it is, or claim how wrong any change is and why change won’t work.

As professional testers we must constantly strive to improve our knowledge of testing, technology, and the systems we are working on. We must also increase our skills and abilities as the demands of the role expand beyond the traditional comfort zone of behavioral testing and ‘playing customer advocate’ by executing ‘tests’ to find bugs at the end of a cycle. The challenges of testing complex systems built around advancing technologies significantly raise the aptitude bar for testers. Emerging practices such as TDD and agile lifecycles, designed to drive engineering quality upstream and form closer customer connections, are also impacting the role of testing and how testing adds value in the lifecycle. (And I don’t think the role of testing in an agile lifecycle is trying to wedge testing between the end of a sprint cycle and the release to customers in order to provide a pseudo-proxy customer buffer…that’s a bottleneck.) Reinstituting best practices such as design and code reviews and inspections (when warranted), or developing new approaches or tools to help increase testing effectiveness and reduce costs, also requires greater skills and knowledge among testers.

The formidable challenges of testing software that lie ahead will require highly intelligent critical thinkers who also have an in-depth understanding of the systems they are working on, and who possess the technical aptitude to provide valued input throughout the product cycle. Indeed, we work in a very dynamic industry filled with diverse challenges that demand continued learning and greater proficiency in the skills used in our profession.