Challenges in Test – Knowing the code & sometimes ignoring it

A tester just needs to master the art of understanding their product & then file bugs based on their findings, right?  Wishful thinking! You'd better have a technical understanding of the code your developer is writing if you intend to properly test it.

Knowing the code & sometimes ignoring it

So understanding how your product works is a good start, but as mentioned earlier a tester needs to have breadth & be able to go deep.  Part of that depth is understanding the code behind their product.

Here are a couple of examples of why you will need this depth:

  1. Designing your automation frameworks
  2. Debugging your product

Designing your automation frameworks

While development is designing & implementing the product, you will need to figure out how you are going to test it, while also being involved in the product design itself. Understanding how the code is implemented is going to help with designing your automation.  For example, if you have a typical 3-tier application, how is the UI talking to the middle tier?  Are those APIs public?  Do they give you what you need to write automation?  How can you extend them for what you need?  Is there a better way to design the product to make it more testable?
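
To make that concrete, here is a minimal sketch of what API-level automation against that middle tier might look like.  The base URL, endpoint, and payload shape are hypothetical, assumed only for illustration; the point is that the test drives the same API the UI calls instead of clicking through the front end.

```python
# Minimal sketch: exercising a hypothetical middle-tier REST API directly,
# bypassing the UI. Endpoint names and payload shape are assumptions.
import requests

BASE_URL = "https://test-env.example.com/api"  # hypothetical test environment


def test_create_and_fetch_order():
    # Drive the same API the UI would call, so the test is faster and
    # less brittle than clicking through the front end.
    new_order = {"product_id": 42, "quantity": 3}
    create_resp = requests.post(f"{BASE_URL}/orders", json=new_order, timeout=10)
    assert create_resp.status_code == 201

    order_id = create_resp.json()["id"]
    fetch_resp = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    assert fetch_resp.status_code == 200
    assert fetch_resp.json()["quantity"] == 3
```

If those middle-tier APIs don't give you a clean way to write a test like this, that is itself useful design feedback for the team.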

Debugging your product

Part of being a tester means you are one of the first customers of your product.  You are developing tests against a moving target as developers are implementing new features & making changes.  One day a test may be passing & the next it fails. Is it a test issue or a product issue that is causing the failure?  Your product may not have much manageability yet, so there are no good events in the log that you can use.  The only thing left is being ready to strap a debugger to the product & step through the code to understand where things are going wrong.  If it is a product change that broke your tests, you may need to figure out how to update your tests to work again. Oh, & when you are done, you'll want to file a bug to improve the manageability of the product. You are doing something a customer would be doing, and they won't be able to strap a debugger to it. You'll want enough information in the logs that ideally a customer could resolve the problem.
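
As a rough illustration of leaning on logs before reaching for the debugger, here is a minimal sketch of a test that appends the tail of the product log to a failing assertion.  The log path and the `run_scenario` stand-in are assumptions for illustration only.

```python
# Minimal sketch: give a failing test enough log context to tell a test issue
# from a product issue. The log path and run_scenario() are hypothetical.
import pathlib

PRODUCT_LOG = pathlib.Path("/var/log/myproduct/service.log")  # assumed path


def run_scenario():
    """Stand-in for the real product scenario under test (hypothetical)."""
    class Result:
        succeeded = True
    return Result()


def log_tail(path: pathlib.Path, lines: int = 50) -> str:
    """Return the last `lines` lines of the product log, or a note if missing."""
    if not path.exists():
        # No log at all is itself worth a manageability bug.
        return f"(no log found at {path})"
    return "\n".join(path.read_text(errors="replace").splitlines()[-lines:])


def test_feature_end_to_end():
    result = run_scenario()
    # Attach log context to the assertion so the failure report is debuggable
    # without immediately strapping on the debugger.
    assert result.succeeded, "Scenario failed; product log tail:\n" + log_tail(PRODUCT_LOG)
```

If the tail of the log still doesn't tell you what went wrong, that's the manageability bug to file.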

Ignoring the code

So you now know your product’s code inside & out.  You are able to debug & even provide your developer with proposed fixes.  You might even have helped write some of the features in your product.  Now forget everything you know about the code.

"What?!" you might say, "I worked so hard to figure it all out."  Yep, & if you are not careful you may make some of the same assumptions that the developer did when they wrote the feature.  That means you run the risk of designing your test cases around the code & missing some juicy bugs.  You need to step back & challenge any and all assumptions that you might be making.
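
As a small example of what challenging those assumptions can look like in practice, the sketch below tests a hypothetical `parse_quantity` helper (a stand-in written the way a developer's happy path often looks) against the messy inputs a real user might send, rather than the inputs the implementation expects.  The negative-quantity case fails against this naive stand-in, which is exactly the kind of bug you miss when you design your tests around the code.

```python
# Minimal sketch: test the assumptions, not the implementation.
import pytest


def parse_quantity(value: str) -> int:
    # Developer's implicit assumption: the value is always a clean digit string.
    return int(value)


# Inputs a real user could plausibly type. The "-1" case fails against the
# naive implementation above -- a bug you only find by ignoring the code.
@pytest.mark.parametrize("raw, expected_ok", [
    ("3", True),        # the happy path the developer had in mind
    ("", False),        # blank field submitted
    ("-1", False),      # negative quantities should be rejected
    ("3.5", False),     # decimals aren't valid quantities
    ("3,000", False),   # locale-formatted numbers
])
def test_parse_quantity_validates_input(raw, expected_ok):
    if expected_ok:
        assert parse_quantity(raw) == 3
    else:
        with pytest.raises(ValueError):
            parse_quantity(raw)
```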

Along with challenging the assumptions made in the code, don't forget to challenge & look at everything through the customer's eyes.  Do the instructions we provide in the UI make sense to the end user, or are we using 3 different terms to reference something because its name has changed over the course of development?  One powerful tool here is to swap features with another tester & try each other's areas out.  If they can't figure out how to use your feature, don't dismiss it because they are new & don't understand the product enough.  They are likely stumbling over areas that a real customer would, & since you are neck deep in your feature & code, of course how to use it is obvious to you.  It likely won't be obvious to your customer, so look for how you can improve that too.

Comments (4)

  1. I agree that a tester has to test a system through the customer's eyes, paying attention to topics like stability, usability, robustness, manageability or intuitiveness of a software product. As you stressed, it's important to put yourself in the position of a newbie and test the system from this perspective, entering inadequate input, clicking randomly on buttons, attempting to crash the system, following the documentation, etc.

    Considering a tester as a customer is maybe saying a little too much, because the demands and perspectives are totally different, and the usage of the product is totally different too; maybe that's one of the gaps that needs to be covered.

    From your point of view, which of the following points is most important (a ranking might be even better)?

    1. understanding the code

    2. understanding the functionality

    3. understanding how a product can be misused

    4. identifying the functional gaps existing in a product

    5. identifying the deviations from documentation, best practices, standards, procedures and good sense

    6. identifying how a product can be improved

    7. identifying bugs and areas of refactoring

    8. identifying (security) risks

    And now the most important question: how much of the above could a tester achieve, considering especially the shrinking software production life-cycle?

  2. Darryl Russi says:

    I don't think you can just do a straight ordering of the items & cover all products in all scenarios.  Additionally, I think they all tend to be important at varying levels & times.  Instead I tend to think of them as tools in a tester's toolbelt for increasing their team's ability to deliver a high-quality product that customers want.

    Now you do point out a tester's dilemma of how can I be all these things?  I believe there is a balance to be had & each individual tester brings their own strengths to their team.  When building a team you should look at how you can attract people with complementary skill sets. (Note: this is not just a manager's job; anyone on the interview loop can provide this input.) For example, you may hire a top-notch security tester, but their customer focus might be lower than desired.  Ideally you would also hire someone with a higher customer focus, & then as part of each individual's development they will likely have opportunities to learn from the other.

  3. Sure, you need to cover most of the above points with your resources, though I find it hard, even in theory, to approach all or at least the most important points. Automation is great, though there is still a long way to go, and most of the above items (I would actually dare to say all) require manual work, extensive knowledge of what's happening in the field, competencies hard to cover with 1-2 resources.

    Let me reformulate my last question: "how much of the above could an organization achieve, considering especially the shrinking software production life-cycle?"

    Thank you for your feedback!

  4. Darryl Russi says:

    Yes, the shift towards shorter release cycles is a challenge.  Yet a shorter release cycle does not mean you are delivering the same amount of features as a longer one.  Instead it is about rapid, smaller iterations & getting feedback from your customers so you can make course corrections based on that feedback more quickly.  There is overhead to releasing software, & the rapid cycles force the team to look at what that overhead is & then work to optimize their processes.

    It is unclear what your team structure is, but one thing that helps when working in a team is that the whole team is responsible for quality.  The developers are writing unit tests & even functional tests, while the testers can focus on the broader problems like integration, scenarios, etc.  There are successful teams with a high dev-to-tester ratio, which means they have to ensure they all own quality and can't just assume that it is test's responsibility.