All For One And One For All: Our Complete Automation Stack

Factoring these details out of the test case into intuitively organized libraries lets the test case focus on the essence of what it is testing rather than on incidental details. This helps us write more comprehensive tests faster while spending less time maintaining them.
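To make this concrete, here is a minimal sketch (all names invented for illustration, not our stack's actual API) of a test case that states only its essence while a shared library absorbs the incidental details:

```python
# Hypothetical sketch: the test case states only the essence of the
# scenario; incidental details (how the app is launched, where controls
# live, how to wait for the UI) belong to a shared library the test calls.

class DocumentLibrary:
    """Stand-in for a shared automation library; a real one drives the app."""

    def __init__(self):
        self._documents = {}

    def create_document(self, name):
        self._documents[name] = ""
        return name

    def type_text(self, name, text):
        self._documents[name] += text

    def saved_copy_of(self, name):
        return self._documents[name]


def test_typed_text_is_saved():
    # The test reads as a statement of intent, not a script of UI clicks.
    lib = DocumentLibrary()
    doc = lib.create_document("Report")
    lib.type_text(doc, "hello")
    assert lib.saved_copy_of(doc) == "hello"


test_typed_text_is_saved()
print("test passed")
```

When the incidental details change, only the library changes; the test case above stays exactly as written.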

It also enables us to approach testing in a very different fashion. We can write test cases before the feature has been coded. We can write test cases before the UI has been built. We can write test cases as the feature is being designed. Being able to write test cases immediately upon entering the milestone allows us to be front-loaded rather than eternally catching up to Dev. This is how we do it:

  • It all starts with user features. Along with deciding what the feature should do, how the feature should work, what the workflow should be, and what the UI should be, the feature team designs the Logical Functional Model for the feature. The LFM is really part of the feature, so the LFM is defined inline to the feature specification. This makes clear the similarities between the LFM and the feature it is modeling and highlights any mismatches between them. Delivering the framework necessary to test the feature is part of the cost of building the feature, so implementation of the LFM is costed alongside implementation of the feature. We have found that stubbing out the LFM and starting to think about its implementation as it and the feature are designed can surface issues that would otherwise have been missed.
  • Next, we can start writing test cases. Build verification and exit criteria tests can be available for developers to run before they check in the corresponding code, with basic and extended functionality test cases following throughout the milestone. Similarly, developers can use the LFM to write their integration tests prior to implementing their features. Having all this code checked in but never executed and thus never tested is a bit unsettling at first. It definitely puts the onus on the test case writer to understand what each test is supposed to do and how the application is supposed to respond. Writing test cases that simply “light up” as soon as the corresponding application, LFM, and POM code is checked in is a bit of a challenge but is by no means impossible. We have had some great successes using this technique.
  • Stub out the Physical Object Model as soon as the UI is defined. This can happen quite late in the development cycle since the test cases are written against the LFM and thus do not need to be changed when the UI changes.
  • Implement the LFM, POM, Application Internals, and Expected State Generators as application code comes online. The feature team has more latitude scheduling this work since each can be implemented mostly independently from the others as well as from the test cases.
  • Design and implement the verification model throughout. As with the LFM, this is designed through the joint efforts of the feature team and recorded inline to the feature specification. Defining the verification model forces the feature team to truly understand the feature’s inner workings. The resulting discussions are incredibly effective at finding areas where the specification is less than forthcoming or less than precise. Because Loosely Coupled Comprehensive Verification mostly decouples verifying actions from executing them, verification can start small and ramp up over time. As the feature becomes better defined and application code comes online, verification can be brought online as well. Regardless of how much or how little verification is enabled at any one point in time, test cases continue to run and continue to find problems.
  • Define and implement Execution Behaviors and Data Providers as the need arises. Execution Behaviors are likely to be identified while the feature team is designing the LFM; the need for specialized Data Providers may become evident during feature team discussions of the test cases. As with the verification model, feature team discussions regarding the various ways a feature can be executed and the types and ranges of data a feature must handle help the team to identify areas of the feature that require further thought.
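The LFM/POM layering that lets test cases survive UI changes can be sketched as follows (all class and method names are invented for illustration):

```python
# Hypothetical sketch of the LFM/POM layering: test cases speak only the
# logical vocabulary; the Physical Object Model is the one layer that knows
# concrete UI details, so a UI redesign touches the POM alone and the test
# cases stay unchanged.

class SearchPom:
    """Physical Object Model: the only layer tied to the concrete UI."""

    def __init__(self):
        self.ui_actions = []  # a real POM would drive the actual UI

    def fill_query_box(self, text):
        self.ui_actions.append(f"type {text!r} into id='searchBox'")

    def click_search_button(self):
        self.ui_actions.append("click id='searchGo'")


class SearchLfm:
    """Logical Functional Model: expresses what the feature does."""

    def __init__(self, pom):
        self._pom = pom

    def search_for(self, term):
        # If the UI renames the box or moves the button, only SearchPom changes.
        self._pom.fill_query_box(term)
        self._pom.click_search_button()


pom = SearchPom()
SearchLfm(pom).search_for("automation stack")
print("\n".join(pom.ui_actions))
```

This is why the POM can be stubbed out quite late: every test case calls `search_for`, and none of them know or care which controls implement it.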

Teamwork combined with these innovative testing techniques has already paid dividends, and this is just a fraction of the benefits we expect to see from testing smarter. You can test smarter too!
