This post covers how to create good validation tests for your implementation. The scenario: you have an ISV who delivers updates to you, and you want automatic validation that proves nothing has been broken. In the test automation pyramid, the tests described in this post are typically called ‘integration tests’; a quick search on ‘Test Automation Pyramid’ is recommended for background.
This post details how to use the task recorder to do this, and what extra changes we can make in the generated X++ code to highlight any deviation from standard business processes introduced by a code update.
This blog post was created using PU10. The task recorder generates code that depends on FormAdapters and references the Test Essentials model. This is not a post about form adapters or their replacement, Type Providers; a version of this post covering Type Providers will be created in the future.
To complete this exercise some development skills are required to create the test project and add the appropriate references in order to build the project.
Your tests should be reliable and repeatable: they should produce the same result on every run, should not be disrupted by simple updates, and should produce meaningful output during the build when things go wrong. The following example steps show how the test is created in Task recorder, and then how the generated X++ code is modified in Visual Studio to further refine the test case.
From a high level there are these steps:
- From your list of scenarios based on critical business processes, design your task recording
- Run through it a few times to get comfortable with the steps; try to keep it simple
- Create a final task recording containing placeholders and references for the developer to work on before placing the test in the automated build.
How to create the final recording
If we take a simple example of creating a sales order we follow these steps:
Go to the Settings menu and select Task recorder
Create a new recording
Give the recording a meaningful name and description and click OK
Now we can do our simple integration test
We are going to use the following task recorder features to make our test easier to manage once we generate the code
- Copy values
- Developer placeholder
So, we will create a sales order using Contoso data as follows:
Go to All workspaces -> Sales order processing and inquiry
Click New -> Sales order
Select a customer and click on OK to create the sales order
We will select an item for this order: 0005
Now to help with Validation:
Go to the Sales Header and right click on the sales ID, select Task Recorder, and then select Copy:
You will notice in your Task recorder script the following message: ‘Note the value in the Sales order field to reference’
Now we will create a developer placeholder. This creates an empty method in our generated code that we can use to add custom validation based on the previously referenced values:
Go to the task recorder menu and select Add developer placeholder
Name the placeholder, add a comment, and click OK
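When the recording is later imported into Visual Studio, the placeholder becomes an empty method on the generated test class. A hypothetical sketch of its shape is below; the method name comes from what you entered in the dialog, and the exact signature varies by version:

```xpp
// Hypothetical sketch of the method generated for a developer placeholder
// named 'ValidateSalesOrder'. The real name and the surrounding class depend
// on your recording and the Platform update you are on.
public void ValidateSalesOrder()
{
    // The comment entered in the 'Add developer placeholder' dialog appears here.
    // Custom validation code is added here after the import.
}
```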
We will not do any more to this order for now. If you export your recording to Excel, it should look something like this:
Now Save your recording as a developer recording:
Stop the task recorder by pressing the stop button in the top right of the browser and select Save as developer recording:
Now that we have the recording saved as XML, we can use the Visual Studio add-in to generate our X++ test.
Open Visual Studio, go into your test project, and import the task recording.
The add-in will open a dialog for you to reference the XML file. In this instance I imported it into my start-up project; you can create a new project here and reference your test model as appropriate:
Now we can analyse the code that has been generated.
Note the SalesId that we can now easily reference. A validation point has also been provided for us, with the description we added in the task recording:
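For orientation, the relevant fragment of the generated code looks roughly like the following. This is an illustrative sketch, not a verbatim copy: the adaptor and control accessor names are derived from your recording, so yours will differ:

```xpp
// Illustrative sketch of the generated flow. SalesTableFormAdaptor and the
// control accessors are generated names; treat everything here as an example.
SalesTableFormAdaptor salesTable;
// ...the recorded steps open the form and create the order...

// The 'Copy' step surfaces the Sales order ID as a value we can reference:
str salesId = salesTable.SalesTable_SalesId().getValue();

// The placeholder method is called at the point in the flow
// where we added it in the recording:
this.ValidateSalesOrder();
```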
For a simple validation we can add the following code so the outcome appears in our test results. It is important that meaningful messages are added to assertions, so that the build output gives clear direction as to what has failed:
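As a sketch of what that could look like inside the placeholder method: the assert methods come from the SysTest framework that the generated test class extends, `salesId` is the value copied during the recording, and the expected order status is an assumption chosen for illustration:

```xpp
// Inside the developer placeholder method. 'salesId' is the value copied
// during the recording; the expected status is an assumption for this example.
SalesTable salesOrder = SalesTable::find(salesId);

this.assertNotEqual('', salesOrder.SalesId,
    strFmt('Sales order %1 was not found - order creation appears broken.', salesId));

this.assertEquals(SalesStatus::Backorder, salesOrder.SalesStatus,
    strFmt('Unexpected status on sales order %1 - check order creation logic.', salesId));
```

The message argument is what surfaces in the build output, so it is worth spending a sentence on each one.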
Build the solution and open the Test Explorer to view our new test:
If we now run this test, it should end in success.
This test has a very simple validation function as an example. Extra logic can be placed here to check any extension fields or extension logic that may have been added to the initial sales order creation logic.
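For instance, if an ISV extension adds a field to SalesTable, an extra check might look like the following sketch. `MyIsvField` and its expected value are stand-ins, not real fields; replace them with your own:

```xpp
// MyIsvField is a stand-in for a field added by an ISV extension;
// replace it and the expected value with your own.
this.assertEquals('ExpectedDefault', salesOrder.MyIsvField,
    strFmt('ISV field not defaulted as expected on sales order %1', salesOrder.SalesId));
```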
Hopefully this post helps show what is possible when testing with the task recorder: extending the generated code and adding your own assertions.