Using the batch framework to achieve optimal performance


Recently I learnt how powerful the batch framework in AX is – and discovered how it can be used to improve the performance of long-running operations.

Most long-running operations in Dynamics AX can be scheduled to run in batch. In most cases you can explicitly define a query to select which data to process. It is often simplest to create a single query that selects all the data and then schedule the operation for batch. Doing it this way starts one batch operation that processes the data piece by piece. It works, and it is simple to maintain – but it is not necessarily fast.

Instead, consider defining multiple queries that each cover a portion of the data, and then scheduling them all to run in batch at the same time!  Now suddenly you have parallel processing.
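To make the idea concrete, here is a minimal X++ sketch of how one operation could be submitted as several parallel tasks within a single batch job. MyLongRunningProcess is a hypothetical RunBaseBatch subclass that is assumed to expose its query through queryRun(), and the item ranges are placeholders – in the replenishment example below the split was achieved purely through configuration, with no code at all.

    static void scheduleParallelBatch(Args _args)
    {
        BatchHeader          batchHeader;
        MyLongRunningProcess task;       // hypothetical RunBaseBatch subclass
        Query                query;
        container            itemRanges = ['A*..H*', 'I*..P*', 'Q*..Z*'];
        int                  i;

        batchHeader = BatchHeader::construct();
        batchHeader.parmCaption('Parallel processing - partitioned by item range');

        for (i = 1; i <= conLen(itemRanges); i++)
        {
            task  = MyLongRunningProcess::construct();   // hypothetical constructor
            query = task.queryRun().query();             // assumes the class exposes its query

            // Restrict this task to one disjoint slice of the data.
            SysQuery::findOrCreateRange(
                query.dataSourceTable(tableNum(InventTable)),
                fieldNum(InventTable, ItemId)).value(conPeek(itemRanges, i));

            // Each task added here can run on its own batch thread.
            batchHeader.addTask(task);
        }

        batchHeader.save();   // submit the job; the batch server picks up the tasks
    }

Each task gets its own thread on the batch server, so as long as the query ranges are disjoint, the work proceeds in parallel.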

Here is a real-life example.

In Dynamics AX 2012 R3 we had a customer with 30,000 items that needed replenishment in their picking warehouse. They used fixed locations for each item, and they used Min/Max replenishment. The replenishment operation in AX is defined using a template. The template consists of lines, each with a query that specifies which items and locations to replenish.

The original setup we deployed was a single replenishment template covering all 30,000 items. The total execution time was 2hr31m.

Then we created a new template with a query that covered about half the items, and changed the original template to cover the other half. We scheduled both to start at the same time. Not surprisingly, they completed much faster – in almost half the time. Total execution time was 1hr21m.

Repeating the pattern, we split the replenishment into 8 templates and saw another drop in total execution time, down to just 40 minutes.

At this point the system was quite loaded: CPU averaged 80% and SQL Server constantly had a few tasks waiting – which is a good thing, as the hardware is meant to be exercised.

Conclusion

With a few simple configuration changes, the overall execution time was cut by almost a factor of 4 – from 2hr31m down to 40 minutes.

This pattern can be applied in many places – the key caveat to look out for is logical dependencies between the batches you create. In the example above it is important that no two batches replenish the same item or the same location, as that could lead to one batch waiting for another to complete. The implementation will of course still be transactionally safe even if there are dependencies between the batches, but with dependencies you may not get the same impressive results, and some batches could end up getting aborted. One way to build partitions that are disjoint by construction is sketched below.
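For illustration, you can split on a criterion that is mutually exclusive by definition – for instance the last character of the item number, or the site, warehouse, or location zone. The following sketch (hypothetical, and assuming item numbers that end in a digit) builds ten such ranges that could be pasted into the template queries:

    static void buildDisjointItemRanges(Args _args)
    {
        container ranges;
        int       i;

        for (i = 0; i <= 9; i++)
        {
            // '*0', '*1', ..., '*9': assuming item numbers end in a digit,
            // each item matches exactly one range, so no two partitions can
            // ever select the same item, and the load is roughly balanced.
            ranges += strFmt('*%1', i);
        }

        info(con2Str(ranges, ', '));
    }

Because each item falls into exactly one range, no two batches can touch the same item, which avoids the blocking described above.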


Comments (2)

  1. In the example above we had just one batch AOS. The maximum number of threads the AOS can handle is not as interesting as the actual throughput. You'll want to measure the gain as you add more threads; at some point there will be no further gain, and you might even experience a slow-down as the system's resources get exhausted. As mentioned above, you want to make sure the CPU and SQL processing power are utilized; once you are close to that, stop adding more threads.

  2. Marcin says:

    I wonder how many batch AOSes you have, and what the maximum number of batch threads is that can run on an AOS instance at the same time.
