SQL Server Data Compression for Dynamics AX Part 1 – Introduction and Best Practices

In this blog post we will discuss the benefits and costs of database compression, and we will provide best practices for successful data compression.

Why does data compression matter?

Data compression is one of the Data Volume Management (DVM) strategies (together with data clean-up and data archiving), which consists of actively managing the amount and type of data retained in the Dynamics AX database.

What are the key benefits of Data Volume Management?

  1. It minimizes database storage requirements.
  2. It minimizes time to recovery in case of disaster.
  3. It improves query performance.

When visiting our Dynamics AX customers, we often see them facing issues with databases that grow very fast. There is no volume threshold rule for when to apply compression; however, there are a few conditions to consider:

  1. DVM strategy order: it is usually recommended to clean up temporary data first (there will be a blog post coming soon on Dynamics AX clean-up strategies, so stay tuned) and optionally archive data before compressing
  2. Versions of SQL Server and Dynamics AX
  3. CPU headroom on the SQL Server
  4. Type and usage of the data you want to compress


For a quick reminder: SQL Server data compression was introduced in SQL Server 2008. It is only available in the SQL Server Enterprise and Developer editions. Recently, compression support was also announced for all editions of SQL Server 2016 (starting with Service Pack 1) and for Azure SQL Database.

Compression is supported for Dynamics AX 2009 and Dynamics AX 2012 from the SQL Server side. Starting with Dynamics AX 2012, you also have the option to enable compression directly from within the application.
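Before going further, you can confirm these prerequisites from any query window. The following check relies only on the standard SERVERPROPERTY metadata function:

```sql
-- Check whether this instance supports data compression
-- (Enterprise/Developer edition, or SQL Server 2016 SP1+ in any edition)
SELECT
    SERVERPROPERTY('Edition')        AS Edition,
    SERVERPROPERTY('ProductVersion') AS ProductVersion,
    SERVERPROPERTY('ProductLevel')   AS ServicePackLevel;
```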

Benefits and costs of database compression

Benefits:

  1. Storage: decreases the database size (typically 40-60% smaller)
  2. Performance: improves performance when disk I/O is the bottleneck, because it reduces I/O against the disk subsystem. It also keeps more data in memory, improving overall system performance, since a compressed index is read into SQL Server memory (the buffer pool) in its compressed state.

Costs:

  1. Requires SQL Server Enterprise edition.
  2. CPU utilization: increases processor utilization (5-15%), so it may decrease performance if CPU is already a bottleneck.
  3. Performance: writing to a compressed index takes longer than writing to an uncompressed index (especially when using PAGE compression). See this blog article for more details.
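To weigh these benefits and costs against your own data before committing, SQL Server ships a built-in estimator. A minimal sketch (the table name and index id below are hypothetical examples, substitute your own) reporting the current versus estimated compressed size of one index:

```sql
-- Estimate PAGE-compression savings for a single index
-- (INVENTTRANS / index_id 1 are example values)
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'INVENTTRANS',
    @index_id         = 1,       -- NULL = all indexes on the table
    @partition_number = NULL,    -- NULL = all partitions
    @data_compression = 'PAGE';  -- or 'ROW' / 'NONE' to compare
```

The estimate is produced by sampling the data into tempdb, so it is reasonably accurate for deciding whether an index is worth compressing at all.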

Best practices for data compression – before you start:

  1. We recommend that all tables have at least a clustered index.
  2. Compression should be tested and measured on a test environment with characteristics similar to production, so that the measurements are reliable.
  3. It is recommended to apply compression during off-hours with no users on the system, as the compression process itself may impact I/O performance and any running queries. Compression goes through an ALTER INDEX ... REBUILD ONLINE operation where possible, but some indexes can only be rebuilt OFFLINE.
  4. Depending on the hardware and data size, multiple maintenance windows may be needed to cover the entire scope of the compression. Many factors can impact the duration of a compression operation, particularly the performance of the I/O subsystem, but for a given index it is common to see compression taking 1.5 to 2 times longer than a classic ALTER INDEX ... REBUILD operation.
  5. A backup should be taken before compressing, in case any unexpected damage occurs (hardware failure, sudden power outage, etc.).
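Putting the checklist together, a minimal T-SQL sketch might look like the following. The database, table, index, and path names are placeholders; adjust them to your environment:

```sql
-- 1. Find heaps: tables without a clustered index (best practice #1)
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables  t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.indexes i ON i.object_id = t.object_id
WHERE i.type = 0;  -- type 0 = heap

-- 2. Take a backup first (best practice #5)
BACKUP DATABASE [DynamicsAX]   -- placeholder database name
TO DISK = N'X:\Backup\DynamicsAX_pre_compression.bak'
WITH CHECKSUM, STATS = 10;

-- 3. Compress an index online during the maintenance window (best practice #3)
ALTER INDEX [I_EXAMPLEIDX] ON [dbo].[INVENTTRANS]   -- placeholder names
REBUILD WITH (DATA_COMPRESSION = PAGE, ONLINE = ON);
```

Note that ONLINE = ON will fail for the few index types that only support offline rebuilds, which is one more reason to run the whole scope on a test environment first.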


See part 2 of this blog series: Planning Database compression here.



Anna Mohib-Januszewska
Premier Field Engineer Dynamics AX


Jean-Bosco Bavugilije
Senior Premier Field Engineer SQL Server
