
How to solve AX Data Refresh with PowerShell?


There are so few resources at our disposal regarding PowerShell with AX, and so little we can do with them, that this is a topic I would like to raise awareness of by starting an open discussion. I will try to bring to the blog scripts that have been helping me optimize daily tasks.

For my first script, I will talk about refreshing AX data from Production to Test. This is complex and time-consuming work, especially when you have to ensure that your test environment is not pointing at production batch servers, POS terminals, web services, TFS, and so forth.

In the initial version of this script, I exported the environment configuration to CSV files. In that approach, I had to create a stored procedure to generate INSERT statements from the AX database. The stored procedure had its limitations, such as being unable to export some data types (timestamp, geography, geometry, hierarchyid, image, binary, or varbinary) and slow performance on big tables (1,000+ rows).

The script has since been improved with a central store database for UAT, DEV, or TEST, which makes the configuration easier to reuse and takes advantage of SQL Server features (compression, backups, bulk inserts).
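To give a sense of the bulk-insert piece, here is a minimal sketch of copying one AX configuration table into the store database with SqlBulkCopy; the connection variables and the table name are placeholders, not values taken from the script:

# Minimal sketch: bulk-copy one AX table into the central store database.
# $AxServer, $AxDb, $StoreServer and $StoreDb are placeholder variables.
$axConn = New-Object System.Data.SqlClient.SqlConnection("Server=$AxServer;Database=$AxDb;Integrated Security=True")
$axConn.Open()
$cmd = $axConn.CreateCommand()
$cmd.CommandText = 'SELECT * FROM dbo.SYSSERVERCONFIG'    # example configuration table
$reader = $cmd.ExecuteReader()

$bulk = New-Object System.Data.SqlClient.SqlBulkCopy("Server=$StoreServer;Database=$StoreDb;Integrated Security=True")
$bulk.DestinationTableName = 'dbo.SYSSERVERCONFIG'
$bulk.BulkCopyTimeout = 0                                  # no timeout for large tables
$bulk.WriteToServer($reader)
$bulk.Close()

$reader.Close()
$axConn.Close()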

AX Data Refresh Process Flow

Let’s look at the script and how to get your test data refreshed.

Prerequisites before you move forward:

1. Windows PowerShell Extensions for Microsoft SQL Server (PowerShellTools.msi)

2. Create Refresh Database (RFR_Database.sql)

3. Update PowerShell Script:
3.1. SQL Server name and Database name (lines 25 and 26) are required values.
3.2. Review $ParamSqlTruncate (default set to $true); this parameter truncates all destination tables before copying data, which prevents duplicate keys in the AX database and unwanted data merges in the store database.
3.3. Review $ParamSqlCompression (default set to $true); it tries ColumnStore compression first and then Page compression, and if both methods fail, compression is set to None.

4. Define your working tables in $Script:RFRTables and the truncate-only tables in $Script:RFRDelTables (please note there is no backup for these tables). I strongly recommend spending some time on this analysis and creating a well-structured set of tables. The tables shown in the example after this list are just that, an example; you should change them according to your needs.

Note: The script does not take table constraints, primary keys, or relationships between AX objects into consideration. If you export SALESLINE without SALESTABLE, you will have data integrity issues.

5. To manage Windows services, you should open PowerShell as Administrator and enable Remote Service Management, WMI, and WinRM on your firewall, for example:
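# Hedged sketch for step 5 -- run from an elevated PowerShell session. The rule
# group names below are the English defaults and may vary by OS version and language.
Enable-PSRemoting -Force                                             # configures and starts WinRM
Enable-NetFirewallRule -DisplayGroup 'Remote Service Management'
Enable-NetFirewallRule -DisplayGroup 'Windows Management Instrumentation (WMI)'

Going back to step 4, the working-table lists are plain PowerShell arrays. The tables below are only an illustrative set of AX 2012 system tables, not the script's actual defaults:

# Example only -- define the tables that fit your own environment.
# Tables in $Script:RFRTables are backed up to the store database and restored;
# tables in $Script:RFRDelTables are truncated only (no backup is taken for them).
$Script:RFRTables    = @('SYSSERVERCONFIG', 'BATCHSERVERCONFIG', 'SYSEMAILPARAMETERS')
$Script:RFRDelTables = @('SYSSERVERSESSIONS', 'BATCHJOBHISTORY', 'BATCHHISTORY')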

Once everything is configured, press F5 to run the script.
(Do not use this tool in Production. Make sure you have exported all data before deleting any table.)

The script will prompt you for an environment configuration before presenting any options. You may use an existing environment configuration or create a fresh store name (environment names are limited to 30 characters). You can load and release environment configurations at any time with L (Load Environment) and X (Release Environment); press Q to quit the script and R to return to the previous menu.

Main Menu

• The Main Menu has all the steps to back up your lower-environment configuration into the store database, execute the most common refresh steps, and restore the configuration back to your environment. This menu also gives access to some additional tools.

Additional AX Tools Menu

• Batch Jobs Maintenance, see description below.
• Check SystemSequences Table gives us the ability to check RecIds on all working tables and fix them to avoid duplicates (unlikely, but possible); a sample check query follows this list.
• AX Services Tools, see description below.
• Delete Environment Store deletes all backups and historical data for a designated environment.
• Export and Import CSV is the previous method of saving an AX configuration; it creates a stored procedure in the database (sp_GenerateInsertScript) that generates CSV files with INSERT statements for each row that might need to be restored. Exporting to CSV runs slowly with 1,000+ rows.
• Update GUID is not commonly needed, but it fixes issues with AX caching.
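As mentioned above, here is a hedged sketch of what the RecId check can look like; the script's actual query may differ, and the table name is hard-coded purely for illustration:

# If NEXTVAL is at or below MAX(RecId), new inserts from AX risk duplicate RecIds.
# The RecId sequence rows in SYSTEMSEQUENCES have NAME = 'SEQNO', and SQLDICTIONARY
# (FIELDID = 0) maps AX table IDs to SQL table names.
$query = @"
SELECT D.SQLNAME, S.NEXTVAL,
       (SELECT MAX(RECID) FROM dbo.BATCHSERVERCONFIG) AS MAXRECID
FROM SYSTEMSEQUENCES S
JOIN SQLDICTIONARY D ON D.TABLEID = S.TABID AND D.FIELDID = 0
WHERE D.SQLNAME = 'BATCHSERVERCONFIG' AND S.NAME = 'SEQNO'
"@
Invoke-Sqlcmd -ServerInstance $SqlServer -Database $AxDatabase -Query $query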

Batch Jobs Maintenance Menu

• Batch Jobs Maintenance assists when moving batch groups out of batch servers or to a specific batch server. It is also possible to delete batch history from the database.
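To illustrate the kind of change this menu automates, here is a hedged sketch that points every batch group at a single test AOS instance and clears the batch history. The SERVERID value is a placeholder (instance number @ server name), and the script itself offers finer-grained options:

# BATCHSERVERGROUP maps batch groups to AOS instances; batch history lives in
# BATCHJOBHISTORY and BATCHHISTORY. '01@TESTAOS01' is a made-up SERVERID.
$query = @"
UPDATE BATCHSERVERGROUP SET SERVERID = '01@TESTAOS01';
DELETE FROM BATCHJOBHISTORY;
DELETE FROM BATCHHISTORY;
"@
Invoke-Sqlcmd -ServerInstance $SqlServer -Database $AxDatabase -Query $query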

AX Service Tools Menu

• It is possible to Start, Stop, Restart, and Check AOS services for a specific environment. If there are no servers in the script database, you can type a server name, or multiple server names separated by commas.
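A minimal sketch of the remote service check; 'AOS60$*' matches the default AX 2012 AOS service names (for example, AOS60$01), and the server names are placeholders:

# Requires Remote Service Management to be enabled on the target servers (step 5).
$servers = 'TESTAOS01', 'TESTAOS02'          # or the servers saved for the environment
foreach ($s in $servers) {
    Get-Service -ComputerName $s -Name 'AOS60$*' | ForEach-Object {
        Write-Host "$s : $($_.Name) is $($_.Status)"
        # $_ | Restart-Service               # likewise Stop-Service / Start-Service
    }
}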

In conclusion, this script can bring significant improvements by simplifying operational tasks when copying live data to all your lower environments. I hope it helps you and your customers realize the full potential of PowerShell for automating AX administration tasks. Let's keep the conversation going and create more interesting scripts for AX using PowerShell.

CodePlex Download: https://dynamicsrefresh.codeplex.com/

Thanks,
Bruno Ferreti

 

DISCLAIMER
Microsoft Corporation. All rights reserved.
Do not use in Production. Sample scripts in this guide are not supported under any Microsoft standard support program or service. These sample scripts are provided without warranty of any kind expressed or implied. Microsoft disclaims all implied warranties including, without limitation, any implied warranties of merchantability or fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages. The opinions and views expressed in this script are those of the author and do not necessarily state or reflect those of Microsoft. Do this at your own risk. The author will not be held responsible for any damage you incur when running, modifying or carrying out these scripts.
Comments (6)
  1. Hi Bruno

    Great post!

    To be honest, I still have to look at the script more closely, but what are the benefits compared with the Test Data Transfer Tool?

    1. Mötz Jensen says:

      Hi Denis,

      I have looked a bit into the code, and I have also been working with the Test Data Transfer Tool.

      From my point of view this approach is more robust. You get a full set of data, and you don't have to ensure that you remembered to generate the needed metadata files for each table.

      Reading the limitations of the Test Data Transfer Tool (https://ax.help.dynamics.com/en/wiki/test-data-transfer-tool-beta-for-microsoft-dynamics-ax-2012/#limitations) reveals that you cannot be sure the data you import is stored in a referentially intact state.

      Looking at it from an architectural point of view, it doesn't make sense that data stored in an RDBMS (SQL Server) needs to be exported into some proprietary file format, only to be imported into a receiving RDBMS (SQL Server).

      The Test Data Transfer Tool has its place and use: exporting a subset of data from a system. It could be a way to supply a BI team with valid data without exporting a full database copy. But using the tool to export a full set of data from a PRODUCTION system and restore it into a UAT system doesn't make sense.

      Many customers and partners have over the years relied on a normal backup/restore from SQL Server and built their own scripts to change the configuration data they receive from PROD, overwriting it with a valid configuration for the receiving system.

      Now that Microsoft has come forward with a framework that supports that model, we as a community can use it and build better tools for the projects we are running.

      Please note that the current solution only supports AX 2012. It would be a very small task to get it working with any version of AX, as long as you identify the system/base tables that correlate to the specific release.

      1. Hi Mötz

        Agree with you.

        However, I often use TDTT and it is a really smart tool.
        As you know, you can also filter records and columns.
        Finally, it is very fast.

        For BI, I usually use the Entity Store feature introduced by Microsoft in May 2016: https://blogs.msdn.microsoft.com/dynamicsaxbi/2016/05/02/introducing-entity-store-for-dynamics-ax-2012-r3/

        Anyway, nice stuff!
        Regards
        Denis

        1. @Mötz Thanks, I wouldn’t have explained that well. =)

          @Denis There is no doubt the Test Data Transfer Tool is a great tool. However, the focus here is on parameter tables instead of transactional tables, keeping the process manageable and quick. With a central configuration database, we can compare, merge, change, and create versions of the data.
          It's easy to restore a SQL backup to different environments and run the script for each one, restoring the appropriate parameters. Another benefit I would point out is that, after an initial analysis by the AX team, the process can be performed by a DBA with no access to the application, no deep knowledge of the AX table structure, and no metadata.

          Bruno

          1. Mötz Jensen says:

            @Bruno Thank you for getting this framework out to the community; let us hope that people will pick it up and contribute to make our everyday a bit easier 😉 The point about being able to have a DBA execute these steps is actually a key point, and on top of that, using PowerShell enables the community to make a 100% automated solution. The first thing I will “fix” is to read from a network share with a .bak file instead of doing a backup directly on PRODUCTION. Then you can have the DBA team drop a .bak file into the folder from the normal backup routine, and you don't need permissions on PRODUCTION.

            @Denis Fast isn't always what we want if we can't trust the data (because of the referential integrity limitations). My BI example was more about the point that you might want to supply a third party with a subset of the data. Entity Store solves the real BI exposure of data from AX 2012, so I couldn't agree more with you. Next time I will find a better example than BI when discussing the Test Data Transfer Tool 😉

          2. Mötz Jensen says:

            Dang!

            @Bruno I just looked at my comment and triple-checked the code. You already built in the option to pick up the .bak file from a specific path, so we don't need permissions on PRODUCTION. Great work!

