How to Compile a Database Twice as Fast (or faster)


Compiling a complete NAV database can take quite a while. Even on powerful development machines with a lot of CPU cores this is still the case – the development environment wasn’t designed for the multi-core era. The only way to speed things up is to use separate instances of the development environment and have each compile a (distinct) subset of the objects in the database. With the new Development Environment Commands for PowerShell that were included in the Microsoft Dynamics NAV 2015 Development Shell, this has become a lot easier.

Before heating up those cores, let's first introduce the command that we need for this: Compile-NAVApplicationObject. In the following example we'll assume that the database and Development Shell reside on the same machine. To compile all non-compiled objects in a database (here named MyApp), the command simply takes a parameter that specifies the database and, optionally, whether and how schema changes should be synchronized:

Compile-NAVApplicationObject -DatabaseName MyApp -SynchronizeSchemaChanges No

To compile all objects in a database regardless of their current compiled state, use the Recompile switch:

Compile-NAVApplicationObject -DatabaseName MyApp -SynchronizeSchemaChanges No -Recompile

The command also takes a filter, e.g.:

Compile-NAVApplicationObject -DatabaseName MyApp -Filter ID=1..100

compiles all non-compiled objects with an ID in the range 1 to 100.
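
Filters on multiple fields of the Object table can be combined with semicolons, using the same Field=Value;Field=Value syntax the development environment uses elsewhere. As a small sketch of that (note the quotes, which stop PowerShell from treating the semicolon as a statement separator), the following compiles only the non-compiled codeunits in that range:

Compile-NAVApplicationObject -DatabaseName MyApp -Filter 'Type=Codeunit;ID=1..100'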

Now, to parallelize the compilation process, we need to partition the set of objects into distinct sets that can be compiled in parallel. The most straightforward way to do this is based on object type. For each object type we can start a compilation job using the AsJob parameter. With this parameter, an instance of the development environment is started in the background and a handle to the background job is returned. PowerShell comes with a set of commands to work with jobs, for instance, to get the output of a job (Receive-Job) or to wait for a job to finish (Wait-Job). Occasionally, race conditions can occur while updating the Object table; as a result, some objects may fail to compile. Therefore, after all background jobs have completed, we compile all non-compiled objects in a final sweep. This is all we need to understand the following function, which compiles all objects in a database in 7 parallel processes:

function ParallelCompile-NAVApplicationObject
(
    [Parameter(Mandatory=$true)]
    $DatabaseName
)
{
    # Start one background compilation job per object type.
    $objectTypes = 'Table','Page','Report','Codeunit','Query','XMLport','MenuSuite'
    $jobs = @()
    foreach ($objectType in $objectTypes)
    {
        $jobs += Compile-NAVApplicationObject $DatabaseName -Filter Type=$objectType -Recompile -SynchronizeSchemaChanges No -AsJob
    }

    # Wait for all jobs to finish and show their output, then compile any
    # objects that failed because of locking conflicts on the Object table.
    Receive-Job -Job $jobs -Wait
    Compile-NAVApplicationObject $DatabaseName -SynchronizeSchemaChanges No
}

  
Just for fun, let’s measure the performance gain. We can do this using Measure-Command:

Measure-Command { Compile-NAVApplicationObject MyApp -SynchronizeSchemaChanges No -Recompile }
Measure-Command { ParallelCompile-NAVApplicationObject MyApp } 

 

These two screenshots from Task Manager illustrate the difference in CPU utilization: while the non-parallel version runs, CPU utilization hovers around 40%; while the parallel version runs, it is maxed out at 100%.

[Screenshot: Task Manager while running the non-parallel version – CPU utilization around 40%]

[Screenshot: Task Manager while running the parallel version – CPU utilization at 100%]

On my laptop (with 2 CPU cores) compilation of the 4148 objects in the W1 application takes 8 minutes and 46 seconds using the non-parallel version; using the parallel version it takes only 4 minutes and 32 seconds. Machines with more CPU cores may produce even better results. Note that when parallelizing this way (i.e., based on object type) only four processes are active most of the time – compilation of Queries, XMLports and MenuSuites finishes relatively quickly. So if you have a machine with a lot of cores (say 6 or 8) that you want to put to use, you need to find a way to partition the set of objects into a larger number of smaller sets.
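
If your database roughly follows the standard ID layout, one simple refinement is to split the biggest object types into ID bands. The function below is a minimal sketch along those lines; the function name, the band boundary of 50000 and the exact type split are assumptions you should adapt to the object distribution in your own database (the .. and | operators are standard C/SIDE filter syntax):

function ParallelCompile-NAVApplicationObjectByRange
(
    [Parameter(Mandatory=$true)]
    $DatabaseName
)
{
    # One filter per background job. The big object types are split into two
    # ID bands each so that more cores stay busy; the 50000 boundary is only
    # a guess and should be tuned to your database.
    $filters =
        'Type=Table;ID=..49999',    'Type=Table;ID=50000..',
        'Type=Page;ID=..49999',     'Type=Page;ID=50000..',
        'Type=Codeunit;ID=..49999', 'Type=Codeunit;ID=50000..',
        'Type=Report;ID=..49999',   'Type=Report;ID=50000..',
        'Type=Query|XMLport|MenuSuite'

    $jobs = @()
    foreach ($filter in $filters)
    {
        $jobs += Compile-NAVApplicationObject $DatabaseName -Filter $filter -Recompile -SynchronizeSchemaChanges No -AsJob
    }

    # As in the version above: wait for all jobs, then sweep up any objects
    # that failed to compile due to locking conflicts on the Object table.
    Receive-Job -Job $jobs -Wait
    Compile-NAVApplicationObject $DatabaseName -SynchronizeSchemaChanges No
}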

Who beats 4 minutes and 32 seconds? And please share how you did it!

Comments (9)

  1. Kine says:

    Is it possible to use the same technique in NAV 2013?

  2. basgraaf says:

    Hi Kine,

    The short answer is: no, this technique won't work, because the Development Environment Commands for PowerShell (including Compile-NAVApplicationObject) are not available in NAV2013.

    However, the Development Environment Command that these PowerShell commands are based on (finsql.exe) is part of NAV2013. See msdn.microsoft.com/…/hh168571(v=nav.70).aspx. Although the interface has changed somewhat, it might not be a lot of work to adapt the script module that implements Compile-NAVApplicationObject such that it works with NAV2013. The script module is named Microsoft.Dynamics.Nav.Ide.psm1 and you'll find it in the RTC installation folder.
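
    For reference, a direct invocation for compiling objects looks roughly like the following sketch; check the linked MSDN page for the exact parameter names and syntax, and treat MyServer/MyApp as placeholders:

    finsql.exe command=compileobjects, servername=MyServer, database=MyApp, filter=Type=Codeunit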

    Also note that it is our intention to make the Development Environment Commands for PowerShell available for NAV2013R2, which would reduce the amount of work required to adapt it to NAV2013.

    Hope this helps you.

    _Bas

  3. Sigh,

    Microsoft really shouldn't be drawing attention to the "compile" times for RTC code; they're really very bad.

    Basically all that's done for a 'Dev Env' compile is a direct source->source translation from C/AL to C#. This sort of primitive translation should run at millions of lines per second (at the very least several hundred thousand) on a modern machine. What I guess is happening is that the C# translator is written in DotNET and for some obscure reason gets recompiled (to machine code) for every single object.

    This means that a V7.0 compile (which DOES NOT even contact the service tier) takes a LOT longer than V5 (over ten times as long).

    On the plus side it does seem like V8 is faster than V7, nowhere near V5 speeds but getting better.

    I did always wonder why it was done this way though; the reasonable method would have been to use the existing XML export in the 'Classic client' to save the C/AL for the service tier to translate on the big honking machine OR do a full compile to DLL on the client. Not this weird half-and-half mixture we've got now???

  4. Kine says:

    Hello Bas,

    Thanks for the answer. I just wanted to know if there were some changes in finsql.exe to allow that, or if it is still the same. I was trying to simulate the same on NAV 2013 but had some problems; it seems that the problem is in the PowerShell part and not in finsql.exe (the jobs are running, but everything goes slowly and the CPU is idle… but again, it seems that I missed something in the PowerShell).

  5. Here are some PowerShell compilation functions that I have written which can be used from NAV 2013 upwards. The text is in German, but the code is in English :).

    http://www.msdynamics.de/viewtopic.php

    Note that the switch -SynchronizeSchemaChanges is unsupported in NAV 2013 (R2) and causes a runtime error in these versions.

  6. Pallea says:

    @Robert de Bath

    Dynamics NAV contains around 5000 objects, many of which do not have any code added to them and are therefore pretty fast to compile. To me, the compile time is not crucial. I don't care if it takes 5 minutes or 50 seconds, because I don't need to do a full compile several times a day. And honestly, I cannot see that it should be necessary either.

    If you imagine that you had your "own Microsoft" developer for 100 hours, would you use those 100 hours to fix this "problem with compile time" or spend them solving other issues? I would choose the other issues, which have a much larger impact on what I am working with as a developer.

  7. Jens Glathe says:

    Compile times are more or less irrelevant for most of us, but it actually is quite slow, so a bit of optimization would be nice. When you do a lot of merging, the "recompile all" is a must, and having it automated and fast would be a plus. But the 100 hours of the developer would be better spent on fixing that dreaded GUI (search, sorting, filtering and drilldown would be the most pressing IMO). 🙂

  8. Anonymous says:

    @Pallea

    I don't disagree with what you're saying about developer resources on a full compile; I would much rather it be correct than fast. Nevertheless, on V7 it is so slow that it gives a very noticeable pause for just one object. This means some people aren't saving as often and so are more likely to lose changes when the client crashes (which is also happening more often, but that's another matter!). This is probably why it has in fact been improved for V8.

    And BTW, I haven't checked, but statistically it's likely that the fixed overheads paid to compile every object are what push the full compile time up the most; they have to be paid 5000 times, whereas a huge posting codeunit is only compiled once.

    @Everyone

    If you're in a hurry, there's nothing to stop you starting up multiple Development environments connected to a single database and compile different sets of objects from each one.

    Also, 90% of the time, if I want to do a "full compile" that actually means only the objects that have been modified by US, not the ones that are still "pure Microsoft".
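
    With the PowerShell command from this post, that narrower compile might look like the following sketch (it assumes the filter accepts the Object table's Modified field, which you should verify against your version):

    Compile-NAVApplicationObject -DatabaseName MyApp -Filter 'Modified=Yes' -SynchronizeSchemaChanges No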

    1. Thalon says:

      I had no luck with parallelizing. I tried this snippet yesterday and it gave me a lot of errors in the log, because "Table Object was locked by another user". As this script was the only user accessing the database (except for the middle-tier service) at the time, I will not recommend this to my coworkers 🙁