Memory consumption of table variables


I recently made an interesting discovery that I want to share with you. I was investigating a recursive algorithm that scaled nicely – until it ran out of memory. For each level of the recursion a new class instance was created, and this class contained multiple member variables of a table type (as in CustTable). I decided to measure the memory footprint of this construct, and I was quite surprised by my findings.
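
To make the scenario concrete, here is a hypothetical sketch of that pattern – the class name, the second buffer and the recursion driver below are illustrative only, not the actual production code:

/// <summary>
/// Illustrative only: one new instance per recursion level, each holding table buffers
/// </summary>
class RecursiveCalcLevel
{
    BOMCalcTrans bomCalcTrans;   // table buffer member - the expensive part
    InventTable  inventTable;    // another table buffer member

    public void processLevel(int _depth)
    {
        RecursiveCalcLevel child;

        if (_depth <= 0)
        {
            return;
        }

        child = new RecursiveCalcLevel();
        child.processLevel(_depth - 1);   // every level keeps its own buffers alive
    }
}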

Consider these three classes – each stores the same amount of data, but in different ways:

/// <summary> 
/// Test class containing a table buffer and setting 20 fields 
/// </summary> 
class Class1 
{ 
    BOMCalcTrans bomCalcTrans; 

    public void new() 
    { 
        bomCalcTrans.ConsumptionVariable = 50; 
        bomCalcTrans.ConsumptionConstant = 50; 
        bomCalcTrans.CostPriceQty = 50; 
        bomCalcTrans.CostPrice = 50; 
        bomCalcTrans.CostPriceQtySecCur_RU = 50; 
        bomCalcTrans.CostMarkup = 50; 
        bomCalcTrans.NumOfSeries = 50; 
        bomCalcTrans.SalesPriceQty = 50; 
        bomCalcTrans.SalesMarkupQty = 50; 
        bomCalcTrans.CostMarkupQty = 50; 
        bomCalcTrans.CostMarkupQtySecCur_RU = 50; 
        bomCalcTrans.CostPriceSecCur_RU = 50; 
        bomCalcTrans.CostMarkupSecCur_RU = 50; 
        bomCalcTrans.NetWeightQty = 50; 
        bomCalcTrans.CostPriceUnit = 50; 
        bomCalcTrans.SalesPrice = 50; 
        bomCalcTrans.SalesMarkup = 50; 
        bomCalcTrans.SalesPriceUnit = 50; 
        bomCalcTrans.SalesPriceFallBackVersion = 'abc'; 
        bomCalcTrans.CostPriceFallBackVersion = 'def'; 
    } 
} 

/// <summary> 
/// Test class containing a hashTable with 20 key/value pairs 
/// </summary> 
class Class2 
{ 
    System.Collections.Hashtable hashTable; 

    public void new() 
    { 
        hashTable = new System.Collections.Hashtable(); 
        hashTable.Add(fieldNum(BOMCalcTrans, ConsumptionVariable), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, ConsumptionConstant), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostPriceQty), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostPrice), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostPriceQtySecCur_RU), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostMarkup), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, NumOfSeries), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesPriceQty), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesMarkupQty), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostMarkupQty), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostMarkupQtySecCur_RU), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostPriceSecCur_RU), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostMarkupSecCur_RU), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, NetWeightQty), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, CostPriceUnit), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesPrice), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesMarkup), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesPriceUnit), 50);
        hashTable.Add(fieldNum(BOMCalcTrans, SalesPriceFallBackVersion), 'abc');
        hashTable.Add(fieldNum(BOMCalcTrans, CostPriceFallBackVersion), 'def');
    } 
} 

/// <summary> 
/// Test class containing 20 member variables with values 
/// </summary> 
class Class3 
{ 
    InventQty bomCalcTransConsumptionVariable; 
    InventQty bomCalcTransConsumptionConstant; 
    CostPrice bomCalcTransCostPriceQty; 
    CostPrice bomCalcTransCostPrice; 
    CostPriceSecCur_RU bomCalcTransCostPriceQtySecCur_RU; 
    CostPrice bomCalcTransCostMarkup; 
    InventQty bomCalcTransNumOfSeries; 
    InventSalesPrice bomCalcTransSalesPriceQty; 
    InventSalesMarkup bomCalcTransSalesMarkupQty; 
    CostMarkup bomCalcTransCostMarkupQty; 
    InventPriceMarkupSecCur_RU bomCalcTransCostMarkupQtySecCur_RU; 
    CostPriceSecCur_RU bomCalcTransCostPriceSecCur_RU; 
    CostPriceSecCur_RU bomCalcTransCostMarkupSecCur_RU; 
    ItemNetWeight bomCalcTransNetWeightQty; 
    PriceUnit bomCalcTransCostPriceUnit; 
    CostingVersionId bomCalcTransCostPriceFallBackVersion; 
    InventSalesPrice bomCalcTransSalesPrice; 
    InventSalesMarkup bomCalcTransSalesMarkup; 
    PriceUnit bomCalcTransSalesPriceUnit; 
    CostingVersionId bomCalcTransSalesPriceFallBackVersion; 

    public void new() 
    { 
        bomCalcTransConsumptionVariable = 50; 
        bomCalcTransConsumptionConstant = 50; 
        bomCalcTransCostPriceQty = 50; 
        bomCalcTransCostPrice = 50; 
        bomCalcTransCostPriceQtySecCur_RU = 50; 
        bomCalcTransCostMarkup = 50; 
        bomCalcTransNumOfSeries = 50; 
        bomCalcTransSalesPriceQty = 50; 
        bomCalcTransSalesMarkupQty = 50; 
        bomCalcTransCostMarkupQty = 50; 
        bomCalcTransCostMarkupQtySecCur_RU = 50; 
        bomCalcTransCostPriceSecCur_RU = 50; 
        bomCalcTransCostMarkupSecCur_RU = 50; 
        bomCalcTransNetWeightQty = 50; 
        bomCalcTransCostPriceUnit = 50; 
        bomCalcTransSalesPrice = 50; 
        bomCalcTransSalesMarkup = 50; 
        bomCalcTransSalesPriceUnit = 50; 
        bomCalcTransSalesPriceFallBackVersion = 'abc';  
        bomCalcTransCostPriceFallBackVersion = 'def'; 
    } 
}

Now, let us create 1,000,000 instances of each class, store them all in a List(Types::Class), and measure the memory footprint for each of the three classes – running both as IL and as p-code.
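
For reference, a minimal sketch of such a measurement harness could look like the job below. This is not the exact benchmark behind the numbers in this post, and System.GC::GetTotalMemory is only a rough proxy – it covers the managed heap when running as IL:

/// <summary>
/// Rough measurement sketch - repeat with Class2 and Class3
/// </summary>
static void MeasureClass1Footprint(Args _args)
{
    List  instances = new List(Types::Class);
    int   i;
    int64 before;
    int64 after;

    before = System.GC::GetTotalMemory(true);    // force a full collection first

    for (i = 1; i <= 1000000; i++)
    {
        instances.addEnd(new Class1());
    }

    after = System.GC::GetTotalMemory(true);     // everything is still referenced by the list

    info(strFmt("Approximate footprint: %1 bytes", after - before));
}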

[Chart: memory footprint of 1,000,000 instances of Class1, Class2 and Class3 – running as IL and as p-code]

Notice that using a table buffer as a member variable consumes roughly 6x as much memory as having each field as an individual member variable – apparently there is a huge overhead associated with the xRecord + Common functionality.

My conclusion – apparently you cannot have nice code and low memory consumption at the same time. My advice – be careful when using table buffers as member variables, especially in batch scenarios, i.e. classes derived from RunBaseBatch or used by such classes.
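
If a method hands you a buffer, one way to follow that advice is to copy just the scalar values you need into plain member variables (as Class3 does) and let the buffer go out of scope. The class below is purely illustrative:

/// <summary>
/// Illustrative only: keep scalar copies instead of the buffer itself
/// </summary>
class BomCalcResult
{
    CostPrice        costPrice;
    InventSalesPrice salesPrice;

    public void initFromBomCalcTrans(BOMCalcTrans _bomCalcTrans)
    {
        costPrice  = _bomCalcTrans.CostPrice;
        salesPrice = _bomCalcTrans.SalesPrice;
        // the buffer itself is not stored, so its xRecord overhead is not retained
    }
}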


Comments (3)

  1. Tommy Skaue says:

    That is indeed *very* interesting! Table buffers having a bigger memory footprint makes sense, but I never thought the difference would be this big.

    Maybe completely unrelated, but I will attend the Technical Conference in October, and one of the things I hope to get some kind of answer to is the huge change in memory consumption for AX2012 compared to AX2009. What is going on?

  2. Tommy Skaue says:

    For a more in-depth look at memory handling in AX 2012, have a look at this article:

    blogs.msdn.com/…/memory-usage-in-xppil-code.aspx

  3. Michael Fruergaard Pontoppidan says:

    Thanks for the link, Tommy.

    Worth calling out here is that the consumption figures listed in this post have nothing to do with the GC not yet having collected memory. The data is intended to be kept in memory (everything is still referenced) – the size is just alarmingly large.

    I will also be at the Technical Conference – but I'm not sure I have the answer to your question, nor do I think there is a single, simple answer to it.