Be aware of the 701 error if you use a memory-optimized table variable in a loop

In the blog post “Importance of choosing correct bucket count of hash indexes on a memory optimized table”, I talked about performance issues caused by an incorrectly sized bucket count.  At the time, I was actually investigating an out-of-memory condition with the following error.

Msg 701, Level 17, State 103, Line 11
There is insufficient system memory in resource pool 'default' to run this query.

I simplified the scenario, but the customer’s code is very similar to the loop below.  Basically, this customer tried to insert 1 million rows into a memory-optimized table variable and process them.  He then deleted the rows from the table variable and inserted another 1 million.  His goal was to process 1 billion rows in total, but before he could get there, he would run out of memory (the 701 error above).

DECLARE @t2 AS [SalesOrderDetailType_inmem]
insert into @t2 select * from t2

while 1 = 1  --note that this customer didn’t use 1=1. I just simplified to reproduce it
begin
       delete @t2
       insert into @t2 select * from t2
end

This customer was puzzled because he deleted the existing rows each time; at any given moment there should not have been more than 1 million rows, so SQL Server should not have run out of memory.

This is actually by-design behavior, documented in “Memory-Optimized Table Variables”.  Here is what it states: “Unlike memory-optimized tables, the memory consumed (including deleted rows) by table variables is freed when the table variable goes out of scope.”  With a loop like the one above, all deleted rows are kept and continue to consume memory until the table variable goes out of scope at the end of the batch.
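Since the memory is only released when the variable goes out of scope, one way to work around this (a sketch only, assuming each batch can be processed independently; the procedure name is hypothetical) is to scope the table variable inside a stored procedure, so a fresh variable is created and destroyed on every iteration:

```sql
-- Hypothetical workaround sketch: the table variable goes out of scope
-- at the end of each procedure call, so its memory (including deleted
-- rows) is released per iteration instead of accumulating.
CREATE PROCEDURE dbo.ProcessOneBatch  -- hypothetical name
AS
BEGIN
    DECLARE @t2 AS [SalesOrderDetailType_inmem];
    INSERT INTO @t2 SELECT * FROM t2;
    -- ... process the rows here ...
END  -- @t2 goes out of scope here and its memory is freed
GO

-- The driver loop then gets a fresh variable on every call:
WHILE 1 = 1
BEGIN
    EXEC dbo.ProcessOneBatch;
END
```

Alternatively, as suggested in the comments below, a memory-optimized table (rather than a table variable) lets deleted rows be reclaimed by garbage collection.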


Complete Repro

Step 1: Create a disk-based table and populate it with 1 million rows

CREATE table t2(
  [OrderQty] [smallint] NOT NULL,
  [ProductID] [int] NOT NULL,
  [SpecialOfferID] [int] NOT NULL,
  [LocalID] [int] NOT NULL
)

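The original post does not show how t2 was populated; a minimal sketch that fills it with 1 million rows of arbitrary placeholder values (column contents are assumptions, not the customer’s data) might look like this:

```sql
-- Hypothetical population sketch: cross join a 1000-row numbers CTE
-- with itself to generate 1000 x 1000 = 1,000,000 placeholder rows.
;WITH n AS (
    SELECT TOP (1000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
    FROM sys.all_objects
)
INSERT INTO t2 ([OrderQty], [ProductID], [SpecialOfferID], [LocalID])
SELECT 1, a.i, 1, b.i
FROM n AS a CROSS JOIN n AS b;
```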
Step 2: Create a type using a memory-optimized table

CREATE TYPE [SalesOrderDetailType_inmem] AS TABLE(
  [OrderQty] [smallint] NOT NULL,
  [ProductID] [int] NOT NULL,
  [SpecialOfferID] [int] NOT NULL,
  [LocalID] [int] NOT NULL,

  INDEX [IX_ProductID] HASH ([ProductID]) WITH ( BUCKET_COUNT = 1000000)
) WITH ( MEMORY_OPTIMIZED = ON )

Step 3: Run the following query, and eventually you will run out of memory

DECLARE @t2 AS [SalesOrderDetailType_inmem]
insert into @t2 select * from t2

while 1 = 1
begin
       delete @t2
       insert into @t2 select * from t2
end


Jack Li |Senior Escalation Engineer | Microsoft SQL Server

twitter| pssdiag |Sql Nexus

Comments (6)


  2. mbourgon says:

    Okay, so if you have to loop, how would you work around that? Dynamic SQL for the whole thing?


  4. Alex Thomas says:

    Hi Jack. It would be good to know what solution options exist for this scenario. Thanks

    1. Ned Otter says:

      Alex, the solution would be to use a memory-optimized table, not a memory-optimized table variable. Please refer to Scenario B or Scenario C in this post:

      1. Lajos Nagy says:

        Thank you, this is exactly what I was looking for. We had just installed SQL 2016, and I immediately wanted to test all the cool features. I have a cursor that uses a #temp table and works perfectly; I had spent a lot of time optimizing it.

        I saw a couple of nice tests showing that memory-optimized table variables really are faster than tempdb, and I thought it was just what I needed.
        I rewrote the code, and once I got it running correctly, I left it running on the server overnight, expecting to have the required data by morning.
        The first thing I did when I got up was, of course, to check the result, and what I got was the insufficient-memory error message.
        I figured out the issue was the same one described here.
        I used a memory-optimized table variable: I inserted 16,000 records into a 23-column (varchar(15)) memory-optimized table, and 23 times deleted and re-inserted the remaining selected records to find the single best record I needed.
        I ran this 60,000 times using 12 instances.
        The server has 1 TB of memory. Some of the processes finished the work, some of them failed, and I probably used up all the resources of the server.
        Lesson learned.
