Slow performance and out-of-memory issues caused by a large batch

Recently, I worked with a customer who reported out-of-memory errors from their SQL Server 2008 R2 with 32GB of RAM. One of the errors was error 701 (Error: 701, Severity: 17, State: 123. There is insufficient system memory in resource pool 'default' to run this query.)

After taking a memory dump, we discovered that the particular query consuming a large amount of memory looked like the one below. I rewrote it so that it doesn't reference the customer's table names.

declare  @tempIDs TABLE (id int primary key);
insert into @tempIDs values (0);
insert into @tempIDs values (1);
insert into @tempIDs values (2);
insert into @tempIDs values (3);
insert into @tempIDs values (4);
insert into @tempIDs values (5);
insert into @tempIDs values (6);
.... 1.5 million inserts like this
select * from sys.objects where object_id in (select id from @tempIDs)


From the application side, the following C# code generates the query above:

static void InsertMethod()
{
    StringBuilder builder = new StringBuilder();
    builder.Append("declare @tempIDs TABLE (id int primary key);");

    for (int i = 0; i < 1000000; i++)
    {
        builder.Append(string.Format("insert into @tempIDs values ({0}); ", i));
    }

    builder.Append("select * from sys.objects where object_id in (select id from @tempIDs)");

    SqlConnection conn = new SqlConnection(ConnectionString);
    conn.Open();
    SqlCommand cmd = conn.CreateCommand();
    cmd.CommandText = builder.ToString();
    cmd.ExecuteNonQuery();
}


This query does 1 million inserts into a table variable and then joins with other tables. As you can imagine, the query batch is quite large. This kind of batch causes two issues: performance is slow, and it consumes a large amount of memory. This is because SQL Server must parse each statement, which not only takes time but also consumes memory to store internal structures. In addition, using a table variable this way with a large number of rows is inappropriate. See this post for details.

So what’s the solution?

There are various solutions depending on your situation. For example, you can use SSIS or BULK INSERT to get the data into a permanent table and then join with other tables. But do NOT use an IN or OR clause. If you do, you will end up with 1 million values in the IN or OR clause, and with that many values this approach will also cause slow performance and memory errors.
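As a rough sketch of the permanent-table approach (the file path, file format, and table name here are hypothetical and just for illustration), the load and the join could look like this in T-SQL:

create table dbo.TempIDs (id int primary key);

-- load the 1.5 million ids in one set-based operation from a flat file
-- (assumption: one comma-less id per line in the file)
bulk insert dbo.TempIDs
   from 'C:\data\ids.txt'
   with (rowterminator = '\n');

-- join against the loaded table instead of building a giant IN (...) list
select o.*
from sys.objects o
join dbo.TempIDs t on o.object_id = t.id;

Because the values arrive through the bulk-load path rather than as individual INSERT statements, SQL Server does not have to parse a million-statement batch.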

If you have to generate the values dynamically and join with other tables, use the SqlBulkCopy object to pump the data into a temp table and then do the joins. The following C# code example achieves that. It is fast, and the memory requirement is much smaller.

static void BulkCopy()
{
    SqlConnection conn = new SqlConnection(ConnectionString);
    conn.Open();
    SqlCommand cmd = conn.CreateCommand();
    cmd.CommandText = "create table #tempIDs (id int primary key);";
    cmd.ExecuteNonQuery();

    // build the rows in memory
    DataTable dt = new DataTable();
    dt.Columns.Add("id", typeof(int));
    for (int i = 0; i < 1000000; i++)
    {
        DataRow row = dt.NewRow();
        row["id"] = i;
        dt.Rows.Add(row);
    }

    // stream the rows into the temp table in a single bulk operation
    SqlBulkCopy bulkcopy = new SqlBulkCopy(conn);
    bulkcopy.DestinationTableName = "#tempIDs";
    bulkcopy.WriteToServer(dt);

    cmd.CommandText = "select * from sys.objects where object_id in (select id from #tempIDs)";
    SqlDataReader reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        // process each row
    }
    reader.Close();
}




Jack Li  | Senior Escalation Engineer | Microsoft SQL Server Support

Comments (4)

  1. TheSQLGuru says:

    The best way (BY FAR) to accomplish this particular need is using a table of Numbers.  If you have a table with an int that has 1-N in it you can do this set-based insert:

    INSERT @tempIDs

    SELECT N

    FROM DBATools.dbo.Numbers

    WHERE N BETWEEN 1 AND 1500000

  2. Shishir Khandekar says:

    Jack, it is interesting that you mention this. We have a similar issue, and I believe we also filed an RFC with the dev team for this. While we haven't run out of memory for this, would you be able to shed some light on whether the behavior is the same for the following code that uses user-defined table types? We have situations where the INSERTs into the table type could run into 50,000+ in number, and this batch is generated by C# code using the usual classes. In short, when using UDTs, the .NET classes generate this type of batch for SQL Server.

    Is there a better way to tackle this scenario?

    Thanks in advance


    ------ Code generated by .NET when using UDTs to pass data to stored procedures ------

    declare @p1 dbo.udt_OrderItems

    insert into @p1 values(1,11)

    insert into @p1 values(2,2)

    insert into @p1 values(3,3)

    insert into @p1 values(4,4)

    insert into @p1 values(5,5)

    insert into @p1 values(6,6)

    insert into @p1 values(7,7)

    insert into @p1 values(8,8)

    insert into @p1 values(9,9)

    insert into @p1 values(10,10)

    insert into @p1 values(11,11)

    insert into @p1 values(12,12)

    insert into @p1 values(13,13)

    insert into @p1 values(14,14)

    exec sp_AcceptOrder @OrderItems=@p1

    ------ End of code generated by .NET when using UDTs to pass data to stored procedures ------

  3. Shishir Khandekar says:

    Jack, In the above code read UDT as User Define Table Type. dbo.udt_OrderItems is a user defined table type and not user defined type.

  4. EdVassie says:

    Jeff Moden has written a good article about using a permanent Numbers table – see…/62867

    Connect item 258733 suggests adding a built-in Numbers table as a feature to SQL Server – it certainly has got my vote.