Too many single use plans, now what?

Hello,

I’ve been asked the question in the title at least twice in the past month, but the question is really incomplete – I would rather be asked “I have too many single use plans taking up a lot of memory in my server, can I do something about it?”.

The answer would be the classic “it depends”, and it depends on the answers to a few more questions:

  • What is “too much” for you?
  • Are we talking about the number of plans or the amount of memory they are taking?
  • And what is that value in relation to the committed memory in SQL Server?
  • And even, what type of single use plans are we talking about?

So, for the sake of this post, let’s say that for a given SQL Server we conclude that a great deal of memory is being taken by single use plans – in this scenario, roughly 10% of total committed memory, which is 30GB on a 32GB box.
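
If you want to see where your own server stands, a quick check along the lines of the sketch below does the trick. I am assuming SQL Server 2012 or later here, where sys.dm_os_sys_info exposes committed_kb; the full script further down in this post also covers older builds.

-- Memory taken by single use plans vs. total committed memory (SQL Server 2012+ assumed)
SELECT SUM(CAST(size_in_bytes AS bigint))/1024/1024 AS single_use_plans_MB
FROM sys.dm_exec_cached_plans
WHERE usecounts = 1;

SELECT committed_kb/1024 AS committed_MB
FROM sys.dm_os_sys_info;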

Checking for clock hand moves over the CACHESTORE_SQLCP cache store (where adhoc and prepared plans live), we find these are frequent – although at the moment they do not represent a severe memory issue in the server. Still, with almost 4GB of memory being used by single use plans, I find that too much of a waste, when that memory could be used more efficiently.
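
For reference, this is the kind of check I mean – a minimal sketch over sys.dm_os_memory_cache_clock_hands, looking at how often the internal and external clock hands have swept the SQLCP store and how many entries they have removed:

-- Clock hand activity for the adhoc/prepared plan cache store
SELECT name, type, clock_hand, clock_status, rounds_count, removed_all_rounds_count
FROM sys.dm_os_memory_cache_clock_hands
WHERE type = 'CACHESTORE_SQLCP';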

In many of these cases we might default to thinking about single use adhoc code being executed, right? And while the usual recommendation is to refactor the code into stored procedures instead of running adhoc code (other strategies to leverage plan reuse as much as possible do exist, such as sp_executesql), if a more administrative solution is needed, we have the “Optimize for Ad hoc Workloads” server option at our disposal. This will alleviate the scenario I started out by describing – think of it as a workaround to make conservative use of memory when your workload relies heavily on adhoc code, leading to plan cache bloat.
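
For reference, enabling it is a two-step sp_configure change (it is an advanced option). Keep in mind it only affects plans compiled after it is turned on – existing cache entries are not retroactively stubbed:

-- Enable "optimize for ad hoc workloads" (advanced option)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;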

But what if you look in sys.dm_exec_cached_plans and find that the most prevalent single use plans are not *real* adhoc plans, but rather prepared plans? In that case the “Optimize for Ad hoc Workloads” server option doesn’t fit our needs. So is there any kind of administrative workaround for this scenario?
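
A quick way to answer that is to break the single use entries down by object type, along the lines of the sketch below – if the Prepared rows dominate in count and size, the server option alone will not help:

-- Single use plans broken down by object type
SELECT objtype, cacheobjtype,
       COUNT_BIG(*) AS total_plans,
       SUM(CAST(size_in_bytes AS bigint))/1024/1024 AS size_MB
FROM sys.dm_exec_cached_plans
WHERE usecounts = 1
GROUP BY objtype, cacheobjtype
ORDER BY size_MB DESC;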

Below is the code I used on a couple of occasions as a temporary workaround, until a permanent, code-based solution could be implemented to limit the high rate of single use plans. It can run inside a job that you schedule according to your needs: we set the threshold of memory taken by single use prepared plans that we consider the limit, after which a cache clear operation is triggered with DBCC FREESYSTEMCACHE (specifying WITH MARK_IN_USE_FOR_REMOVAL), and a summary of what was done is output. Please see the note below before using this.

SET NOCOUNT ON;

DECLARE @sqlcmd NVARCHAR(4000), @params NVARCHAR(500), @sqlmajorver int, @ErrorMessage NVARCHAR(1000)
DECLARE @singleplan_cnt int, @singleplan_MB bigint, @committed_MB bigint -- Does not include reserved memory in the memory manager
DECLARE @memthreshold_pct smallint

-- Set percentage of committed memory used by prepared single use plans that triggers a cache eviction
SET @memthreshold_pct = 10

SELECT @sqlmajorver = CONVERT(int, (@@microsoftversion / 0x1000000) & 0xff);

IF @sqlmajorver >= 9 AND @sqlmajorver < 11
BEGIN
    SET @sqlcmd = N'SELECT @committedOUT=(bpool_committed*8)/1024 FROM sys.dm_os_sys_info (NOLOCK)'
END
ELSE IF @sqlmajorver >= 11
BEGIN
    SET @sqlcmd = N'SELECT @committedOUT=committed_kb/1024 FROM sys.dm_os_sys_info (NOLOCK)'
END

SET @params = N'@committedOUT bigint OUTPUT';

EXECUTE sp_executesql @sqlcmd, @params, @committedOUT=@committed_MB OUTPUT;

-- Number of single use prepared plans and the memory they take
SELECT @singleplan_cnt = COUNT(*), @singleplan_MB = SUM(CAST(size_in_bytes AS bigint))/1024/1024
FROM sys.dm_exec_cached_plans (NOLOCK)
WHERE objtype = 'Prepared' AND usecounts = 1

-- Multiply before dividing so integer division does not truncate the percentage to zero
IF (@singleplan_MB * 100) / @committed_MB >= @memthreshold_pct
BEGIN
        SELECT @ErrorMessage = CONVERT(NVARCHAR(50), GETDATE()) + ': SQL Plan cache will be evicted due to single use Prepared Plans taking ' 
           + CONVERT(NVARCHAR(12), @singleplan_MB) + 'MB, over ' + CONVERT(NVARCHAR(12), @memthreshold_pct) 
           + ' percent of total committed memory (' + CONVERT(NVARCHAR(12), @committed_MB) + 'MB).'
        RAISERROR (@ErrorMessage, 10, 1, N'SQL Plan cache eviction');
        DBCC FREESYSTEMCACHE ('SQL Plans') WITH MARK_IN_USE_FOR_REMOVAL
END
ELSE
BEGIN
        SELECT @ErrorMessage = CONVERT(NVARCHAR(50), GETDATE()) + ': Single use Prepared Plans are taking ' 
           + CONVERT(NVARCHAR(12), @singleplan_MB) + 'MB, which is under ' + CONVERT(NVARCHAR(12), @memthreshold_pct) 
           + ' percent of total committed memory (' + CONVERT(NVARCHAR(12), @committed_MB) + 'MB). SQL Plan cache will NOT be evicted.'
        RAISERROR (@ErrorMessage, 10, 1, N'SQL Plan cache eviction');
END;
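
If you want to wrap this in a SQL Agent job, a minimal sketch follows – the job name and the 30-minute schedule are just placeholders of mine, and the script above goes into the @command parameter of the job step; adjust to your environment.

USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Clear single use prepared plans';
EXEC dbo.sp_add_jobstep @job_name = N'Clear single use prepared plans',
    @step_name = N'Check threshold and clear SQL Plans cache',
    @subsystem = N'TSQL',
    @database_name = N'master',
    @command = N'/* paste the script above here */';
EXEC dbo.sp_add_jobschedule @job_name = N'Clear single use prepared plans',
    @name = N'Every 30 minutes',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,       -- units: minutes
    @freq_subday_interval = 30;
EXEC dbo.sp_add_jobserver @job_name = N'Clear single use prepared plans', @server_name = N'(local)';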

NOTE: keep in mind that clearing the cache regularly must not be considered a routine DBA task, and I must stress that the downside is that ALL plans in the selected cache store will be cleared, not only the single use ones. This may mean a higher compilation rate and higher CPU usage while incoming queries that are no longer cached get compiled again.

Kimberly Tripp (Blog | Twitter) has a nice post on this topic here.

Download code here: ClearSingleUse.sql

Until next time!

Disclaimer: I hope that the information on these pages is valuable to you. Your use of the information contained in these pages, however, is at your sole risk. All information on these pages is provided "as-is", without any warranty, whether express or implied, of its accuracy, completeness, fitness for a particular purpose, title or non-infringement, and none of the third-party products or information mentioned in the work are authored, recommended, supported or guaranteed by Ezequiel. Further, Ezequiel shall not be liable for any damages you may sustain by using this information, whether direct, indirect, special, incidental or consequential, even if he has been advised of the possibility of such damages.