Run sp_BlitzCache as an agent job to assist performance tuning efforts / baselining query performance

execution-plan, performance, performance-tuning, plan-cache, tools

I'm just getting into using various Brent Ozar tools like sp_BlitzCache and sp_BlitzIndex to identify and correct SQL Server performance issues. I've been experimenting with running sp_AskBrent (now sp_BlitzFirst…) and sp_WhoIsActive as agent jobs as a way to baseline query performance.

Do any of you run sp_BlitzCache periodically as an agent job to capture history / provide a baseline and/or evidence for your SQL Server performance tuning efforts? If so:

  1. How specifically has this helped you in your environment?

  2. Which sp_BlitzCache sort_order do you find most helpful to log? I'm considering separate output tables by cpu, reads, duration, and executions per minute (the full set is sketched after the example below), but I'm wondering whether I should be using the "avg X" variants for these sorts.

  3. Should I stick with the default of @top = 10? Or go for 20, 30, etc.?

Here's an example where I'm using total reads and top 20:

EXEC DbaTools.dbo.sp_BlitzCache
    @top = 20,
    @sort_order = 'reads',
    @output_database_name = 'DbaTools',
    @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_reads';
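
For reference, the full set I'm considering would be four calls like the one above, each writing to its own output table. The table names are just my working choices, and 'xpm' as the executions-per-minute sort value is my reading of the documentation, so it may need checking against your build:

-- One output table per sort order ('reads' is shown above);
-- 'xpm' = executions per minute, if the installed version accepts it.
EXEC DbaTools.dbo.sp_BlitzCache
    @top = 20, @sort_order = 'cpu',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_cpu';

EXEC DbaTools.dbo.sp_BlitzCache
    @top = 20, @sort_order = 'duration',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_duration';

EXEC DbaTools.dbo.sp_BlitzCache
    @top = 20, @sort_order = 'xpm',
    @output_database_name = 'DbaTools', @output_schema_name = 'dbo',
    @output_table_name = 'BlitzCache_by_xpm';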

  4. How frequently would you run such a job? Weekly? The plan cache doesn't get wiped out until a host/service restart, an explicit command to do so (DBCC FREEPROCCACHE), or some other, perhaps unexpected, action such as changing MAXDOP (our team learned that last one the hard way).

  5. This is more in line with another StackExchange question I just posted about sp_AskBrent and sp_WhoIsActive, but how resource-intensive is it to run these SPs on a periodic basis?

That other StackExchange question is here:

sp_AskBrent not logging queries when saving to table vs. sp_WhoIsActive

Best Answer

How specifically has this helped you in your environment?

My laptop instances are running great these days!

But seriously, if you lack a monitoring tool, this is a good way to analyze your plan cache for all sorts of problems and tuning opportunities. We warn you about a lot of stuff that may not be obvious just from looking at an execution plan.

Which sp_BlitzCache sort_order do you find most helpful to log? I'm considering separate output tables by cpu, reads, duration, and executions per minute, but I'm wondering whether I should be using the "avg X" variants for these sorts.

You may find some overlap in the different sort orders. For instance, CPU and Reads often have a lot of overlap.

As for average vs. total, that mostly depends on plan cache stability. If the cache doesn't change much, averages can be more interesting than totals. Sometimes small, not necessarily tunable queries rack up a lot of total CPU time simply because they run frequently, and maintenance tasks do the same. This is something you'll have to figure out for each environment.
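
For example, an average-based capture could look like the sketch below. It assumes a recent First Responder Kit build, where 'avg cpu' is an accepted sort order and the parameter names are PascalCase (older releases used snake_case names like @sort_order, as in your question):

-- Sort by average CPU per execution instead of total CPU;
-- the output table name is just an illustrative choice.
EXEC dbo.sp_BlitzCache
    @Top = 10,
    @SortOrder = 'avg cpu',
    @OutputDatabaseName = 'DbaTools',
    @OutputSchemaName = 'dbo',
    @OutputTableName = 'BlitzCache_by_avg_cpu';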

Should I stick with the default of @top = 10? Or go for 20, 30, etc.?

This is also subjective, but it ties into another of your questions: the more plans you choose to bring in for analysis, the more resource-intensive the run is. Analyzing XML in SQL Server is... challenging.

The default is 10 to keep the footprint small while still giving you some meaningful results back. Of course, if you're not finding anything of interest in the top 10, feel free to experiment with higher values, keeping an eye on how long the run takes and whether the extra plans actually surface anything useful.
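
If you want to quantify that, a quick sketch is to compare runs at different @Top values with timing statistics on and read the CPU and elapsed times from the Messages tab:

-- Compare the analysis cost of two @Top values on your own server.
SET STATISTICS TIME ON;

EXEC dbo.sp_BlitzCache @Top = 10, @SortOrder = 'cpu';
EXEC dbo.sp_BlitzCache @Top = 30, @SortOrder = 'cpu';

SET STATISTICS TIME OFF;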

How frequently would you run such a job? Weekly? The plan cache doesn't get wiped out until a host/service restart, an explicit command to do so (DBCC FREEPROCCACHE), or some other, perhaps unexpected, action such as changing MAXDOP (our team learned that last one the hard way).

This depends a bit on plan cache stability. If the same stored procedures run over and over again, daily or weekly is fine. If your server gets a lot of ad hoc user queries pushed through, you may want to run it hourly, or at some other interval your server can handle.
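
If it helps, here's a minimal sketch of wiring that up as an agent job with the standard msdb procedures, assuming a daily 6 AM schedule; the job name is hypothetical, and the sp_BlitzCache parameter names follow recent First Responder Kit builds:

USE msdb;
GO

-- Hypothetical job name; one step that logs the top queries by reads.
EXEC dbo.sp_add_job @job_name = N'DBA - Log sp_BlitzCache';

EXEC dbo.sp_add_jobstep
    @job_name = N'DBA - Log sp_BlitzCache',
    @step_name = N'Capture plan cache by reads',
    @subsystem = N'TSQL',
    @database_name = N'DbaTools',
    @command = N'EXEC dbo.sp_BlitzCache
        @Top = 20,
        @SortOrder = ''reads'',
        @OutputDatabaseName = ''DbaTools'',
        @OutputSchemaName = ''dbo'',
        @OutputTableName = ''BlitzCache_by_reads'';';

-- freq_type 4 = daily; 060000 = 6:00 AM.
EXEC dbo.sp_add_jobschedule
    @job_name = N'DBA - Log sp_BlitzCache',
    @name = N'Daily 6 AM',
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 060000;

-- Attach the job to this server so it actually runs.
EXEC dbo.sp_add_jobserver @job_name = N'DBA - Log sp_BlitzCache';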

You can also use the @IgnoreQueryHashes parameter to keep some of the noise out.
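
For example (the hashes below are placeholders; substitute the Query Hash values of known noisy queries from earlier sp_BlitzCache runs):

-- Skip known noisy queries; both hashes here are made-up placeholders.
EXEC dbo.sp_BlitzCache
    @SortOrder = 'avg cpu',
    @IgnoreQueryHashes = '0x1AB614B461F4D769,0x9A36BE1C35A1E345';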

This is more in line with another StackExchange question I just posted about sp_AskBrent and sp_WhoIsActive, but how resource-intensive is it to run these SPs on a periodic basis?

We aim to be as resource un-intensive as possible, but the number of plans you choose to analyze, and the size of those plans, can cause spikes. Like I said, analyzing XML in SQL Server ain't fun: it's single-threaded and can use a fair amount of CPU. You can use the @DatabaseName parameter if you're only interested in what one database is doing, which can help focus tuning efforts.
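
For instance (the database name here is hypothetical):

-- Analyze only the plans that belong to a single database.
EXEC dbo.sp_BlitzCache
    @DatabaseName = 'SalesDb',
    @SortOrder = 'reads';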

Thanks!