My application's search becomes slow at certain times of day, and whenever I update statistics on one particular table, the problem goes away.
So I am thinking of creating a job that runs UPDATE STATISTICS on that table alone every morning by 11 AM. Currently, statistics update automatically using the SQL Server default sampling rate. Is it a bad idea to schedule a job that runs UPDATE STATISTICS WITH FULLSCAN for that one particular table? This happens in a production environment.
SQL Server – Update Statistics with Full Scan for a Single Table
Tags: sql-server, statistics
Best Answer
While you could schedule a job to run `UPDATE STATISTICS` on that particular table, you should also look into parameter sniffing. What's likely happening here is that a bad plan was wiped from the plan cache when you updated statistics on that table. Instead of running a costly `UPDATE STATISTICS`, remove the bad plan from the cache. Once it is wiped from the plan cache, the query will recompile and get a new plan. Whichever parameter values are used at the time of compilation are what the execution plan will be optimized for, as long as you are not overriding that behavior with the `OPTIMIZE FOR` query hint.
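If you do decide to keep the scheduled-job route as a fallback, the job step is a single statement. A minimal sketch, assuming a hypothetical table name `dbo.SearchTable`:

```sql
-- Hypothetical table name; run as a SQL Agent job step before 11 AM.
-- FULLSCAN reads every row of the table, so schedule it in a quiet window.
UPDATE STATISTICS dbo.SearchTable WITH FULLSCAN;
```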
If the search function is a stored procedure, you can use the good ol' `sp_recompile`. Otherwise, use `DBCC FREEPROCCACHE` and pass it the SQL handle or the plan handle. You can use sp_BlitzCache with @ExpertMode = 1 to get the command (it's all the way to the right in the result set).

To help diagnose which plan is causing the issue, I like to use sp_WhoIsActive or sp_BlitzCache. With sp_WhoIsActive, I like to log current activity to a table so I can go back in time, compare execution plans, and see which one was the culprit.
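As a sketch of the cache-clearing approach (the procedure name, search text, and plan handle below are placeholders, not values from this system):

```sql
-- If the search runs inside a stored procedure, mark it for recompile:
EXEC sp_recompile N'dbo.usp_Search';  -- hypothetical procedure name

-- Otherwise, find the plan handle for the offending query...
SELECT qs.plan_handle, st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE N'%search%';  -- placeholder filter for your query text

-- ...and evict just that one plan from the cache:
-- DBCC FREEPROCCACHE (0x06000500A1B2C3D4...);  -- plan handle from the query above
```

Clearing a single plan this way avoids the collateral damage of flushing the whole cache, which `DBCC FREEPROCCACHE` with no argument would do.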
For more information on Parameter Sniffing, see Erland's Slow in the Application, Fast in SSMS? post.