Since SQL Server 2016, the default auto update statistics threshold has changed: it now uses a smaller threshold that depends on the table size. In our application we periodically update statistics with fullscan. With the new behavior, auto update statistics fires after a smaller modification rate and uses only a small sample rate. Until our maintenance job updates statistics with fullscan again, the auto statistics update runs and overwrites the fullscan results. This can cause performance problems in our application.
Is there any way to get the old auto update statistics behavior without going back to a lower compatibility level?
Sql-server – Auto Update Statistics threshold SQL Server 2016
performance sql-server statistics
Best Answer
This is a side effect of updating statistics with fullscan on a weekly basis: it becomes a guessing game as to when your statistics will be automatically updated.
As mentioned in the comments, and assuming you are on SQL Server 2016 SP1 CU4 or above, setting a persisted sample rate can help for the tables that you update `WITH FULLSCAN` on a weekly basis. You could also work with a sample rate lower than fullscan but still higher than the default sample rate for these tables, to reduce the time the update takes.
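A minimal sketch of what the weekly maintenance statement could look like, assuming a hypothetical table `dbo.BigTable` (`PERSIST_SAMPLE_PERCENT` requires SQL Server 2016 SP1 CU4 or later):

```sql
-- Weekly maintenance: full scan, with the sample rate persisted
-- so that later automatic updates reuse it.
UPDATE STATISTICS dbo.BigTable
WITH FULLSCAN, PERSIST_SAMPLE_PERCENT = ON;

-- Alternative: a fixed sample rate below fullscan but still
-- above the default, also persisted.
UPDATE STATISTICS dbo.BigTable
WITH SAMPLE 50 PERCENT, PERSIST_SAMPLE_PERCENT = ON;
```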
Another setting that can help, since auto update statistics takes longer when a higher sample rate is used, is enabling asynchronous statistics updates (`AUTO_UPDATE_STATISTICS_ASYNC`). With asynchronous stat updates, queries do not wait for the statistics update when enough changes have been made; they compile against the old statistics, which still have the high sample rate. The caveat is that the lowest level at which asynchronous statistics updates can be enabled is the entire database, so it would be wise to test this setting in development first.
To enable asynchronous statistic updates:
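A hedged sketch, assuming a database named `YourDatabase`:

```sql
-- Database-level option: queries keep using the existing
-- statistics while the update runs in the background.
ALTER DATABASE [YourDatabase]
SET AUTO_UPDATE_STATISTICS_ASYNC ON;

-- AUTO_UPDATE_STATISTICS must also be ON for the
-- asynchronous option to take effect.
ALTER DATABASE [YourDatabase]
SET AUTO_UPDATE_STATISTICS ON;
```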
Remember that when you truncate the table, the sample rate is no longer persisted, and that newly created indexes and their corresponding statistics will use the default sample rate.
Another idea would be to update statistics more often for the tables that receive the most changes, if your server has the resources to do so.
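To see which statistics accumulate the most changes between maintenance runs, a query along these lines (using `sys.dm_db_stats_properties`) can help identify candidates for more frequent updates:

```sql
-- Statistics with the highest modification counters since their
-- last update: candidates for a more frequent fullscan refresh.
SELECT  OBJECT_NAME(s.object_id)  AS table_name,
        s.name                    AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.modification_counter
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY sp.modification_counter DESC;
```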