Should Query Tuning be Proactive or Reactive?

Tags: performance, performance-testing, query

As a software developer and an aspiring DBA, I try to incorporate best practices when I design my SQL Server databases (99% of the time my software sits on top of SQL Server). I make the best possible design prior to and during development.

But, just like any other software developer, I face added functionality, bugs, and plain changes of requirements that demand altering or creating database objects.

My question is: should query tuning be proactive or reactive? In other words, a few weeks after some heavy code/database modification, should I set aside a day to check query performance and tune based on what I find, even if everything seems to be running okay?

Or should I just treat below-average performance as the cue for a database check and a trip back to the proverbial chalkboard?

Query tuning can take up a lot of time, and depending on the initial database design it may yield minimal benefit. I'm curious about the accepted modus operandi.

Best Answer

Both, but mostly proactive

It's important to test during development against realistic volumes and quality of data. It's unbelievably common for a query to run fine on a developer's 100 or 1,000 rows, then fall flat against 10 million production rows.
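As a rough illustration of what "realistic volumes" can mean in practice, the sketch below inflates a hypothetical test table to production-like row counts before measuring anything. The table name, columns, and row counts are all made up for the example; substitute your own schema.

```sql
-- Purely illustrative: dbo.OrdersTest, its columns, and the row counts are hypothetical.
CREATE TABLE dbo.OrdersTest
(
    OrderId    int          NOT NULL PRIMARY KEY,
    CustomerId int          NOT NULL,
    OrderDate  datetime2(0) NOT NULL
);

-- Generate ~10 million rows so the optimiser faces production-like volumes.
WITH n AS
(
    SELECT TOP (10000000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
    CROSS JOIN sys.all_objects AS c
)
INSERT INTO dbo.OrdersTest (OrderId, CustomerId, OrderDate)
SELECT i,
       (i % 50000) + 1,                             -- spread orders over ~50,000 customers
       DATEADD(DAY, -(i % 1460), SYSDATETIME())     -- spread dates over roughly four years
FROM n;

-- Then measure the real queries against this volume.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
SELECT COUNT(*) FROM dbo.OrdersTest WHERE CustomerId = 42;
```

A query that looked instant against a handful of rows will show its true I/O and CPU cost here, which is exactly the point of testing early.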

It also lets you make notes like "an index may help here", "revisit me", or "will fix with new feature xxx in the next DB version".

However, a few queries won't stand the test of time. Data distribution changes, or performance goes exponential because the optimiser decides to use a different join type. In these cases, you can only react.

That said, for SQL Server at least, the various "missing index" and "longest query" DMV queries can indicate problem areas before the phone call comes.
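For reference, these are roughly the sort of DMV queries meant here: one reads the missing-index DMVs, the other pulls the longest-running statements from the plan cache. The TOP counts and ordering are a matter of taste, not a prescription.

```sql
-- Top "missing index" suggestions, weighted by seeks and estimated impact.
SELECT TOP (20)
       mid.statement AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns,
       migs.user_seeks,
       migs.avg_user_impact
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;

-- Longest-running cached statements by total elapsed time (microseconds).
SELECT TOP (20)
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;
```

Note that these DMVs only reflect what's in the plan cache since the last restart, so treat them as hints rather than a complete picture.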

Edit: to clarify...

Proactive doesn't mean tuning every query now. It means tuning what you need to (the frequently run queries) to a reasonable response time. Mostly ignore the weekly 3 am Sunday report queries.
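To find those frequently run queries, something like the following (again, only a sketch) orders the plan cache by execution count rather than by elapsed time.

```sql
-- Illustrative: most frequently executed cached queries, the ones worth tuning proactively.
SELECT TOP (20)
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.execution_count DESC;
```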