SQL Server – How to Prove T-SQL is Bad for Performance

performance, performance-tuning, query-performance, sql-server, stored-procedures, t-sql

I have recently inherited a codebase with a large number of Stored Procedures. The system they support is encountering numerous performance problems which I am looking into.

A number of the Stored Procedures have a pattern like this:

  1. Create Temp table
  2. Build up dynamic SQL query to insert a bunch of records, e.g.

    DECLARE @sql VARCHAR(MAX)
    SET @sql = 'INSERT INTO @tempTable
    SELECT SomeColumn, SomeColumn2, SomeColumn3, etc FROM MyTable
    WHERE 1 = 1'
    
    IF @someParam = [SomeValue]
        SET @sql = @sql + ' AND SomeColumn = [SomeValue]';
    
    IF @someOtherParam = [SomeOtherValue]
        SET @sql = @sql + ' AND SomeOtherColumn = [SomeOtherValue]';
    
  3. Execute this dynamic SQL

    EXEC(@sql);
    
  4. Select from the temp table and bring in a bunch of additional information to return to the client.

    SELECT
    
       ...
    
    FROM
    
       @tempTable
       INNER JOIN ...
    

My immediate thoughts are:

  • There is dynamic SQL, so no cached plans, meaning a plan is generated every time.
  • There is an INSERT SELECT pattern, so table locking is more likely to be an issue.

I have re-written some of the Stored Procedures in this way instead:

    SELECT 

       ...

    FROM

        MyTable 
        INNER JOIN ...

    WHERE

    (
        @someParam != SomeValue
        OR
        SomeColumn = SomeValue
    )
    AND
    (
        @someOtherParam != SomeOtherValue
        OR
        SomeOtherColumn = SomeOtherValue
    )        

From comparing execution plans and client statistics in SQL Server Management Studio, I have not managed to speed the Stored Procedures up, so I am apprehensive about suggesting wholesale re-writes of them all.

I am trying to set up some profiling of a live customer scenario, but as yet I have been unable to prove any of this.

Can anyone offer any confirmation of the theory behind my thoughts, or any better ways of proving my suspicions?

The problem is that I have read dynamic SQL is not always a clear-cut case, i.e. it depends on how it is used. My understanding of locking also falls down because I cannot find definitive confirmation anywhere of how this type of INSERT SELECT will lock the tables involved.

Best Answer

There is dynamic SQL, so no cached plans, meaning a plan is generated every time

Not necessarily true. Dynamic SQL can (and does) use cached plans just as well as static SQL. For dynamic search conditions, resolving to dynamic SQL is often the right answer. See Dynamic Search Conditions in T-SQL for more details.
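As an illustration of what that article is getting at, here is a minimal sketch of the parameterised form, using the placeholder names from the question. It assumes the temp table is a #tempTable created before this point (a table variable declared outside the dynamic batch would not be visible inside it), that the parameters are INTs, and that NULL means "do not filter". Because the values travel as real parameters to sp_executesql rather than being concatenated into the string, each variant of the statement text gets a cached, reusable plan:

    DECLARE @sql NVARCHAR(MAX) = N'
        INSERT INTO #tempTable
        SELECT SomeColumn, SomeColumn2, SomeColumn3
        FROM MyTable
        WHERE 1 = 1';
    
    -- Only the filters that are actually needed end up in the statement text...
    IF @someParam IS NOT NULL
        SET @sql = @sql + N' AND SomeColumn = @someParam';
    
    IF @someOtherParam IS NOT NULL
        SET @sql = @sql + N' AND SomeOtherColumn = @someOtherParam';
    
    -- ...while the values are passed as real parameters, so each variant of the
    -- text gets its own cached plan and the values cannot inject anything.
    EXEC sp_executesql
        @sql,
        N'@someParam INT, @someOtherParam INT',
        @someParam = @someParam,
        @someOtherParam = @someOtherParam;

Each combination of supplied parameters produces its own statement text, and each of those texts gets a cached plan tailored to the filters it actually contains.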

There is an INSERT SELECT pattern, so table locking is more likely to be an issue.

Not necessarily true, especially with a @tempTable. A temp table or table variable is private to the session that created it, so the INSERT side cannot block anybody else, and under the default READ COMMITTED isolation level the SELECT side takes only short-lived shared locks on MyTable.
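If you want to confirm the locking behaviour for yourself rather than rely on anyone's word, sys.dm_tran_locks shows what a session is holding. A rough sketch against the question's placeholder names (the column types are made up, and HOLDLOCK is there purely so the shared locks survive until you can look at them; without it they are released as the rows are read):

    -- Placeholder temp table; the real column types will differ.
    CREATE TABLE #tempTable (SomeColumn INT, SomeColumn2 INT, SomeColumn3 INT);
    
    BEGIN TRAN;
    
    -- The INSERT ... SELECT under test. HOLDLOCK keeps the shared locks on
    -- MyTable until the end of the transaction so they can be inspected.
    INSERT INTO #tempTable
    SELECT SomeColumn, SomeColumn2, SomeColumn3
    FROM MyTable WITH (HOLDLOCK);
    
    -- What is this session actually holding right now?
    SELECT resource_type, request_mode, request_status, COUNT(*) AS locks
    FROM sys.dm_tran_locks
    WHERE request_session_id = @@SPID
    GROUP BY resource_type, request_mode, request_status;
    
    ROLLBACK;
    DROP TABLE #tempTable;

Typically that shows shared locks against MyTable and exclusive locks only against the temp object in tempdb, which no other session can see anyway.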

I have re-written some of the Stored Procedures in this way instead

Using multiple OR conditions like that is a well-known anti-pattern (the "catch-all" or "kitchen sink" query). You are forcing the query optimizer to come up with a single plan that works for any value of all those parameters, and usually the only plan that is correct for every combination is a scan that ignores any useful index. The original code was better.
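If a single static statement is really wanted, the usual compromise (also discussed in the article linked above) is OPTION (RECOMPILE): you pay a compilation on every execution, but the plan is built for the actual parameter values, so the branches that are always true for that call are optimised away. A sketch with the question's placeholder names, again assuming NULL means "no filter":

    SELECT SomeColumn, SomeColumn2, SomeColumn3
    FROM MyTable
    WHERE (@someParam IS NULL OR SomeColumn = @someParam)
      AND (@someOtherParam IS NULL OR SomeOtherColumn = @someOtherParam)
    OPTION (RECOMPILE);  -- fresh plan per execution, built with the actual parameter values

Whether the extra compilations are acceptable depends on how often the procedure runs, which is exactly what the measurements below will tell you.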

Can anyone offer ... any better ways of proving my suspicions?

Yes. Measure. Use a methodology like Waits and Queues. Don't rely on your intuition. Find the actual bottlenecks and address them accordingly.
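As a concrete starting point, the aggregated wait statistics show what the instance as a whole has been waiting on since the last restart. A rough sketch (the list of benign wait types to exclude is deliberately short here and far from exhaustive):

    -- Top waits accumulated since the instance started (or since the stats were cleared).
    SELECT TOP (10)
        wait_type,
        wait_time_ms / 1000.0 AS wait_time_s,
        signal_wait_time_ms / 1000.0 AS signal_wait_s,
        waiting_tasks_count
    FROM sys.dm_os_wait_stats
    WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'SQLTRACE_BUFFER_FLUSH',
                            N'WAITFOR', N'CHECKPOINT_QUEUE', N'XE_TIMER_EVENT',
                            N'BROKER_TO_FLUSH', N'BROKER_TASK_STOP')
    ORDER BY wait_time_ms DESC;

If the top waits turn out to be lock waits, you have evidence for the blocking theory; if they are mostly I/O or CPU, the rewrites should be aimed elsewhere.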