I have a large table, about 750M rows.
I have an application that uses Entity Framework to load and parse data; a query that returns 2 million rows takes 4 seconds. The exact same query takes 8 seconds in SQL Server Management Studio.
I noticed this happens with other queries too (they all return large result sets). Both connections use TCP/IP.
When I include client statistics, I see this line (no idea what it means):
Client processing time 2539 2539.0000
Why would an SSMS query be slower?
Best Answer
Client processing time is the amount of time, in milliseconds, that elapses between the client receiving the first response packet and receiving the last response packet. In your statistics that accounts for roughly 2.5 seconds of the SSMS run.
When you run the query in SSMS, it renders the data row by row into the results grid. Your application doesn't do that.
Setting the SSMS option Discard results after execution (Query > Query Options... > Results > Grid) eliminates the time spent rendering the results to the grid, which should give you a comparable execution time:
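The effect is easy to reproduce outside SQL Server. This is a minimal Python sketch (synthetic data, not your actual query or client) contrasting merely consuming rows, as your application does, with formatting every row for display, as the SSMS grid does; the row count and column layout are arbitrary assumptions for illustration:

```python
import time

ROWS = 200_000  # stand-in for a large result set
rows = [(i, f"name-{i}", i * 0.5) for i in range(ROWS)]

# App-style: read each row and move on.
start = time.perf_counter()
for row in rows:
    pass
fetch_time = time.perf_counter() - start

# Grid-style: additionally format every row into display text.
start = time.perf_counter()
rendered = [f"{r[0]}\t{r[1]}\t{r[2]:.2f}" for r in rows]
render_time = time.perf_counter() - start

print(f"consume only: {fetch_time:.4f}s, consume + render: {render_time:.4f}s")
```

On any machine the render pass takes noticeably longer than the plain read loop, which is the same asymmetry the client statistics are showing: the server did equal work for both clients, but SSMS paid extra to display the rows.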