SQL Server – Optimize a query that only returns rows when the row count is below a limit

sql-server, sql-server-2017

I have a query that first counts the number of rows to be returned and, if the count is below a limit, returns those rows; otherwise it returns nothing.

For context: I have a map that displays some points only after a certain zoom level. However, my users now want the map to display the points as long as there is room on the screen, instead of relying on a predefined minimum zoom level. This means that whenever they move across the map, I'll have two queries instead of one: one to find out how many items there are, and another to actually return them. The load will potentially double. I thought about doing it this way instead:

SELECT p.ACOLUMN, p.ANOTHERCOLUMN, p.LATITUDE, p.LONGITUDE
FROM table p
WHERE p.LATITUDE < @latMax
  AND p.LATITUDE > @latMin
  AND p.LONGITUDE < @lngMax
  AND p.LONGITUDE > @lngMin
  AND (SELECT COUNT(*)
       FROM table
       WHERE LATITUDE < @latMax
         AND LATITUDE > @latMin
         AND LONGITUDE < @lngMax
         AND LONGITUDE > @lngMin) < 200 -- example limit value

Looking at the actual execution plan, I see that SQL Server 2017 is seeking on the same index twice:

Actual execution plan shows the same nonclustered index seek happening twice

It's also, of course, reading the same number of rows in both seeks. Is there a way to eliminate the duplicate seek? Or is there another, more optimal way to do what I'm trying to do?

Best Answer

I didn't really mean a round trip all the way to the application. I consider a stored procedure one form of "application" in this context.

DECLARE @c int;

SELECT @c = COUNT(*) FROM dbo.table WHERE ...;

IF @c < 200
BEGIN
    SELECT cols FROM dbo.table WHERE ...;
END
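
For example, here is a minimal sketch of that pattern as a stored procedure, using the bounding-box predicates from the question (the table name dbo.Points, the decimal parameter types, and the @limit parameter are assumptions on my part):

CREATE OR ALTER PROCEDURE dbo.GetVisiblePoints
    @latMin decimal(9,6),
    @latMax decimal(9,6),
    @lngMin decimal(9,6),
    @lngMax decimal(9,6),
    @limit  int = 200
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @c int;

    -- First pass: count the rows inside the bounding box.
    SELECT @c = COUNT(*)
    FROM dbo.Points
    WHERE LATITUDE  > @latMin AND LATITUDE  < @latMax
      AND LONGITUDE > @lngMin AND LONGITUDE < @lngMax;

    -- Second pass: return the rows only if the count is below the limit.
    IF @c < @limit
    BEGIN
        SELECT ACOLUMN, ANOTHERCOLUMN, LATITUDE, LONGITUDE
        FROM dbo.Points
        WHERE LATITUDE  > @latMin AND LATITUDE  < @latMax
          AND LONGITUDE > @lngMin AND LONGITUDE < @lngMax;
    END
END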

If the application always expects either an empty or non-empty resultset, you can do this:

DECLARE @c int;

SELECT @c = COUNT(*) FROM dbo.table WHERE ...;

SELECT TOP (CASE WHEN @c <= 200 THEN @c ELSE 0 END) cols 
  FROM dbo.table WHERE ...;
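
Note that TOP (0) still returns an empty resultset with the same column metadata, so the application sees the same shape either way; it just gets zero rows whenever the count exceeds the limit.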

Yes, you have to read the table twice in the case where there are fewer than 200 rows (TOP (0) should short-circuit), but I don't see a way to avoid that, unless you maybe dump the rows into a temporary structure first. I suppose that will be useful in cases where your I/O is much faster than your network, but you'd still only be writing 201 rows at most, which is probably still far better than moving a lot more rows across the network:

SELECT TOP (201) cols INTO #f FROM dbo.table WHERE ...;

IF @@ROWCOUNT <= 200
BEGIN
  SELECT cols FROM #f;
END

DROP TABLE #f;
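
The TOP (201) there, one row more than the limit, is what lets the @@ROWCOUNT check distinguish "at most 200 rows" from "more than 200".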

You could also try this, but I'm not sure the results will be any better than your original; it all depends on whether SQL Server sees that it doesn't have to materialize the CTE twice:

;WITH c AS 
(
   SELECT cols, rn = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
   FROM dbo.table WHERE ...
)
SELECT cols FROM c WHERE NOT EXISTS (SELECT 1 FROM c WHERE rn > 200);
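
If you want to verify whether that version ends up reading the table once or twice, one simple way is to compare logical reads (or just check the actual plan again):

SET STATISTICS IO ON;

-- run the CTE query above and compare the reported logical reads
-- against the original double-seek version

SET STATISTICS IO OFF;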