I've got a series of tables with lots of high-precision data collected from various devices. The intervals at which the data were collected vary and even wander over the time series. My users want to pick a date range and get the average / min / max of these variables at a specific frequency. This is my second stab at this, and it works, but I wonder if there is a better/faster way to accomplish it?
declare @start datetime;
declare @end datetime;
set @start = '3/1/2012';
set @end = '3/3/2012';

declare @interval int;
set @interval = 300;

declare @tpart table( dt datetime );

with CTE_TimeTable as
(
    select @start as [date]
    union all
    select dateadd(ss, @interval, [date])
    from CTE_TimeTable
    where dateadd(ss, @interval, [date]) <= @end
)
insert into @tpart
select [date]
from CTE_TimeTable
option (MAXRECURSION 0);

select t.dt,
       avg(c.x1), min(c.x1), max(c.x1),
       avg(c.x2), min(c.x2), max(c.x2)
from clean.data c
inner join @tpart t
    on abs(datediff(ss, t.dt, c.[Date])) <= @interval / 2
where c.[Date] >= @start
  and c.[Date] <= @end
group by t.dt;
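One thing I tried as a side experiment (a sketch only, assuming SQL Server 2005+; table and variable names match the query above) was building the time grid with a set-based number generator instead of the row-by-row recursive CTE, which also avoids the MAXRECURSION hint:

declare @start datetime;
declare @end datetime;
set @start = '3/1/2012';
set @end = '3/3/2012';

declare @interval int;
set @interval = 300;

-- Declaring dt as the primary key gives the optimizer an index on the grid.
declare @tpart table( dt datetime primary key );

-- Generate sequential integers by cross-joining system catalog rows,
-- then project each integer onto the @interval-second time grid.
insert into @tpart (dt)
select dateadd(ss, @interval * (n - 1), @start)
from (
    select row_number() over (order by (select null)) as n
    from sys.all_objects a cross join sys.all_objects b
) nums
where n <= datediff(ss, @start, @end) / @interval + 1;

For the 3-day window above this produces the same 577 grid points, but grid generation was never the expensive part, so the join is still the bottleneck.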
Right now, over the 32,721 rows in this 3-day period, the query takes about 43 seconds and returns the 577 rows I expect, but I'd like it to run faster. The big hit comes from the nested loop the plan uses for the inner join.
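For comparison, here is a sketch of an approach I've been considering that avoids the time-table join entirely (hypothetical; assumes the same clean.data(Date, x1, x2) columns): since the grid is evenly spaced, each reading's nearest bucket can be computed arithmetically with integer division, so the engine can do a single pass and a GROUP BY instead of a range join. Integer division truncates, so adding @interval / 2 before dividing rounds to the nearest grid point, matching the abs(datediff(...)) <= @interval / 2 predicate (up to ties exactly on a bucket boundary, which the original query counts in two buckets).

declare @start datetime;
declare @end datetime;
set @start = '3/1/2012';
set @end = '3/3/2012';

declare @interval int;
set @interval = 300;

select dateadd(ss, d.bucket * @interval, @start) as dt,
       avg(d.x1), min(d.x1), max(d.x1),
       avg(d.x2), min(d.x2), max(d.x2)
from (
    -- Map each reading to its nearest bucket number on the grid.
    select (datediff(ss, @start, [Date]) + @interval / 2) / @interval as bucket,
           x1, x2
    from clean.data
    where [Date] >= @start
      and [Date] <= @end
) d
group by d.bucket;

Note this only emits buckets that actually contain readings; if empty intervals must appear in the output, the result would still need a left join back to a generated time table.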