SQL Server – How to send (and update on duplicate) a big time series to SQL Server

bulk, sql-server

I use SQL Server 2008.
I would like to send a big time series (more than 1,000 data points, and eventually much more than that) and update any rows whose datetime index already exists (update on duplicate).
What is the best way to do so?
I have been advised to use

MERGE

and

BULK INSERT

but,

  1. Can the two be used at the same time?
  2. I don't want to save my data to a file first. It is already in memory, so why should I have to write it out to a file? Besides, several threads can call this method at the same time, which would get messy with files.

Any advice welcome

Best Answer

If the data is in memory, you can use SqlBulkCopy in .NET (or a similar client-side bulk API) to send the data to SQL Server. There is no need to instantiate a file.
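
For example, here is a minimal sketch of the client-side load, assuming the series sits in a DataTable and that a staging table dbo.TimeSeriesStaging(ts DATETIME, value FLOAT) already exists; the table and column names are made up for illustration:

    using System.Data;
    using System.Data.SqlClient;

    static class StagingLoader
    {
        // Stream an in-memory DataTable straight into the staging table;
        // no intermediate file is ever created.
        public static void Load(string connectionString, DataTable points)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.TimeSeriesStaging";
                    bulk.ColumnMappings.Add("ts", "ts");       // DataTable column -> table column
                    bulk.ColumnMappings.Add("value", "value");
                    bulk.WriteToServer(points);
                }
            }
        }
    }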

Load a staging table in SQL Server first, then use MERGE from that staging table into the actual table.
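
The server-side step can then be a single MERGE, roughly like this (dbo.TimeSeries is again a hypothetical target table keyed on ts; the method reuses an already-open connection, and the trailing TRUNCATE simply empties the staging table for the next batch):

    using System.Data.SqlClient;

    static class StagingMerger
    {
        // Upsert staged rows into the final table, keyed on the datetime index,
        // then empty the staging table for the next batch.
        public static void Merge(SqlConnection conn)
        {
            const string sql = @"
                MERGE dbo.TimeSeries AS tgt
                USING dbo.TimeSeriesStaging AS src
                    ON tgt.ts = src.ts
                WHEN MATCHED THEN
                    UPDATE SET tgt.value = src.value
                WHEN NOT MATCHED THEN
                    INSERT (ts, value) VALUES (src.ts, src.value);
                TRUNCATE TABLE dbo.TimeSeriesStaging;";

            using (var cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }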

If you don't want a persistent staging table, create a #temp table and use that in the subsequent MERGE. I'm not sure how useful table-valued parameters would be here.
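
Here is a sketch of the #temp variant (same hypothetical names as above). Since a #temp table is private to the session that created it, every thread can run this whole pattern on its own connection without interfering with the others, which also answers the concurrency worry in the question:

    using System.Data;
    using System.Data.SqlClient;

    static class TimeSeriesUpsert
    {
        // All three steps must share one connection: a #temp table is dropped
        // when the session that created it ends.
        public static void Upsert(string connectionString, DataTable points)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                using (var create = new SqlCommand(
                    "CREATE TABLE #ts_stage (ts DATETIME PRIMARY KEY, value FLOAT);", conn))
                    create.ExecuteNonQuery();

                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "#ts_stage";
                    bulk.WriteToServer(points); // DataTable columns assumed in (ts, value) order
                }

                using (var merge = new SqlCommand(@"
                    MERGE dbo.TimeSeries AS tgt
                    USING #ts_stage AS src ON tgt.ts = src.ts
                    WHEN MATCHED THEN UPDATE SET tgt.value = src.value
                    WHEN NOT MATCHED THEN INSERT (ts, value) VALUES (src.ts, src.value);", conn))
                    merge.ExecuteNonQuery();
            } // closing the connection drops #ts_stage automatically
        }
    }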

If you want to load the final table directly, you can MERGE straight from a file using OPENROWSET(BULK…), but that brings back the file you wanted to avoid.
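
For completeness, that file-based variant would look roughly like this; the data file path and the bcp-style format file (which would describe the ts and value columns) are placeholders:

    using System.Data.SqlClient;

    static class FileMerger
    {
        // Direct upsert from a server-side data file; OPENROWSET(BULK ...)
        // needs the file plus a format file describing its columns.
        public static void MergeFromFile(SqlConnection conn)
        {
            const string sql = @"
                MERGE dbo.TimeSeries AS tgt
                USING OPENROWSET(
                    BULK 'C:\data\points.dat',
                    FORMATFILE = 'C:\data\points.fmt') AS src
                    ON tgt.ts = src.ts
                WHEN MATCHED THEN UPDATE SET tgt.value = src.value
                WHEN NOT MATCHED THEN INSERT (ts, value) VALUES (src.ts, src.value);";

            using (var cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }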

In summary: there is no direct "in client memory" to final table upsert; you need an intermediate table or file.