I am generating about 500 records. What I want to do is INSERT them all at the same time because:
- The server is located elsewhere and I would rather send one big payload over.
- This seems like less overhead than sending 500 separate insert queries over the wire.
But because they're inserted as a batch, I don't want to roll back everything if one insert fails – I want to try to insert the rest and record the failure(s).
What's the best way to do this in SQL Server? Is BULK INSERT what I want?
Best Answer
I'm not saying this is the best way, but here is one way.
My solution uses xml, which I think has a max size of 2 GB. Table-valued parameters would have been ideal, but they are not supported in remote calls. Since you are only generating around 500 records, you probably won't run into any size limitations.
On the source instance, create/load a table to play with. This will be the source data to send to the remote process.
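A minimal sketch of that source table, with a few sample rows (the `customer` table name and its columns are illustrative; substitute your real schema):

```sql
-- Hypothetical source table; adjust columns to match your real data.
CREATE TABLE dbo.customer
(
    customer_id   int          NOT NULL PRIMARY KEY,
    customer_name varchar(100) NOT NULL
);

-- Load a few sample rows to play with.
INSERT INTO dbo.customer (customer_id, customer_name)
VALUES (1, 'Acme'), (2, 'Globex'), (3, 'Initech');
```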
Here is some TSQL to run on the source instance, but before you can fully execute it, you'll have to define the receiving stored procedure on the remote server. I'll get to that.
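Something along these lines, assuming a linked server to the remote instance with RPC Out enabled (the linked server name `RemoteServer`, the database name `RemoteDb`, and the procedure name `usp_insert_customers` are all placeholders):

```sql
-- Package the source rows as one xml payload and send it to the
-- remote stored procedure in a single call over the wire.
DECLARE @x xml =
(
    SELECT customer_id, customer_name
    FROM dbo.customer
    FOR XML PATH('customer'), ROOT('customers'), TYPE
);

-- Four-part name call; requires the linked server to allow RPC Out.
EXEC [RemoteServer].[RemoteDb].dbo.usp_insert_customers @x = @x;
```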
Now, on the remote server, create a stored procedure to receive the passed xml. In the following example, I'm using OPENXML to basically shred the xml and insert directly into the remote table. You'd probably need a cursor to help you weed out the good and bad rows. Note that I have also created the customer table on the remote server (this allows me to reference that table in the OPENXML call using the WITH clause).

Note that I'm using WITH customer on the OPENXML. Per the documentation on OPENXML, naming an existing table in the WITH clause tells OPENXML to use that table's schema to describe the rowset format.

At this point, you should be able to successfully execute the source T-SQL to call the remote stored procedure and process the data.
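Putting that together, a sketch of the receiving procedure, assuming the `customer` table exists on the remote server and using a cursor with TRY/CATCH so one bad row doesn't roll back the rest (the procedure name and the `customer_insert_errors` log table are illustrative):

```sql
-- Hypothetical error-log table for rows that fail to insert.
CREATE TABLE dbo.customer_insert_errors
(
    customer_id   int,
    error_message nvarchar(4000),
    logged_at     datetime2 DEFAULT SYSUTCDATETIME()
);
GO

CREATE PROCEDURE dbo.usp_insert_customers
    @x xml
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @idoc int;
    EXEC sp_xml_preparedocument @idoc OUTPUT, @x;

    DECLARE @customer_id int, @customer_name varchar(100);

    -- WITH customer: borrow the customer table's schema to
    -- describe the rowset; flag 2 = element-centric mapping.
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT customer_id, customer_name
        FROM OPENXML(@idoc, '/customers/customer', 2)
        WITH customer;

    OPEN c;
    FETCH NEXT FROM c INTO @customer_id, @customer_name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        BEGIN TRY
            INSERT INTO dbo.customer (customer_id, customer_name)
            VALUES (@customer_id, @customer_name);
        END TRY
        BEGIN CATCH
            -- Record the failure and keep processing the rest.
            INSERT INTO dbo.customer_insert_errors (customer_id, error_message)
            VALUES (@customer_id, ERROR_MESSAGE());
        END CATCH;
        FETCH NEXT FROM c INTO @customer_id, @customer_name;
    END;
    CLOSE c;
    DEALLOCATE c;

    EXEC sp_xml_removedocument @idoc;
END;
```

Each row gets its own implicit transaction here, so a duplicate key or constraint violation on one row is logged rather than aborting the whole batch.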