I have a 400 KB XML document that I'm trying to insert into an 86-column SQL Server 2008 table.
It takes about 35 minutes to get that XML into a temp table using this code:
WITH XMLNAMESPACES ('urn:xpaykj-report-xml-1.0' AS xkj)
SELECT
    xItem.value('(xkj:prop[@idx=1])[1]', 'date') AS [InputDate]
    -- and then 85 other columns --
INTO #loadtemp
FROM @xmldoc.nodes('xkj:output-data/xkj:childs/xkj:child[@name="output"]/xkj:childs/xkj:child[@name="lvla"]/xkj:childs/xkj:child[@name="lvlb"]/xkj:childs/xkj:child[@name="lvlc"]/xkj:properties') AS x(xItem)
I need to optimize something, but I'm not sure what. Can anyone help me with ideas?
Thanks.
Best Answer
Can you stage it twice? First, shred the document into rows of XML, one row per xkj:properties node, so the expensive deep path is walked only once. Then apply your value() calls to that staged table to shred each row into its 86 columns.
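A rough sketch of the two-stage approach, using the path and the first column from the question (the staging table name #rowtemp is illustrative):

```sql
-- Stage 1: walk the deep path once, keeping each properties node
-- as its own small XML fragment.
WITH XMLNAMESPACES ('urn:xpaykj-report-xml-1.0' AS xkj)
SELECT x.xItem.query('.') AS PropsXml
INTO #rowtemp
FROM @xmldoc.nodes('xkj:output-data/xkj:childs/xkj:child[@name="output"]
                    /xkj:childs/xkj:child[@name="lvla"]
                    /xkj:childs/xkj:child[@name="lvlb"]
                    /xkj:childs/xkj:child[@name="lvlc"]
                    /xkj:properties') AS x(xItem);

-- Stage 2: shred the 86 columns from the small per-row fragments,
-- which no longer requires navigating the full document.
WITH XMLNAMESPACES ('urn:xpaykj-report-xml-1.0' AS xkj)
SELECT
    PropsXml.value('(xkj:properties/xkj:prop[@idx=1])[1]', 'date') AS [InputDate]
    -- and then 85 other columns --
INTO #loadtemp
FROM #rowtemp;
```

The idea is that the optimizer evaluates the long path expression once per properties node instead of once per column, which is often where the time goes with wide shreds like this.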