I am exporting a table to a TSV file using sqlcmd, and I am running into issues. The table has 16+ million rows and about 55 columns.
The problem is that it does not export the full table, but seems to stop randomly at various points (I am guessing a timeout?). Each time a different number of rows is exported, and each time the file is a slightly different size (indicating that I am not hitting a row or size limit).
I am not using any timeout switch (meaning the default of "as long as it takes" is used).
Here is my command (with most columns removed for simplification and illustration purposes):
sqlcmd -U myUsername -P myPassword -S SERVERNAME -d "DBNAME" -Q "SELECT ROW_KEY, ISNULL(CARRIER_ID, ''),ISNULL(CONVERT(VARCHAR(19),ORDER_DATETIME,120),''),ISNULL(HEIGHT, ''),ISNULL(RESIDENTIAL_CHARGE, '') FROM MYTABLE" -o "OUTPUT_FILE.txt" -h-1 -s"|" -W
I wonder if it could have something to do with timeouts or the use of ISNULL() on all of the columns (although when I run the query in SQL Server Management Studio, I get the correct number of rows returned, i.e. 16 million+)?
Again, I get about 4-8 million rows each time, but never the full amount. The database is SQL Server 2005, and I am running sqlcmd from a remote machine with SQL Server 2008.
Best Answer
You'll probably want to do this with bcp, which is designed for bulk export and doesn't have sqlcmd's output-formatting quirks. It would be something like this:
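A sketch using the server, credentials, and query from the question. The queryout mode exports the result of an arbitrary query; the dbo schema qualifier and the use of the question's pipe delimiter are assumptions:

```shell
# -c : character (text) mode output
# -t : field terminator (matching the -s"|" from the sqlcmd command)
# queryout : export the result set of the given query to a file
bcp "SELECT ROW_KEY, ISNULL(CARRIER_ID, ''), ISNULL(CONVERT(VARCHAR(19), ORDER_DATETIME, 120), ''), ISNULL(HEIGHT, ''), ISNULL(RESIDENTIAL_CHARGE, '') FROM DBNAME.dbo.MYTABLE" queryout "OUTPUT_FILE.txt" -c -t "|" -S SERVERNAME -U myUsername -P myPassword
```

Like sqlcmd with -h-1, bcp writes no header row, so the output shape should match what the original command produced.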
Replace -c with -w if you're dealing with nvarchar/nchar, and want Unicode output.