SQL Server – Import Data from a 48 GB CSV File to SQL Server

csv, import, sql-server-2014

I am importing a huge data file of around 48 GB using the SQL Server default Import Tool. It keeps executing fine for approximately 13,000,000 row insertions, but after that the task fails with the following error.
I can't open the CSV because it is so huge, nor can I step through it row by row and analyze the stats.
I am really confused about how to handle this.

[Screenshot: SQL Server error details]

Best Answer

You can use PowerShell to quickly import a large CSV into SQL Server. See this script: High-Performance Techniques for Importing CSV to SQL Server using PowerShell, by Chrissy LeMaire (author of dbatools).

Below are the benchmarks achieved:

  • 5.35 million rows a minute for non-indexed tables, and
  • 4.35 million rows a minute for tables with clustered indexes.

The script also batches your import into 50K-row chunks, so it does not hog memory during the import.
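
For context, here is a minimal sketch of the SqlBulkCopy streaming approach that the linked script is built around (this is not the script itself). It assumes Windows PowerShell, a plain comma-delimited file with a header row, and placeholder server/database/table/file names; the naive Split(',') will not handle quoted fields that contain commas.

```powershell
# Minimal sketch: stream a large CSV into SQL Server with SqlBulkCopy.
# Server, database, table and file names below are placeholders.
$connString = "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True"
$csvPath    = "C:\data\bigfile.csv"
$tableName  = "dbo.StagingTable"
$batchSize  = 50000   # flush in 50K-row batches so memory stays flat

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connString, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkCopy.DestinationTableName = $tableName
$bulkCopy.BatchSize       = $batchSize
$bulkCopy.BulkCopyTimeout = 0          # no timeout for a long-running load

# Build a DataTable whose columns match the CSV header row
$reader  = New-Object System.IO.StreamReader($csvPath)
$columns = $reader.ReadLine().Split(',')
$dt = New-Object System.Data.DataTable
foreach ($col in $columns) { [void]$dt.Columns.Add($col) }

$i = 0
while (($line = $reader.ReadLine()) -ne $null) {
    # Naive split: does not handle quoted fields containing commas
    [void]$dt.Rows.Add($line.Split(','))
    $i++
    if ($i % $batchSize -eq 0) {
        $bulkCopy.WriteToServer($dt)   # send the current batch
        $dt.Clear()                    # release rows already sent
    }
}
if ($dt.Rows.Count -gt 0) { $bulkCopy.WriteToServer($dt) }   # final partial batch

$reader.Close()
$bulkCopy.Close()
```

The TableLock option and the per-batch flushes are what keep the load fast while holding only one batch of rows in memory at a time.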

Edit: on the SQL Server side (a sketch of these settings follows the list):

  • Make sure the proper recovery model is set; for such a load, bulk-logged recovery is the best setting.
  • Set autogrowth to a fixed size in MB, not a percentage.
  • Provision both data and log files large enough that autogrowth events are minimal.
  • Do the import during non-peak hours.
  • Follow the best practices described in the Data Loading Performance Guide and/or my answer.
  • Make sure Instant File Initialization is enabled and the power plan is set to High Performance.
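
If you prefer to script those settings rather than click through SSMS, a rough sketch using Invoke-Sqlcmd (from the SqlServer module) is below. The database name, logical file names, and sizes are placeholders you would replace with your own (check sys.database_files for the logical names). Instant File Initialization and the power plan are configured at the OS/service level, not via T-SQL.

```powershell
# Placeholder instance and database names
$server = "MyServer"
$db     = "MyDb"

# Bulk-logged recovery keeps logging minimal during the load (switch back afterwards)
Invoke-Sqlcmd -ServerInstance $server -Query "ALTER DATABASE [$db] SET RECOVERY BULK_LOGGED;"

# Pre-grow data and log files and use fixed-MB autogrowth instead of percent,
# so the load hits as few autogrowth events as possible.
# Logical file names and sizes below are examples only.
Invoke-Sqlcmd -ServerInstance $server -Query @"
ALTER DATABASE [$db] MODIFY FILE (NAME = N'MyDb',     SIZE = 60GB, FILEGROWTH = 512MB);
ALTER DATABASE [$db] MODIFY FILE (NAME = N'MyDb_log', SIZE = 10GB, FILEGROWTH = 512MB);
"@
```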