SQL Server – the most efficient way of inserting 199 million rows of data

sql-server-2005

I have a 2.2 GB text file in the format of:

1234567890
1234567890
etc.......

I need to insert each line into a table (as field1) such as:

CREATE TABLE [dbo].[table](
    [field1] [varchar](10) NOT NULL,
    [field2] [varchar](3) NOT NULL
)
What is the most efficient way of inserting this data?
Is there a premade program that can be used? (i.e., can it be done externally from the command line?)

I've written a script that takes 1,000 records at a time, builds a single SQL query around them, and inserts them. This takes a while; is there a better way? (Better as in easier, faster, less overhead.)

Note: I've tried batch sizes of 1M, 100,000, and 10,000, but the generated query becomes too long at those sizes, so I'm stuck with 1,000 for now.

(query used)

INSERT INTO table (field1, field2)
SELECT '1234567890','123'
UNION ALL
SELECT '1234567890','123'
UNION ALL
SELECT '1234567890','123'
-- etc.

Best Answer

BCP is likely to be the best tool for the task. SQLFool has a good introductory article on its usage.
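As a rough sketch, a bcp import for a file like this might look as follows. The server, database, and batch size are placeholders, and `-T` assumes Windows authentication. Because [field2] is NOT NULL and absent from the file, a plain character-mode (`-c`) import of the whole table would fail; a format file that maps only [field1] (or a default constraint on [field2]) would be needed, so the format-file name here is hypothetical:

```
# Hypothetical invocation: MyDatabase, myserver, the batch size,
# and field1only.fmt (a format file mapping just [field1]) are all placeholders.
bcp MyDatabase.dbo.[table] in data.txt -S myserver -T -b 10000 -f field1only.fmt
```

The `-b` batch-size switch commits rows in chunks, which keeps the transaction log manageable on a 199-million-row load.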

@Marian posted a comprehensive list of BCP optimisation references in response to a question I asked, which you may find useful.
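If you prefer to stay inside T-SQL, BULK INSERT is a closely related option (not mentioned in the answer above; the file path, batch size, staging-table name, and the '123' value are assumptions for illustration). Loading into a single-column staging table sidesteps the NOT NULL [field2] column:

```sql
-- Staging table matching the file layout: one value per line.
CREATE TABLE dbo.staging ([field1] varchar(10) NOT NULL);

-- Path and batch size are placeholders; TABLOCK helps qualify for minimal logging.
BULK INSERT dbo.staging
FROM 'C:\data\input.txt'
WITH (ROWTERMINATOR = '\n', BATCHSIZE = 50000, TABLOCK);

-- Then populate the real table, supplying whatever field2 should contain.
INSERT INTO dbo.[table] (field1, field2)
SELECT field1, '123' FROM dbo.staging;
```

This runs server-side, so the file path must be readable by the SQL Server service account, whereas bcp reads the file from wherever the client runs.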