I have huge tables in an Oracle database, approximately 1 crore+ (10 million+) rows each, and I want to migrate/copy those tables and their data into SQL Server.
Currently I am using the Import functionality of SQL Server for this, but it takes about a day, which is far too long.
Is there a better way? Any recommended approach or steps (SSIS, or any other tool) for this process?
Best Answer
Separate the two tasks.
One of the good ETL rules to follow is to get in and out of the source system as fast as possible. Dumping the Oracle data to flat files should be orders of magnitude faster than inserting directly, as should the subsequent bulk import of those files into SQL Server.
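The "dump to flat files first" step can be sketched in Python. This is a minimal sketch, not the answer's actual job: the table, column values, and file name are hypothetical, and a plain list of tuples stands in for what would really be batches fetched from an Oracle cursor (e.g. via python-oracledb):

```python
def rows_to_flat_file(rows, path, delimiter="~¬", newline="\n"):
    """Write rows (iterables of values) as delimited lines.

    Using a non-standard delimiter string instead of CSV avoids the
    quoting/escaping rules that bcp and BULK INSERT handle poorly.
    NULLs are written as empty fields.
    """
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            f.write(delimiter.join("" if v is None else str(v) for v in row))
            f.write(newline)

# Stand-in for an Oracle cursor; in practice you would stream
# cursor.fetchmany() batches here rather than a small in-memory list.
sample = [(1, "Smith, John", "2024-01-05"), (2, None, "2024-01-06")]
rows_to_flat_file(sample, "extract.dat")
```

Note that the embedded comma in "Smith, John" needs no quoting at all with this layout, which is exactly what makes the subsequent `bcp`/`BULK INSERT` load uneventful.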
`bcp` or `BULK INSERT` should yield the fastest raw loading speed. There is a useful list of bulk-loading references in Optimising BCP performance for BLOB data.

I've found `BULK INSERT` and `bcp` to be less troublesome (i.e. I create and encounter fewer errors) if you forget CSV, avoid format files, and utilise a non-standard delimiter instead. I've got "~¬" in use on a `bcp`-based ETL job at the moment and so far so good; the data I'm fetching has yet to contain that string.
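The "has yet to contain that string" assumption is cheap to verify before each load. A hypothetical helper (not from the answer) that scans the extracted rows for the delimiter:

```python
def delimiter_is_safe(rows, delimiter="~¬"):
    """Return True if no field value contains the delimiter string,
    so the rows can be dumped with that delimiter unambiguously."""
    return not any(
        delimiter in str(value)
        for row in rows
        for value in row
        if value is not None
    )
```

Running this over each extract batch (or a sampled subset, if a full scan is too slow) turns a silent column-shift on load into an explicit failure at extract time.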