Just did a test migration of a 120 GB database onto RDS using the SQL Azure Migration Wizard.
Because we can't afford any downtime for "just" a test, we ran the tool against the live database (meaning data was still being inserted/updated/deleted while the tool was running).
The scripts were created fine, but during the data insertion we got the errors "BCP copy in failed" and "Cannot insert duplicate key".
My intuition is that because the data was live when the .dat file was created, the tool somehow read the same row twice. But I don't want to rely on an assumption I might regret during the real migration, when the database will obviously be offline.
Could there be any other reason why this happened, and can I assume I'm safe using the tool on the "big day" with a read-only database?
— Update —
We tried running the exact same .dat file for the table that failed, and this time it worked, which seems to indicate the duplicate key is not in the file itself… I'm even more confused; could it be a bug in the tool?
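One way to rule out genuine duplicates in the source table is a simple GROUP BY check against the key columns of the table that failed. This is only a sketch; the table and column names below are placeholders to be replaced with the real ones:

```sql
-- Placeholder names: substitute the failing table and its key column(s).
-- Any rows returned indicate real duplicate key values in the source.
SELECT KeyColumn, COUNT(*) AS occurrences
FROM dbo.MyTable
GROUP BY KeyColumn
HAVING COUNT(*) > 1;
```

If this returns nothing on the live source, that supports the theory that the tool read the same row twice while the data was changing underneath it.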
Best Answer
I ran into the same problem when trying to use the SQL Migration Wizard to create a local copy of my database. I know my source data didn't have duplicate key values, so I came up with the following script. It simply causes duplicate key entries to be ignored (a warning is generated instead of an error). Please read the comments in the code block below for instructions.
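The original script was not preserved with this answer, but the behavior it describes (duplicates silently discarded with a warning rather than a failed batch) matches SQL Server's IGNORE_DUP_KEY index option. A minimal sketch of that approach, assuming a placeholder table and index name:

```sql
-- Hedged sketch, not the answer's original script.
-- PK_MyTable / dbo.MyTable are hypothetical placeholders: apply this to
-- each unique index or primary key on the tables being loaded.

-- Enable IGNORE_DUP_KEY: duplicate rows in the load are discarded and
-- a warning is raised, instead of the whole insert batch failing.
ALTER INDEX PK_MyTable ON dbo.MyTable
    REBUILD WITH (IGNORE_DUP_KEY = ON);

-- ...run the migration / BCP load here...

-- Restore normal behavior once the load completes.
ALTER INDEX PK_MyTable ON dbo.MyTable
    REBUILD WITH (IGNORE_DUP_KEY = OFF);
```

Note that IGNORE_DUP_KEY only applies to unique indexes and primary/unique key constraints, and rebuilding large indexes on a 120 GB database can take a while, so it's worth testing on the problem tables first.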
I know this solution is being posted some time after the original question was asked, but hopefully this will help someone else out!