I'm running an ALTER TABLE ... ALTER COLUMN on a table with nearly 30 million rows, and SQL Azure fails after approximately 18 minutes with the error "The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction."
I'm guessing that it's not possible to break this statement down into modifying fewer rows at a time, so I'm wondering what my options are for making this change to the database. SQL Azure will not let me change the size of the transaction log (it is limited to 1 GB).
I'm guessing that my best bet is going to be creating a new table with the new layout, migrating the data across into it, deleting the original table, and then renaming the new table to match the name of the old one. If that's the case, what is the best way to structure those commands?
Scheduled downtime for our system is not an issue currently so this operation can take as long as it needs to.
Best Answer
You will want to load your data into a new table in small batches, then drop the existing table. I put together a quick example using the Sales.Customer table in AdventureWorks; something similar should work for you.
First, create your new table, complete with the new datatype you want to use:
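The original code isn't shown here, so this is a sketch assuming a simplified version of Sales.Customer with just three columns; substitute your actual schema and whatever datatype change you need (the widening of AccountNumber below is only an example):

```sql
-- Hypothetical simplified schema; AccountNumber widened from
-- varchar(10) to varchar(20) as the example datatype change.
CREATE TABLE Sales.Customer_new (
    CustomerID    int         NOT NULL PRIMARY KEY,
    AccountNumber varchar(20) NOT NULL,
    ModifiedDate  datetime    NOT NULL
);
```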
Then, insert your records in batches. I am using a batch size of 10 here, but you will likely want something larger, say 10,000 rows at a time. For 30 million rows I'd even suggest a batch size of 100,000, which is the limit I have typically used with larger tables:
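A sketch of the batched copy, again assuming the simplified three-column schema above; each loop iteration is its own small transaction, so the log never fills. The NOT EXISTS check makes the loop restartable if the session is killed partway through:

```sql
-- Copy rows across in small batches so no single transaction
-- grows large enough to exhaust the 1 GB log limit.
DECLARE @BatchSize int = 10;  -- raise to 10,000-100,000 in practice

WHILE 1 = 1
BEGIN
    INSERT INTO Sales.Customer_new (CustomerID, AccountNumber, ModifiedDate)
    SELECT TOP (@BatchSize) src.CustomerID, src.AccountNumber, src.ModifiedDate
    FROM Sales.Customer AS src
    WHERE NOT EXISTS (
        SELECT 1 FROM Sales.Customer_new AS dst
        WHERE dst.CustomerID = src.CustomerID
    )
    ORDER BY src.CustomerID;

    IF @@ROWCOUNT < @BatchSize BREAK;  -- last batch copied, we're done
END;
```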
I usually do a sanity check and verify the rowcounts are the same before cleaning up:
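Something as simple as comparing the two counts (table names as above) will do:

```sql
-- Both counts should match before you drop anything.
SELECT COUNT(*) AS OldRows FROM Sales.Customer;
SELECT COUNT(*) AS NewRows FROM Sales.Customer_new;
```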
Once you are confident you have migrated your data, you can drop the original table:
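
```sql
DROP TABLE Sales.Customer;
```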
Last step, rename the new table, so that users don't have to change any code:
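sp_rename renames the object within its existing schema, so the new table ends up as Sales.Customer:

```sql
EXEC sp_rename 'Sales.Customer_new', 'Customer';
```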
I don't know how long this will take. I'd suggest you try doing this when you have a clear maintenance window and users aren't connected.
HTH