I have a very large production database and a very large test environment database in SQL Server 2008R2. Both databases have similar table structure but different users/logins/permissions/roles.
I need to refresh only a few tables in the test database periodically from production, about once a month.
The current way I am planning on doing this (sketched below) is:
- Use the BCP utility to export the tables I need from Production.
- Copy the BCP export files onto the test server.
- Disable indexes and constraints on all tables I am refreshing in Test.
- Truncate the Test database tables.
- Load the data back into the Test database tables using BCP.
- Rebuild indexes and re-enable constraints in Test.
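In concrete terms, the plan would look roughly like this for one table (PRODSRV, TESTSRV, MyDB, and dbo.Orders are placeholder names; Windows authentication assumed):

```
:: 1. Export from Production in native format (-n) over a trusted connection (-T).
bcp MyDB.dbo.Orders out C:\refresh\Orders.bcp -n -S PRODSRV -T

:: 2. Copy C:\refresh\Orders.bcp to the test server (file share, robocopy, etc.).

:: 3. On Test: disable FK checking and empty the table.
::    (TRUNCATE fails if any foreign key references the table, even a disabled one.
::     To disable nonclustered indexes, use ALTER INDEX ... DISABLE per index;
::     disabling ALL would take the clustered index offline too.)
sqlcmd -S TESTSRV -d MyDB -Q "ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL; TRUNCATE TABLE dbo.Orders;"

:: 4. Load with a table lock so the insert can qualify for minimal logging.
bcp MyDB.dbo.Orders in C:\refresh\Orders.bcp -n -S TESTSRV -T -h "TABLOCK"

:: 5. Rebuild indexes and re-enable (and re-validate) the constraints.
sqlcmd -S TESTSRV -d MyDB -Q "ALTER INDEX ALL ON dbo.Orders REBUILD; ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;"
```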
This all seems a little too complicated for such a small task. It also seems like it would generate a lot of redo in the transaction log. Is there a better way to do this?
Another way I thought of doing this is to restore a backup from Production onto the test environment. The problem is that a full backup would be quite large and I only need a few tables refreshed, and the users and security in the production database are different from test: if I restore the whole database, the test security settings would be overwritten by the production ones.
Best Answer
There are two methods that will suit your needs:
Method 1: BCP OUT and BULK INSERT into the destination database.

(Note: if the tables are referenced by foreign keys, you won't be able to use TRUNCATE; you have to delete in chunks instead, as sketched below. Alternatively, you can drop all indexes and foreign keys, load the data, and then recreate them.)
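For the chunked delete, something along these lines works (table name and batch size are placeholders, and rows that are actually referenced by child rows still have to be dealt with first):

```
SET NOCOUNT ON;

-- Delete in small batches so the transaction log can be reused between
-- batches (take log backups between batches in FULL recovery).
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.Orders;

    IF @@ROWCOUNT = 0
        BREAK;   -- table is empty
END
```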
Enable trace flag 610, which allows minimally logged inserts into indexed tables; for example:
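This is a minimal sketch, reusing the hypothetical TestDB / dbo.Orders names and a native-format bcp file:

```
-- TF 610 (SQL Server 2008+) extends minimal logging to inserts into tables
-- that already have indexes; it still requires SIMPLE or BULK_LOGGED recovery.
DBCC TRACEON (610, -1);    -- -1 = enable instance-wide

ALTER DATABASE TestDB SET RECOVERY BULK_LOGGED;

BULK INSERT dbo.Orders
FROM 'C:\refresh\Orders.bcp'
WITH (DATAFILETYPE = 'native', TABLOCK);

ALTER DATABASE TestDB SET RECOVERY FULL;   -- put the original setting back
DBCC TRACEOFF (610, -1);
```

Keep in mind that while the database is in BULK_LOGGED recovery you lose point-in-time restore for that log interval, which is usually acceptable on a test box.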
--
Method 2: SSIS - my preferred method in this case.
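One way to lay out such a package: an Execute SQL Task that empties the targets (truncate, or the chunked delete above), followed by a Data Flow Task with an OLE DB Source against production and an OLE DB Destination against test set to fast load with the Table lock option checked. Once saved, it can be run on demand or scheduled monthly via SQL Agent (package path hypothetical):

```
dtexec /f "C:\packages\RefreshTestTables.dtsx"
```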
Reference: The Data Loading Performance Guide and my answer to Insert into table select * from table vs bulk insert.