To get this to work, I scheduled switching off 'immediate sync' for a very low-traffic period:
EXEC sp_changepublication
@publication = 'statics',
@property = 'Immediate_sync',
@value = 'false'
GO
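To confirm the property change took effect, you can inspect the publication settings (a sketch; run this in the publication database, where 'statics' is the publication above). Note that anonymous subscriptions require immediate_sync, so @allow_anonymous may need to be set to 'false' first if it is enabled.

```sql
-- Check the current immediate_sync setting for the publication
EXEC sp_helppublication @publication = 'statics'
-- The immediate_sync column in the result set should now be 0
```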
I then ran a snapshot, which took only a fraction of the time: about 6 minutes.
I then had to drop the current subscription for that article only:
EXEC sp_dropsubscription
@publication = 'statics',
@article = 'dbo.SupplierCorporatePayAccountLink_tbl',
@subscriber = 'SURVIVOR'
and rerun the snapshot. Then I could add the subscription back into the publication:
EXEC sp_addsubscription
@publication = 'statics',
@article = 'dbo.SupplierCorporatePayAccountLink_tbl',
@subscriber = 'SURVIVOR',
@destination_db = 'MLT-Replicated',
@reserved='Internal'
but it would not allow this without the fairly obscure setting
@reserved = 'Internal'
Without it, I got an error message saying I needed to subscribe to all articles. Replication then automatically synced the new table. I'm a happy man, as this has been causing me some long nights!
What version of SQL Server are you using?
I just setup a publication and subscription in the method you describe on SQL Server 2012 (11.0.2218) and it appears to be working fine.
I tested inserting, updating, and deleting records from the source - and all transactions made it successfully to the destination.
It is true that the timestamp column is of interest, but I didn't have to use the option you mentioned in your update. Note from this SQL 2000 TechNet article:
"The literal values for a timestamp column are replicated, but the data type for the replicated values is changed to binary (8) on the Subscriber."
Since that was written for SQL 2000, things may have changed: my subscriber table, just like the publication, has a timestamp column, not binary(8).
Therefore the values that are there when you backup and restore will match. Future inserts will have different timestamp values in the source and destination.
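You can verify the column's data type on the Subscriber yourself with a catalog query (a sketch; dbo.MyTable is a placeholder for the replicated table):

```sql
-- List the column names and data types of the replicated table
-- on the Subscriber; the column should show as timestamp, not binary
SELECT c.name AS column_name, t.name AS type_name
FROM sys.columns AS c
JOIN sys.types AS t ON c.user_type_id = t.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.MyTable');
```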
I scripted the entire thing out into one file. The order of events:
- create the publication (with @allow_initialize_from_backup = N'true');
- backup the source database (with format, init);
- restore the database to the destination;
- create the subscription (with @sync_type = N'initialize with backup', @backupdevicetype = 'disk', @backupdevicename = '\myUNCPath\Backup\Backup.bak').
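Stitched together, the order of events above might look roughly like this. It's a sketch with placeholder names (MyPub, SourceDb, DestDb, DestServer); distribution agent credentials and agent job setup are omitted, and the backup path is the one from the steps above:

```sql
-- 1. Create the publication, allowing initialization from a backup
EXEC sp_addpublication
    @publication = N'MyPub',
    @allow_initialize_from_backup = N'true';

-- 2. Back up the source database
BACKUP DATABASE SourceDb
    TO DISK = N'\myUNCPath\Backup\Backup.bak'
    WITH FORMAT, INIT;

-- 3. Restore it on the destination server (run there)
RESTORE DATABASE DestDb
    FROM DISK = N'\myUNCPath\Backup\Backup.bak';

-- 4. Create the subscription, pointing at the backup used above
EXEC sp_addsubscription
    @publication = N'MyPub',
    @subscriber = N'DestServer',
    @destination_db = N'DestDb',
    @sync_type = N'initialize with backup',
    @backupdevicetype = 'disk',
    @backupdevicename = N'\myUNCPath\Backup\Backup.bak';
```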
I ran into Msg 21397 during my first try; an interesting read with an explanation and a solution is in this blog post.
Best Answer
There really isn't a supported way to do what you are asking. You can use a backup/restore, but that would be to a new database. You could try a filtered article where you only grab data after a certain create date.
Or you could use a manual process such as an SSIS package or T-SQL Scripts to stage the schema and data. But the snapshot is more or less a required item.
I would probably do a single-article snapshot. Alternatively, if you have a create_date column (or an identity column, etc.), you could filter on create_date > GETDATE() and then add the remaining rows in manually afterwards.
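If you do have such a create_date column, the filtered-article idea could be sketched with a standard row filter on the article (hypothetical publication and table names; a fixed cutoff literal is usually safer than GETDATE() in the filter, since the clause is evaluated whenever the filter is applied):

```sql
-- Add the article with a horizontal (row) filter so the snapshot
-- only picks up rows created after the cutoff date
EXEC sp_addarticle
    @publication = N'MyPub',
    @article = N'MyTable',
    @source_owner = N'dbo',
    @source_object = N'MyTable',
    @filter_clause = N'create_date > ''20240101''';
```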
A few links:
About manually preparing.
About using backups or other methods.
About filtering.