SQL Server – replicating data to an archive

Tags: archive, replication, sql-server, transactional-replication

I have the following scenario:

Multiple SQL Servers where production machines log about 80GB of data (mostly images) daily into a database (each machine has a server with one database). Since these servers have about 4TB of space, the number of days they can store is limited, so the data on these servers is overwritten frequently. But we want to make reports that go back up to 2 years.

I could write stored procedures that fetch all new data entries and copy them to a backup SQL Server with more space every 2 hours or so. But I am talking about a database with approximately 25 tables, and I think there might be a better way.
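
A minimal sketch of that approach for one table, assuming a hypothetical dbo.Images table with an ever-increasing identity column ImageId (the other columns are placeholders too), a linked server named ArchiveServer pointing at the backup machine, and a small watermark table that remembers the last row already copied:

    -- One-time setup: remember the highest ImageId already archived.
    CREATE TABLE dbo.ArchiveWatermark (
        TableName sysname PRIMARY KEY,
        LastId    bigint NOT NULL
    );
    INSERT INTO dbo.ArchiveWatermark (TableName, LastId) VALUES (N'Images', 0);
    GO

    CREATE PROCEDURE dbo.CopyNewImagesToArchive
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @LastId bigint, @MaxId bigint;
        SELECT @LastId = LastId FROM dbo.ArchiveWatermark WHERE TableName = N'Images';
        SELECT @MaxId  = MAX(ImageId) FROM dbo.Images;

        IF @MaxId > @LastId
        BEGIN
            -- Push only the rows created since the last run to the archive server.
            INSERT INTO ArchiveServer.ArchiveDb.dbo.Images (ImageId, MachineId, CapturedAt, ImageData)
            SELECT ImageId, MachineId, CapturedAt, ImageData
            FROM dbo.Images
            WHERE ImageId > @LastId AND ImageId <= @MaxId;

            -- Remember how far we got for the next run.
            UPDATE dbo.ArchiveWatermark SET LastId = @MaxId WHERE TableName = N'Images';
        END;
    END;

A SQL Server Agent job could then run one such procedure per table every two hours; the catch is writing and maintaining this for all ~25 tables, which is exactly the work I was hoping to avoid.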

Do you think transactional replication or log shipping might be better options in this scenario?

If the SQL Server on the production site rolls over the data (I have not yet had the opportunity to check how this actually works on these systems), I don't want the backup server to replicate that. The backup server should keep all records and only add new ones.

Any advice on this would be appreciated.

Best Answer

If I understand correctly, log shipping and transactional replication are not options here, because they essentially reproduce on the secondary server whatever you do on the primary: you get an up-to-date copy of the database, and if information is overwritten on the primary, it is overwritten on the secondary as well. I would instead create a dedicated DWH (data warehouse) database and copy the fresh data into it at regular intervals, for example with SSIS or BULK INSERT.
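
For the BULK INSERT route, a rough sketch (server names, file path, and the Images table are placeholders): export the new rows on the production side with bcp in native format, filtered by a watermark, then load the file into the warehouse table on the archive server:

    -- On the archive server: load a native-format file that bcp exported from
    -- production, e.g. bcp "SELECT ... FROM dbo.Images WHERE ImageId > <watermark>" queryout.
    BULK INSERT ArchiveDwh.dbo.Images
    FROM '\\PRODSRV1\exports\images_20240101.dat'
    WITH (DATAFILETYPE = 'native', TABLOCK);

SSIS packages do the same incremental copy with less hand-written code per table, which matters when you have ~25 of them.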

It will take some work, such as understanding when data is being overwritten: every 5 minutes? At random times, whenever someone updates it in the app? Then decide accordingly what to do (maybe even use an INSERT/UPDATE trigger to copy the rows to a DWH table on a different server, as sketched below). I hope this helps. Keep in mind that if inserts and updates happen often, such triggers may cause performance issues if they fire constantly.
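
A sketch of that trigger idea, again with hypothetical names. Writing to the linked server directly inside the trigger would make every production insert wait on a distributed transaction, so this variant only queues the rows in a local staging table, which a scheduled job then ships to the DWH server and empties:

    -- Local outbox the trigger fills; a periodic job moves its rows to the DWH.
    CREATE TABLE dbo.Images_Outbox (
        ImageId    bigint         NOT NULL PRIMARY KEY,
        MachineId  int            NOT NULL,
        CapturedAt datetime2      NOT NULL,
        ImageData  varbinary(max) NOT NULL
    );
    GO

    CREATE TRIGGER trg_Images_Archive
    ON dbo.Images
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- 'inserted' holds the new/updated rows of the current statement.
        -- The NOT EXISTS guard queues each ImageId at most once between job runs.
        INSERT INTO dbo.Images_Outbox (ImageId, MachineId, CapturedAt, ImageData)
        SELECT i.ImageId, i.MachineId, i.CapturedAt, i.ImageData
        FROM inserted AS i
        WHERE NOT EXISTS (SELECT 1 FROM dbo.Images_Outbox AS o WHERE o.ImageId = i.ImageId);
    END;

The shipping job then does an INSERT ... SELECT from the outbox into the DWH table over the linked server and deletes what it copied. Keep the trigger this thin, since it runs inside every production write.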