SQL Server – How to make this backup scheme more friendly for reporting


Currently, we have around 50 remote servers, each with a production database. Each night, every one of these servers runs a backup job and zips the backup. Each morning, a local server runs a job for each site that copies the zip file over and restores the database to a 'temp' server. The job then copies individual tables (not all tables are copied) to the corresponding database on a 'reporting' server. The databases on the reporting server may contain custom tables that track information, so we can't simply drop them every night.
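For context, one site's copy step probably boils down to something like the following sketch. All database, table, and path names here are illustrative, not the poster's actual ones:

```sql
-- Restore the site's nightly backup onto the 'temp' server
-- (names and file paths are hypothetical examples)
RESTORE DATABASE Site01_Temp
FROM DISK = N'D:\Backups\Site01\Site01_Full.bak'
WITH MOVE N'Site01_Data' TO N'D:\Data\Site01_Temp.mdf',
     MOVE N'Site01_Log'  TO N'L:\Log\Site01_Temp.ldf',
     REPLACE;

-- Refresh one reporting table from the restored copy, leaving the
-- reporting database's custom tables untouched
TRUNCATE TABLE Reporting_Site01.dbo.Orders;
INSERT INTO Reporting_Site01.dbo.Orders (OrderID, CustomerID, OrderDate)
SELECT OrderID, CustomerID, OrderDate
FROM Site01_Temp.dbo.Orders;
```

Multiply this by 50 sites and a growing list of tables, and the maintenance burden the question describes becomes clear.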

I feel this process is somewhat redundant. I don't like having the same data in two places, and I don't like having to create scripts to copy individual tables for each new remote server (we add roughly one per month). However, I can't find a better way to do this and still keep the custom tables we use.

One of my ideas was to turn the 'temp' server into a repository for the production database backups and use the 'reporting' server solely for the custom tables, but that would force reports to join across linked servers.
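To illustrate the objection: with production data only on the 'temp' server, any report that combines it with a custom table needs four-part linked-server names. A made-up example (the linked server name TEMPSRV and all table names are hypothetical):

```sql
-- Join a local custom table to a table on the 'temp' server
-- through a linked server named TEMPSRV (hypothetical names)
SELECT c.Region, SUM(o.Total) AS RegionTotal
FROM dbo.CustomRegionMap AS c             -- custom table on the reporting server
JOIN TEMPSRV.Site01_Temp.dbo.Orders AS o  -- remote table via the linked server
  ON o.StoreID = c.StoreID
GROUP BY c.Region;
```

The usual complaint about this pattern is performance: the optimizer may pull large portions of the remote table across the network before filtering, so queries that were fast locally can crawl over a linked server.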

Does anyone have an idea on how I can make this process better?

Best Answer

Log shipping would also be a viable approach I think.

You could set it up so that logs are shipped and restored overnight, but restores are suspended (or run much less frequently) during business hours, since restoring new logs requires exclusive access to the database. For example, you might run the restores until 8:30 AM and then restore the next set at noon. That way, morning reports reflect the data as of 8:30, and by the afternoon the reporting copy has caught up a little more.
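The restore side of that schedule amounts to restoring each log with STANDBY, which keeps the secondary readable between restores. A hand-rolled version of one restore step might look like this (file names and paths are examples only):

```sql
-- Restore the next log backup while keeping the database readable.
-- WITH STANDBY rolls uncommitted transactions back into an undo file
-- so users can query between restores; the next RESTORE LOG resumes
-- from that undo file. Names and paths are hypothetical.
RESTORE LOG Site01_Reporting
FROM DISK = N'D:\LogShip\Site01\Site01_20240101_0830.trn'
WITH STANDBY = N'D:\LogShip\Site01\Site01_undo.dat';
```

Each restore still needs to kick connected users out of the database momentarily, which is exactly why the schedule above pauses restores during business hours rather than running them continuously.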

The benefits over replication and/or your current approach?

  1. Depending on the lag, this can also save you from a "shoot yourself in the foot" scenario in production. If you delay the log shipping, you may still be able to get at the "before the mistake" data on the reporting server without having to go through the full process of restoring the production database and rolling forward to the most recent pre-mistake point.

  2. Far less pounding of the server than shipping and restoring whole backups all at once nightly.

Log shipping isn't a panacea by any means, but it isn't difficult to set up, works in every edition (even though not all editions include the hokey wizard), and solves a couple of different problems.