Typically, backups to file have a timestamp appended to the name (e.g. MyDatabase_FULL_201202060900.bak), so you have the option of using a script to generate the restore sequence given a directory of files. Google will give you dozens of examples, as will the SSC script library. I have such a script in my library currently.
If the server you've taken the backups from is available, you can generate the restore script from the msdb tables dbo.backupset and dbo.backupmediafamily; example here. Note that SSMS will generate the correct script for you, and you can use Profiler to capture the queries it uses to do so.
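A minimal sketch of such a query, using columns from msdb's documented schema (the database name filter is a placeholder for your own database):

```sql
-- List backups for one database, most recent first,
-- joining backupset to the physical file names.
SELECT  bs.database_name,
        bs.type,                  -- D = full, I = differential, L = log
        bs.backup_finish_date,
        bmf.physical_device_name
FROM    msdb.dbo.backupset AS bs
JOIN    msdb.dbo.backupmediafamily AS bmf
        ON bmf.media_set_id = bs.media_set_id
WHERE   bs.database_name = N'MyDatabase'
ORDER BY bs.backup_finish_date DESC;
```

From that result set you can build the RESTORE statements dynamically, walking back from the most recent full backup.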
As with all things in SQL Server, it depends.
First thing you need to do is make sure you understand what each type of backup does.
Books Online has all the gory details, but here's my summary.
A FULL backup contains everything within the database. A DIFFERENTIAL backup is cumulative, NOT incremental. In your example, if your database failed on the 12th, you'd only need to restore the full backup from the 1st, then the most recent differential from the 12th, followed by all the transaction log backups up to the failure. A TRANSACTION LOG backup is only needed for databases using the full or bulk-logged recovery model. If you're using the simple recovery model, then transaction log backups are not needed.
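That restore sequence can be sketched in T-SQL (the file names and paths are hypothetical; everything before the final step uses NORECOVERY so further backups can still be applied):

```sql
-- Full backup from the 1st: leave the database in the restoring state.
RESTORE DATABASE MyDatabase
    FROM DISK = N'D:\Backups\MyDatabase_FULL_201202010900.bak'
    WITH NORECOVERY;

-- Most recent differential, from the 12th.
RESTORE DATABASE MyDatabase
    FROM DISK = N'D:\Backups\MyDatabase_DIFF_201202120900.bak'
    WITH NORECOVERY;

-- Every log backup taken after that differential, in order.
RESTORE LOG MyDatabase
    FROM DISK = N'D:\Backups\MyDatabase_LOG_201202121015.trn'
    WITH NORECOVERY;

-- Final step: run recovery and bring the database online.
RESTORE DATABASE MyDatabase WITH RECOVERY;
```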
Now that we've cleared that up... Designing a backup schedule really depends on how much data you need to recover and how fast you need to recover it in the event of a disaster. I would recommend starting with a full backup each day. You can always reduce the frequency later.

Remember the differential backup is cumulative since the last full, so depending on the amount of change going on in your database, the differential could be larger than the full backup after a few days. If you do a full backup each day, then you may not need to use differentials at all; however, you could still do one once a day and schedule it at 12 noon.

The transaction log backup only backs up the log. The frequency of the log backup determines how much data you're willing to lose in the event of a failure. If you run your log backup every 15 minutes, then you would expect to lose up to the last 15 minutes of data that changed. 15 minutes is a good frequency, but every 30 minutes works perfectly for my environment.
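As a sketch, that schedule (daily full, noon differential, log backup every 15 to 30 minutes) comes down to three scheduled jobs running statements like these; the paths and timestamped names are hypothetical:

```sql
-- Daily, e.g. at midnight: full backup.
BACKUP DATABASE MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_FULL_201202060000.bak';

-- Daily at 12 noon: differential, cumulative since the last full.
BACKUP DATABASE MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_DIFF_201202061200.bak'
    WITH DIFFERENTIAL;

-- Every 15-30 minutes: transaction log backup
-- (full or bulk-logged recovery model only).
BACKUP LOG MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_LOG_201202061230.trn';
```

In practice you generate a fresh timestamped file name for each run rather than overwriting one file.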
As I said earlier, it all depends on your environment. After you've designed and set up your backup schedule, remember to test it on an alternate server. Practice restoring your full, diff, and log backups so that you know everything works like you designed it.
Books Online has some good info if you plan to use Maintenance Plans, but if you really want flexibility then check out Ola Hallengren's backup scripts.
Best Answer
I'd answer that from the backupmediafamily and backupset tables. Build the restore T-SQL dynamically from that, using STOPAT for the time you want to restore.

Or: use PowerShell + dbatools, and pipe get-backuphistory to restore-database using the -restoretime parameter.
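The T-SQL route can be sketched like this (database name, paths, and the point-in-time value are hypothetical; STOPAT on the log restore stops recovery at that time):

```sql
-- Restore the last full backup taken before the target time.
RESTORE DATABASE MyDatabase
    FROM DISK = N'D:\Backups\MyDatabase_FULL.bak'
    WITH NORECOVERY;

-- Apply log backups up to the target point in time, then recover.
RESTORE LOG MyDatabase
    FROM DISK = N'D:\Backups\MyDatabase_LOG.trn'
    WITH STOPAT = N'2012-02-12T10:30:00', RECOVERY;
```

The dynamic-SQL version builds exactly these statements from the backupset and backupmediafamily rows for the backups that bracket the target time.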