I have a job scheduled to run a few times every day. The job backs up my databases, like:
BACKUP DATABASE [databasename]
TO DISK = N'd:\sqlserver\backup\databasename.bak'
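Each run appends a new backup set to the same file, since NOINIT is the default; spelled out, the statement above behaves like:
BACKUP DATABASE [databasename]
TO DISK = N'd:\sqlserver\backup\databasename.bak'
WITH NOINIT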
I can list the backups inside the file by running:
RESTORE HEADERONLY FROM DISK = N'd:\sqlserver\backup\databasename.bak'
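The same history is also recorded in msdb; a sketch of an equivalent query (the position column matches the FILE number used on restore):
-- position here corresponds to RESTORE ... WITH FILE = n
SELECT bs.position, bs.backup_start_date
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
WHERE bmf.physical_device_name = N'd:\sqlserver\backup\databasename.bak'
ORDER BY bs.backup_start_date;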
And then I can restore the database from a specific backup by using something like:
RESTORE DATABASE databasename
FROM DISK = N'd:\sqlserver\backup\databasename.bak'
WITH FILE = 1455,
RECOVERY,
MOVE 'databasename_data' TO 'c:\...',
MOVE 'databasename_log' TO 'c:\...'
But I have thousands of backups, and I'm reaching my storage limit.
Because I don't need so many backups, I'm wondering if there is an easy way to limit the backup file to keep only the last 90 backups.
Any tips?
UPDATE:
I have just one .bak file with n backups inside; I can read a specific backup by using the WITH FILE option in the RESTORE statement. The answers below assume I have multiple .bak files, but what I need is to delete outdated backups inside that single .bak file.
Best Answer
The easiest solution would be to use a maintenance plan cleanup task to delete old backup files. Alternatively, you could script your way out of this, using something like the following script:
Credit: http://www.sqlservercentral.com/Forums/FindPost1034057.aspx
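The credited script isn't reproduced here. As a rough sketch of the pattern such cleanup scripts assume (one backup per date-stamped file, so old backups can be deleted individually; the path and naming below are illustrative, not from the question):
-- One backup per run, in its own timestamped file (WITH INIT overwrites
-- a same-named file instead of appending backup sets to it).
DECLARE @stamp nvarchar(20) =
    CONVERT(nvarchar(8), GETDATE(), 112)                        -- yyyymmdd
    + N'_'
    + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), N':', N''); -- hhmmss
DECLARE @file nvarchar(260) =
    N'd:\sqlserver\backup\databasename_' + @stamp + N'.bak';
BACKUP DATABASE [databasename] TO DISK = @file WITH INIT;
Old files can then be removed by the maintenance plan task, or by xp_delete_file below.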
There's also xp_delete_file:
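It's undocumented, so the parameter convention below (0 = backup files, then folder, extension, cutoff date, and a subfolder flag) reflects common usage rather than a supported contract, and it only deletes files SQL Server recognizes as its own backups:
-- Delete recognized .bak files older than 90 days (undocumented proc;
-- parameters follow the convention used by maintenance plans).
DECLARE @cutoff datetime = DATEADD(DAY, -90, GETDATE());
EXEC master.dbo.xp_delete_file 0, N'd:\sqlserver\backup\', N'bak', @cutoff, 0;
Note that this works per file: it won't remove individual backup sets from inside a single .bak file.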