I believe you can refer to Ola Hallengren's SQL Server backup scripts.
As stated in the documentation on how to use them with SQL Server Express:
SQL Server Express has no SQL Server Agent. Therefore, the execution
of the stored procedures must be scheduled by using cmd files and
Windows Scheduled Tasks. Follow these steps.
- Download MaintenanceSolution.sql.
- Execute MaintenanceSolution.sql. This script creates the stored procedures that you need.
- Create cmd files to execute the stored procedures; for example:
  sqlcmd -E -S .\SQLEXPRESS -d master -Q "EXECUTE dbo.DatabaseBackup @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL'" -b -o C:\Log\DatabaseBackup.txt
- In Windows Scheduled Tasks, create tasks to call the cmd files.
- Schedule the tasks.
- Start the tasks and verify that they are completing successfully.
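The steps above can be sketched as a pair of artifacts: the cmd file that runs the backup, and a schtasks command to register it with Windows Scheduled Tasks. The file paths, task name, and schedule below are assumptions for illustration, not part of the MaintenanceSolution itself:

```
REM C:\Scripts\DatabaseBackup.cmd -- hypothetical path; adjust to your environment
sqlcmd -E -S .\SQLEXPRESS -d master -Q "EXECUTE dbo.DatabaseBackup @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL'" -b -o C:\Log\DatabaseBackup.txt

REM Register a nightly task (run once, from an elevated prompt):
schtasks /create /tn "SqlExpressFullBackup" /tr "C:\Scripts\DatabaseBackup.cmd" /sc daily /st 23:00
```

The -b switch makes sqlcmd exit with a non-zero code on error, so a failed backup shows up as a failed task in Task Scheduler rather than silently succeeding.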
Now you can easily modify this as per your needs.
Note: From the MaintenanceSolution, you can also download just the SQL Server backup scripts individually.
You've addressed a boatload of items in your question. Thanks for being so thorough!
Just a couple of things I notice offhand:
- How various hardware / load factors influence what should be done.
Are you running a 24x7 instance? What is the load around the clock? I notice you have backup compression disabled; is that by design for the test, or is it desirable for some reason to have it turned off when you put this into production? If you have tons of hardware headroom (CPU/RAM), and completing the backup in the shortest amount of time is of paramount importance, then you'd want to tune these parameters for the particular hardware you have with that goal in mind. If you want to ensure OLTP workloads are serviced around the clock, and don't want backup to impact that, you'll likely need to tune these parameters the other way around. You haven't identified your design goals since you are asking for general guidance; however, as you so wisely state, "it depends™".
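To make the tuning knobs concrete, here is a hedged sketch of a manually tuned backup. The database name, path, and parameter values are illustrative assumptions only, not recommendations; you'd arrive at your own values by testing on your hardware:

```sql
-- Illustrative values only; tune BUFFERCOUNT and MAXTRANSFERSIZE by testing.
BACKUP DATABASE [YourDatabase]
TO DISK = N'C:\Backup\YourDatabase.bak'
WITH COMPRESSION,               -- trades CPU for a smaller (often faster) backup
     BUFFERCOUNT = 50,          -- more buffers can shorten elapsed time
     MAXTRANSFERSIZE = 4194304; -- 4 MB per transfer (the maximum allowed)
```

Roughly speaking, the backup's buffer memory footprint is BUFFERCOUNT * MAXTRANSFERSIZE, which is why raising both at once is where the memory risk comes from.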
- Are there circumstances in which none of these values should be overridden?
You'd want to retain the default settings if you were concerned about supportability down the road after you no longer maintain the instance, and are uncertain about the abilities of your replacement. You'd likely want to leave the defaults in place unless you have a specific need to tune them. Let sleeping dogs lie, as they say.
- Are there pitfalls for overriding any of these that are not immediately obvious? Using up too much memory and/or disk I/O? Complicating restore operations?
As the documents you reference clearly state, upping these parameters too much can certainly have negative impacts on uptime. As with all things production-based, you need to test this thoroughly before deploying it, and leave the settings alone unless absolutely necessary.
- If I have a server with multiple instances of SQL Server running (a Default Instance and two Named Instances), and if I run the backups of all 3 Instances concurrently, does that affect how I set these values beyond making sure that the collective (BUFFERCOUNT * MAXTRANSFERSIZE) does not exceed available RAM? Possible I/O contention?
You'll want to ensure you leave plenty of RAM for unforeseen circumstances. I would certainly be concerned about using more than 60% or 70% of available ram for backup operations unless I knew with 100% certainty that nothing else was ever going to be happening during the backup window.
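The collective memory math can be worked out with a quick query. The per-instance settings below are hypothetical, chosen just to show the arithmetic; sys.dm_os_sys_memory reports the server's physical RAM so you can compare against the 60-70% guideline:

```sql
-- Hypothetical per-instance settings, for illustration only:
DECLARE @BufferCount int = 50, @MaxTransferSize bigint = 4194304, @Instances int = 3;

SELECT @BufferCount * @MaxTransferSize / 1048576              AS backup_mb_per_instance,  -- 200 MB
       @Instances * @BufferCount * @MaxTransferSize / 1048576 AS backup_mb_all_instances, -- 600 MB
       total_physical_memory_kb / 1024                        AS server_ram_mb
FROM sys.dm_os_sys_memory;
```

With these assumed values, three concurrent backups claim about 600 MB of buffer memory; whether that is comfortable depends entirely on how much RAM the three instances and everything else on the box already consume.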
I've written a blog post with some code that shows how I do backup performance testing, at SQLServerScience.com.
This may not be the best answer I've ever written, but as The Great One™ once said, "you miss 100% of the shots you don't take."
Best Answer
Set the database to simple recovery. That way, the transaction log is only used for active transactions: once a transaction is committed, the active portion of the log is released and can be reused, which minimizes log growth. Long-running transactions with a lot of DML will still cause the log to grow.
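A minimal sketch of the change; the database name is a placeholder assumption:

```sql
-- Switch to the simple recovery model (placeholder database name):
ALTER DATABASE [YourDatabase] SET RECOVERY SIMPLE;

-- Verify the current recovery model:
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'YourDatabase';
```

Be aware of the trade-off: under simple recovery you cannot take transaction log backups, so point-in-time restore is lost; you can only restore to the most recent full or differential backup.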