Given the following info:
Our support engineers need to grab database backups from clients.
and:
The most "valuable" data is "the configuration" of our system. It is contained in about 50 tables...
and:
I'm concerned about temporarily changing the recovery model for this purpose [of allowing the "Data" filegroup to become ReadOnly so that it can be omitted from the backup]
I would suggest a slightly different approach: use SqlPackage.exe (part of SQL Server Data Tools / SSDT) to either "extract" or "export" the desired tables and data into a single archive file which can then be retrieved and either "published" or "imported". SSDT is free, so there should be no licensing issue related to putting the SqlPackage.exe executable (and any dependent DLLs, etc) on the client systems.
You just need to make sure that SqlPackage.exe (and any dependent DLLs, etc) are on each client system. You can initiate SqlPackage.exe from xp_cmdshell. You can even create a .CMD script that executes SqlPackage.exe with the appropriate command-line parameters (there are several -- or you can place most options into an XML-based "publish profile" file and specify that on the command-line). You would then transfer the archive file (.dacpac or .bacpac) locally, just like you are transferring the backup file now. If you are FTPing to your local server from theirs, you can include the FTP commands in the .CMD script, and then if you place the EXEC xp_cmdshell 'ExportAndTransfer.cmd'; statement into a stored procedure, you would only need to execute that stored procedure :-).
Looking at the various options, and knowing that you want a subset of the tables, I am thinking that you should first attempt an "Extract" (i.e. /Action:Extract). You don't want all tables, so then specify: /p:ExtractAllTableData=false. Then, for each table that you do want, specify: /p:TableData=schema.table (you will have around 50 of these options specified given that you want around 50 tables). Finally, it might be a good idea to verify the extraction by specifying: /p:VerifyExtraction=true.
Once the .dacpac file is retrieved locally, you would "Publish" it (i.e. /Action:Publish). For this action you just need a template database with all of the schema (tables, PKs, FKs, indexes, views, etc) and stored procedures / functions already there. This operation will just put the data into the proper tables.
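For reference, the Publish invocation on your side might look something like the following. This is a sketch only: the file path, server name, and database name are placeholders, not values from your environment.

```
REM Publish the data from the .dacpac into the template database.
REM Paths and names below are placeholders.
"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" ^
 /Action:Publish ^
 /SourceFile:"C:\path\to\ConfigurationTables.dacpac" ^
 /TargetServerName:(local) ^
 /TargetDatabaseName:TemplateDatabaseName
```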
Example CMD script (ExportAndTransfer.cmd)
@ECHO OFF
REM Extract the ~50 configuration tables into a single .dacpac file.
SET "SQLPATH=C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin"
"%SQLPATH%\SqlPackage.exe" ^
/Action:Extract ^
/OverwriteFiles:True ^
/SourceDatabaseName:ClientDatabaseName ^
/SourceServerName:(local) ^
/TargetFile:"%TEMP%\ConfigurationTables.dacpac" ^
/p:IgnoreExtendedProperties=True ^
/p:ExtractAllTableData=False ^
/p:TableData="dbo.Table1" ^
/p:TableData="dbo.Table2" ^
/p:VerifyExtraction=true
REM Transfer the .dacpac using the commands in FtpScript.txt.
ftp -s:path\to\FtpScript.txt my_server_hostname
Example Stored Procedure to call CMD script
CREATE PROCEDURE dbo.RunExtract
AS
SET NOCOUNT ON;
EXEC xp_cmdshell 'C:\path\to\ExportAndTransfer.cmd';
GO
All you need after those two is the ftp script.
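The FTP script is just a plain text file of ftp.exe commands, one per line: the login name on the first line, the password on the second, then the transfer commands. A minimal sketch (the user name, password, and file path are placeholders; also note that ftp.exe reads the script file literally, so environment variables such as %TEMP% are not expanded -- spell out the full path):

```
someUserName
somePassword
binary
put C:\path\to\ConfigurationTables.dacpac
quit
```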
UPDATE
While the "Extract" operation is generally more forgiving than the "Export" operation (since "Export" does not like various types of objects), in this particular case the "Export" operation worked for the O.P. because it does not require grabbing all tables that are related via Foreign Keys. The "Extract" operation requires that, if a specified table has any FKs to other tables, or if any other tables have FKs to it, then all tables linked in any way to the desired table must be included. I found this MSDN forum thread discussing the reason for this requirement:
A dacpac can be published to any database -- empty or non-empty. And when table data is published, if the target table isn't empty, all of the data is dropped and replaced. But if the target table isn't empty and it has incoming FK dependencies, then there could possibly be broken FK references. Requiring that all tables be a closed set resolves this problem, because any incoming references will be erased as well.
There are fewer options to specify for "Export", so the command-line would look more like the following:
%SQLPATH%\SqlPackage.exe ^
/Action:Export ^
/OverwriteFiles:True ^
/SourceDatabaseName:ClientDatabaseName ^
/SourceServerName:(local) ^
/TargetFile:%TEMP%\ConfigurationTables.bacpac ^
/p:TableData="dbo.Table1" ^
/p:TableData="dbo.Table2"
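And on your side, the corresponding Import of the .bacpac might look something like the following sketch. The path and names are placeholders, and keep in mind that /Action:Import needs to go into a new or empty database:

```
REM Import the .bacpac; the target database must be new or empty.
"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" ^
 /Action:Import ^
 /SourceFile:"C:\path\to\ConfigurationTables.bacpac" ^
 /TargetServerName:(local) ^
 /TargetDatabaseName:ImportedConfigDatabase
```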
Best Answer
You can't restore a file or filegroup without restoring the PRIMARY filegroup. There are a number of reasons for this given your scenario:
Even under FULL recovery, the primary filegroup must be restored to allow you to restore individual files/filegroups because all the core metadata about your database is there.
The idea of Partial Backups and Piecemeal restores is to allow you to get back online in a disaster quicker by restoring the essential filegroups needed for the database to be operational, and you can restore additional filegroups later.
I would suggest you look at making some changes to your database:
These changes will allow you to simplify your restore process. Assuming you'd made the above changes and wanted to restore the file you've mentioned, the process would be:
This would be quicker than partial backups under SIMPLE recovery because the undamaged filegroups do not have to be restored from backup; instead, the data files on disk are simply recovered by SQL Server, and transactions are replayed from the log onto those files as appropriate. If the PRIMARY filegroup is small and you're taking frequent log backups, the whole process should be quicker than restoring all the read/write filegroups as with partial backups under SIMPLE.
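As a rough illustration, an online restore of a single damaged file under FULL recovery might look like the following. This is a sketch only: the database, file, and backup names are made up, and the exact sequence depends on your log backup chain (including a tail-log backup taken before the restore).

```
-- Restore only the damaged file; the undamaged filegroups stay as-is
-- and remain online (Enterprise Edition) during the restore.
RESTORE DATABASE MyDatabase
    FILE = 'DamagedDataFile'
    FROM DISK = 'C:\Backups\MyDatabase_Full.bak'
    WITH NORECOVERY;

-- Restore the log chain to bring the restored file current with the
-- rest of the database, finishing WITH RECOVERY on the last log.
RESTORE LOG MyDatabase
    FROM DISK = 'C:\Backups\MyDatabase_Log.trn'
    WITH RECOVERY;
```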