I looked into doing this years ago, evaluating both BCP and SSIS. Neither one really provided great flexibility. BCP needed format files to get the text qualifiers, etc., but this was for a dynamic backup solution in which a subset of tables got archived nightly. SSIS seemed a bit more dynamic (though not fully), but couldn't do a few of the things that BCP could. I needed something that could dynamically accept changes to tables (include new tables, don't require removed tables) AND changes to columns (as I needed to do "SELECT * FROM {SchemaName}.{TableName};") AND dynamic text qualification AND column headers, etc.
If you have a static set of tables, then SSIS, using a ForEach container, would be one way to do this (again, assuming the desire is to stay away from BCP due to requiring multiple calls).
Using PowerShell would probably also be an option that would allow for more flexibility than SSIS.
However, if you need something fully dynamic and/or need to stick with a single connection per DB (all tables in that DB), or even a single connection per server (all tables for all databases), then the only way I have ever found to handle this is the one that I created: a SQLCLR stored procedure called DB_BulkExport, which has parameters for @Query and @FilePath that allow for easily cycling through a list of all databases, then per each one cycling through a list of tables, and building both the query and filename from those pieces. You could write a fairly simple stored proc that uses a cursor to get the Database and Table names and call this proc, hence being a single call to SQLCMD.EXE to call that one proc. Or it can be scheduled via SQL Agent without needing to enable xp_cmdshell.
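The driver proc described above could be sketched roughly as follows. This is only an illustration: the cursor query, file path, and everything about DB_BulkExport beyond its @Query and @FilePath parameters (including the SQL# schema qualification) are assumptions on my part, not taken from the actual SQL# documentation.

```sql
-- Hypothetical driver: loop over all user tables in the current database
-- and call the export proc once per table. Adjust names/paths to suit.
DECLARE @SchemaName sysname,
        @TableName  sysname,
        @Query      nvarchar(4000),
        @FilePath   nvarchar(1000);

DECLARE tbl_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.name, t.name
    FROM   sys.tables  t
    JOIN   sys.schemas s ON s.schema_id = t.schema_id;

OPEN tbl_cursor;
FETCH NEXT FROM tbl_cursor INTO @SchemaName, @TableName;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Query    = N'SELECT * FROM ' + QUOTENAME(@SchemaName)
                  + N'.' + QUOTENAME(@TableName) + N';';
    SET @FilePath = N'D:\Export\' + @SchemaName + N'.' + @TableName + N'.csv';

    EXEC SQL#.DB_BulkExport @Query = @Query, @FilePath = @FilePath;

    FETCH NEXT FROM tbl_cursor INTO @SchemaName, @TableName;
END;

CLOSE tbl_cursor;
DEALLOCATE tbl_cursor;
```

Because new tables show up in sys.tables automatically and dropped tables disappear from it, a loop like this picks up schema changes without any maintenance, which is the "fully dynamic" behavior described above.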
This procedure -- DB_BulkExport -- is part of the SQL# library (which I am the creator of), though it is only available in the Full (i.e., paid-for) version, not in the Free version. I generally do not mention stuff here that is not available in the Free version, but as I said, I am not aware of any other existing means of automating this type of process. Of course, anyone is free to write this themselves, either as a SQLCLR proc as I have done, or as a stand-alone .NET app, or using PowerShell. It is more than a few lines of code in C#, and I suspect it is also not overly simplistic in PowerShell either (but I haven't tried and don't have enough experience with PowerShell to really know).
EDIT:
Regarding how I have used the above-mentioned stored proc in at least one instance: it was used to export, nightly, the oldest day of data from a large table for long-term archival; that table was itself a common, short-term archive of data from 20 source servers.
Ideally there would be a local drive -- one without pagefile, mdf, or ldf files on it -- that could be used for the extraction and compression. For the long-term archival process noted above, that server was a dedicated archive that just collected data for a few tables from the 20 source servers. Hence, 99% of the time, the only querying of the data was the SELECT * FROM ArchiveTable used to do the extract; the other 1% of the time it was research.
Best Answer
You would generally not find an application pool using the computer account to connect to SQL Server; at the very least, I would not want to allow that type of configuration.
You should be able to accomplish this by running the process as the local system account, NT AUTHORITY\SYSTEM. You can do this using psexec: just specify the -s parameter to execute the command as the system account. I think something like this should work: