I'm currently working in an environment that requires us to enable SQL Server Audit enterprise-wide. We generate .sqlaudit files because we have retention and access guidelines, and I'm finding that we need to compile trending information about the files generated, such as the number created per hour, the total volume of audit information, etc. Currently, I'm only aware of DMVs/DMFs that provide audit definition information (the list can be found here), but nothing related to the output of the SQL Server Audit process.
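For illustration, the definition-side catalog views expose configuration such as the target path and rollover settings, but nothing about the files already produced. A minimal sketch:

```sql
-- Shows where each file-based audit writes and how it's configured,
-- but says nothing about the .sqlaudit files already generated.
SELECT a.name,
       a.is_state_enabled,
       sfa.log_file_path,
       sfa.max_file_size,
       sfa.max_rollover_files
FROM sys.server_audits AS a
JOIN sys.server_file_audits AS sfa
    ON sfa.audit_id = a.audit_id;
```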
I'm currently running PowerShell scripts to collect this information from the metadata of the .sqlaudit files, but I'd prefer not to rely on an external process if possible. I've looked into extended stored procedures such as xp_dirtree, but that only returns the number of files and not their sizes, so it doesn't provide as complete a picture as I need here. xp_cmdshell, while viable, is not allowed for obvious security reasons and because NIST compliance explicitly prohibits its use.
Are there any approaches within the database engine (i.e., things I can hit with T-SQL) that would get me this information, such as an Extended Event or a hidden DMV I've not yet come across? I feel like I've done some extensive digging, but I'm never surprised when I've missed something.
Best Answer
I may as well post my terribad approach in case others are interested. Basically, this is a two-step process. The first step is to create a table, in a database of your choosing, that will hold a log of the audit file metadata you wish to collect:
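A minimal sketch of such a log table (table and column names here are illustrative, not necessarily the original schema):

```sql
-- Hypothetical log table for audit file metadata; names are illustrative.
CREATE TABLE dbo.AuditFileLog
(
    AuditFileLogID     int IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    FileName           nvarchar(260)      NOT NULL,
    CreatedDateUTC     datetime2(0)       NOT NULL,
    CompressedSizeKB   bigint             NOT NULL,
    UncompressedSizeKB bigint             NOT NULL,
    LoggedDateUTC      datetime2(0)       NOT NULL
        CONSTRAINT DF_AuditFileLog_LoggedDateUTC DEFAULT (SYSUTCDATETIME())
);
```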
I'm keeping it simple by only tracking the name, creation date, and size (both compressed and uncompressed). Why compressed? Well, we have systems in our environment that generate terabytes' worth of audit activity per day, so we have to compress the .sqlaudit files immediately after they are generated; otherwise we run into a Tribble situation. Because of this, the PowerShell routine below pulls some metadata out of the .zip files directly.
The second part of this process is to set up a SQL Agent job with a PowerShell step that fires off the following:
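A rough sketch of that step (the instance, database, table, and query details here are illustrative assumptions, not the original script):

```powershell
# Hypothetical sketch; server, database, and object names are illustrative.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$inst = 'localhost'

# 1. Identify where the audit files are written.
$auditPath = (Invoke-Sqlcmd -ServerInstance $inst -Query "
    SELECT sfa.log_file_path FROM sys.server_file_audits AS sfa;").log_file_path

# 2. Find the timestamp of the last audit file already tracked.
$lastTracked = (Invoke-Sqlcmd -ServerInstance $inst -Database 'AuditLog' -Query "
    SELECT ISNULL(MAX(CreatedDateUTC), '19000101') AS LastDate
    FROM dbo.AuditFileLog;").LastDate

# 3. Log metadata for any .zip files created since then.
Get-ChildItem -Path $auditPath -Filter '*.zip' |
    Where-Object { $_.CreationTimeUtc -gt $lastTracked } |
    ForEach-Object {
        # Read the uncompressed size out of the zip's entry headers.
        $zip = [System.IO.Compression.ZipFile]::OpenRead($_.FullName)
        $uncompressed = ($zip.Entries | Measure-Object -Property Length -Sum).Sum
        $zip.Dispose()

        Invoke-Sqlcmd -ServerInstance $inst -Database 'AuditLog' -Query "
            INSERT dbo.AuditFileLog
                (FileName, CreatedDateUTC, CompressedSizeKB, UncompressedSizeKB)
            VALUES
                (N'$($_.Name)',
                 '$($_.CreationTimeUtc.ToString('yyyyMMdd HH:mm:ss'))',
                 $([math]::Ceiling($_.Length / 1KB)),
                 $([math]::Ceiling($uncompressed / 1KB)));"
    }
```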
What this routine does is identify the path the audit files are written to, as well as the timestamp of the last audit file tracked. Using that, it runs a Get-ChildItem against the identified directory, looking for all *.zip files and returning only those created after the last file we've already tracked.
ASSUMPTIONS:
- There is a single file-based server audit to track (the path lookup filters sys.server_file_audits with WHERE sfa.on_failure = 1).
- The .sqlaudit files are compressed into .zip files shortly after they are generated. If you do not compress your .sqlaudit files, you will run into a scenario where the metadata of the last .sqlaudit file is likely to change by the next run, because the database engine is still writing to it. In that event, I would suggest ignoring the last file returned in the array, which should avoid any duplicate/inconsistent data.
Finally, this is totally specific to my environment, but it does exactly what I need. If you have suggestions, updates, etc., please post another answer, as more ideas are always appreciated.