I've recently started in my role as a DBA, and my initial task has been to take ownership of, and redeploy, a database provided by a third party to help us conform to GDPR legislation.
As a consequence, I've not really been able to look much beyond the project itself. I'm now at the stage where my database is ready to go live, although there are a few caveats.
Some of my existing code creates large temporary tables (70 million+ rows), and I know my dev instance has grown over the course of this project.
My question, for this and subsequent projects working with large datasets: when I need temporary storage, should I use tempdb, or should I create staging tables within the database I'm migrating to and drop them afterwards?
The reason I ask is that my live tempdb is currently less than 1 GB in size, whereas dev's reached around 30 GB. If I continue with temporary tables, I would want to grow tempdb in advance to prevent waits on autogrowth.
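For reference, pre-growing tempdb ahead of the load is a one-off `ALTER DATABASE` per file. A minimal sketch, assuming the default logical file names (`tempdev`, `templog`); check `sys.master_files` on your instance for the actual names and current sizes:

```sql
-- Pre-size tempdb so the load doesn't stall on autogrowth events.
-- Sizes here are illustrative; match them to what dev actually needed.
ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, SIZE = 30GB, FILEGROWTH = 1GB);
ALTER DATABASE tempdb
    MODIFY FILE (NAME = templog, SIZE = 4GB, FILEGROWTH = 1GB);
```

Note that tempdb is recreated at its configured size on every instance restart, so once set this persists without rework.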
Best Answer
Based on the additional details provided, I would recommend using staging tables rather than temporary tables.
The benefits are, broadly: the data persists beyond a single session, so a failed load can be inspected and restarted rather than rebuilt; you stop inflating tempdb, which serves the whole instance; and the staging tables can be indexed and maintained like any other table.
Downsides provided by David Browne: staging data in the target database is fully logged under the Full recovery model, bloats backups, and is shipped to any Availability Group secondaries.
If you go this route, a common choice is to segregate the data into a separate staging database altogether, which can mitigate some of those negatives, since the staging database wouldn't necessarily have to be in an AG or use the Full recovery model.
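As a sketch of that setup, assuming placeholder names (`Staging`, `dbo.Stage_Customer`) and that the staging database sits outside the AG:

```sql
-- A dedicated staging database in SIMPLE recovery: loads are not
-- retained in the log chain, and none of this touches the AG or
-- the live database's backups.
CREATE DATABASE Staging;
ALTER DATABASE Staging SET RECOVERY SIMPLE;
GO

-- Hypothetical staging table; shape it to the actual feed.
CREATE TABLE Staging.dbo.Stage_Customer
(
    CustomerID   int           NOT NULL,
    EmailAddress nvarchar(256) NULL
);
GO

-- Between runs, TRUNCATE is minimally logged and cheap to repeat.
TRUNCATE TABLE Staging.dbo.Stage_Customer;
```

Load and transform in `Staging`, then insert the finished rows into the live database, so only the final write is fully logged there.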