SQL Server – the optimal set of database maintenance tasks for a high volume of transactions

sql-server, sql-server-2005

I have a database that has a high volume of data being inserted and deleted each week.

Every week, approximately one million rows of data are deleted and inserted. Each month, the base table into which the data is inserted grows by around a million rows.

I think this large volume of transactions is causing the log file to grow larger and become fragmented.

I am afraid the performance may get worse each month.

What is the best way to maintain and manage such a database?

If anyone can explain or provide a link that deals with such a scenario that would be great.

Best Answer

Provided the drive the file is on has enough contiguous free space, just recreate the file, or stop the DBMS and defragment it. After that it shouldn't be a problem.

You aren't using shrink, are you? Shrinking the log on a schedule just forces it to grow again under load, which re-fragments both the file on disk and the log internally.
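One way to see whether the log has grown in many small increments is to count its virtual log files (VLFs). This is a sketch only; the database and logical file names below are examples, and the target size is illustrative, not a recommendation:

```sql
-- One row per virtual log file (VLF) in the current database;
-- a count in the thousands suggests the log grew in many small increments.
DBCC LOGINFO;

-- If the VLF count is high, size the log to its working size in one step
-- so it is allocated in a few large VLFs rather than many tiny ones.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_log, SIZE = 8GB);
```

With regular log backups (assuming the full recovery model), a log sized this way should stabilise rather than keep growing week after week.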

If it were me, I'd be more worried about my indexes being a mess than about the transaction log files. Some sort of drop/create or rebuild of them would be something I'd look at.
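As a sketch of that, SQL Server 2005 and later expose index fragmentation through `sys.dm_db_index_physical_stats`, which you can use to target only the indexes that need work (the index and table names in the `ALTER INDEX` statements are placeholders):

```sql
-- List indexes in the current database with heavy logical fragmentation.
SELECT OBJECT_NAME(s.object_id)        AS table_name,
       i.name                          AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id
 AND i.index_id  = s.index_id
WHERE s.avg_fragmentation_in_percent > 30;

-- Rebuild a heavily fragmented index (names are examples):
ALTER INDEX IX_MyTable_MyColumn ON dbo.MyTable REBUILD;

-- For moderate fragmentation, a reorganize is lighter-weight and online:
ALTER INDEX IX_MyTable_MyColumn ON dbo.MyTable REORGANIZE;
```

Running a check like this on a weekly schedule, after the big delete/insert cycle, is one common way to keep performance from degrading month over month.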