SQL Server database file fragmentation

sql-server-2008

I have inherited a system where the previous DBA added 7 data files to the PRIMARY filegroup (8 MB initial size) and left the AUTOGROW option at 8 MB. What I have now is a set of eight files, each about 3–4 GB in size, that have been slowly growing over a two-year period. I'd like to remove the file fragmentation in the fastest way possible.
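
For reference, the current file layout and autogrow settings can be confirmed with a query like the following (a minimal sketch; run it in the database in question):

    -- List the data files with their sizes and autogrow increments.
    -- size and growth are stored in 8 KB pages.
    SELECT name,
           physical_name,
           size * 8 / 1024 AS size_mb,
           CASE WHEN is_percent_growth = 1
                THEN CAST(growth AS varchar(10)) + ' %'
                ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
           END AS autogrow
    FROM sys.database_files
    WHERE type_desc = 'ROWS';  -- data files only, not the log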

Here are the options that I came up with:

  1. Expand the 1st file in the PRIMARY filegroup by ~28 GB (7 files × 4 GB)
  2. Move the data off each of the successive files and mark them for deletion (see the T-SQL sketch after this list)
  3. Delete the other 7 files
  4. Detach the database
  5. Copy the detached database file to a different drive on the server
  6. Copy the detached database file back to the original drive
  7. Reattach the database
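
A minimal T-SQL sketch of steps 1–3, assuming a database named MyDatabase with logical file names MyDatabase_Data1 through MyDatabase_Data8 (all placeholders; check sys.database_files for the real names):

    -- Step 1: pre-grow the first file so it can absorb the others.
    ALTER DATABASE MyDatabase
        MODIFY FILE (NAME = MyDatabase_Data1, SIZE = 32GB);

    -- Step 2: migrate all pages out of a secondary file into the
    -- remaining files of the same filegroup and mark it as empty.
    DBCC SHRINKFILE (MyDatabase_Data2, EMPTYFILE);

    -- Step 3: an emptied file can then be dropped.
    ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_Data2;

    -- Repeat steps 2 and 3 for each remaining secondary file.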

or

  1. Create a new database 32 GB in size (8 × 4 GB; see the sketch after this list)
  2. Transfer all of the objects, tables, users and permissions to the new database using SSIS
  3. Drop the old database
  4. Rename the new database
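
A sketch of step 1 (the names, paths, and log sizing are placeholders):

    CREATE DATABASE MyDatabase_New
    ON PRIMARY (
        NAME = MyDatabase_New_Data,
        FILENAME = 'D:\SQLData\MyDatabase_New.mdf',
        SIZE = 32GB,
        FILEGROWTH = 1GB )
    LOG ON (
        NAME = MyDatabase_New_Log,
        FILENAME = 'D:\SQLLogs\MyDatabase_New.ldf',
        SIZE = 4GB,
        FILEGROWTH = 512MB );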

Is an OS-level "disk defrag" still not an option?

I do not know which option is best, or whether either will even work.

Also, this database is being mirrored and replicated, so the option that requires the least work to rebuild mirroring and replication is optimal.

Thanks for your help.

Best Answer

Now, after so many increases in space, the physical files are surely fragmented and would benefit from some defragmentation at the file level: growing from 8 MB to 3–4 GB in 8 MB increments means each file has been extended roughly 400–500 times.

I don't know your situation, but any server should be fast enough to serve a single 32 GB data file. I would see a benefit from multiple files only if they sat on different physical disks; otherwise I don't see the need for the hassle.

In this situation I'd go with option 1, though I'd grow the file beyond the current total size, leaving enough free space to last at least the next year, and set a larger autogrow increment.
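
For example (the sizes are illustrative assumptions; derive them from your own growth history):

    -- Pre-size the surviving file with headroom for the next year
    -- and set a fixed, larger autogrow increment as a safety net.
    ALTER DATABASE MyDatabase
        MODIFY FILE (NAME = MyDatabase_Data1, SIZE = 40GB, FILEGROWTH = 1GB);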

I didn't think OS-level defrag was available for live database files; I always assumed you had to detach the database before its files could be defragmented. But it seems the folks in the Serverfault answer were able to defrag a database file while the database was online.

Both of your options should work to make the database faster by consolidating everything into a single defragmented file. For option 2, though, I don't see the need to keep the same number of files: you can create the new database with a single data file and use SSIS or bcp to move everything into the new database's tables.
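
For tables on the same instance, a plain T-SQL copy is a simple alternative to SSIS or bcp. A minimal sketch, with hypothetical names (note that SELECT INTO copies only the data; indexes, constraints, and permissions have to be scripted separately):

    -- Copy one table's data into the new database.
    SELECT *
    INTO MyDatabase_New.dbo.MyTable
    FROM MyDatabase.dbo.MyTable;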

I would test first, but I think the first option would be faster.