SQL Server – will table partitioning and compression help a heavily used OLTP database?

compression, partitioning, sql-server-2008

We have a very big table, over 300 GB with more than half a billion rows. This table is used for auditing across all critical applications. A lot of inserts happen per second. Due to heavy blocking, we can't purge data while the database is online.
My boss says that partitioning and table compression will help with disk space and overall performance. He comes from an OLAP/DW background, but I highly doubt this will work for OLTP. How do I convince him this will not work? Any articles?

Best Answer

Sorry to disappoint, but your boss is right on target. Some of us BI folks know a little about database optimization. =) As with any major architectural change, you need to test and adapt appropriately to your unique environment, workload, servers, etc.

Due to heavy blocking, we can't purge data while the database is online.

Table partitioning is exceptionally helpful in many environments, especially those with large data sets. Not only can you avoid locking issues and improve query performance, but you can also reduce log impact when purging old records: switch the oldest partition out to a staging table, then TRUNCATE the staging table, which is a minimally logged operation.
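To make the purge pattern concrete, here is a minimal sketch of a sliding-window purge. All object names (`dbo.AuditLog`, `dbo.AuditLog_Staging`, `pfAuditByMonth`) and the boundary value are hypothetical placeholders, not from the original post:

```sql
-- Hypothetical sketch: purge the oldest partition of a partitioned audit
-- table by switching it into an empty staging table, then truncating it.
-- The staging table must have an identical schema and sit on the same
-- filegroup as the partition being switched out.

ALTER TABLE dbo.AuditLog
    SWITCH PARTITION 1 TO dbo.AuditLog_Staging;  -- metadata-only, near-instant

TRUNCATE TABLE dbo.AuditLog_Staging;             -- minimally logged purge

-- Merge the now-empty boundary so the partition function does not
-- accumulate empty partitions over time (boundary value is illustrative).
ALTER PARTITION FUNCTION pfAuditByMonth()
    MERGE RANGE ('2013-01-01');
```

Because the SWITCH is a metadata operation, the blocking you see when deleting half a billion rows in place largely disappears; the expensive log activity of a mass DELETE is replaced by a TRUNCATE on the staging table.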

See this free training by Kendra Little with Brent Ozar Unlimited for a detailed guide about table partitioning in SQL Server which, incidentally, is focused on OLTP environments.

As for data compression, see the SQL Server Customer Advisory Team (SQLCAT) white paper on data compression implementation. You will save disk space and reduce I/O at the cost of CPU usage. This trade-off is something you need to plan and test for before blindly implementing.
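A reasonable way to test that trade-off is to estimate the savings first, then enable compression per partition rather than table-wide. This is a sketch only; the object names are hypothetical placeholders:

```sql
-- Estimate PAGE compression savings before changing anything.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'AuditLog',
    @index_id         = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE';

-- If the estimate looks worthwhile, compress cold (older) partitions only,
-- so the hot partition taking heavy inserts pays no extra CPU cost.
ALTER TABLE dbo.AuditLog
    REBUILD PARTITION = 1
    WITH (DATA_COMPRESSION = PAGE);
```

Combining the two features this way is common: compress the older, read-mostly partitions of the audit table while leaving the current partition uncompressed for fast inserts.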