SQL Server – Huge Heap Table and Table Compression on SQL Server 2016

compression, disk-space, fragmentation, heap, sql-server

My database is huge, and I recently noticed that a new table added a few months ago is the culprit.

Here is the table script.

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Entry_tracker](
    [S.Number] [int] IDENTITY(1,1) NOT NULL,
    [EntryId] [varchar](50) NOT NULL,
    [EventNumber] [varchar](18) NOT NULL,
    [Data] [varbinary](max) NOT NULL,
    [TrackDateTime] [datetime] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbo].[Entry_tracker] ADD  CONSTRAINT [DF_Entry_tracker_TrackDateTime]  DEFAULT (getdate()) FOR [TrackDateTime]
GO

I collected information on this table and learned that it has about 16 million rows and a total size of roughly 1.4 TB.
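For reference, row count and size figures like these can be pulled with sp_spaceused:

EXEC sp_spaceused N'dbo.Entry_tracker';
-- returns rows, reserved, data, index_size, and unused for the table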

Still, I think that the table size is huge for the number of records.

This table is not queried by the application; it just stores different versions of the same entries from another table.

I checked the fragmentation information, and it shows 0% average fragmentation with 93% average space used.
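Those numbers come from sys.dm_db_index_physical_stats; a query along these lines (SAMPLED mode, so the page-space column is populated) shows both figures:

SELECT index_type_desc,
       avg_fragmentation_in_percent,
       avg_page_space_used_in_percent,
       page_count
FROM sys.dm_db_index_physical_stats
         (DB_ID(), OBJECT_ID(N'dbo.Entry_tracker'), NULL, NULL, 'SAMPLED');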

Since I am using SQL Server 2016, I thought I could use table compression, so I tried sp_estimate_data_compression_savings to estimate the possible space savings.
However, the results show no savings:

size_with_current_compression_setting(KB) is equal to size_with_requested_compression_setting(KB)
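For reference, the estimate came from a call along these lines (PAGE compression shown here as an example; the other parameters are just the defaults for the whole table):

EXEC sp_estimate_data_compression_savings
     @schema_name      = N'dbo',
     @object_name      = N'Entry_tracker',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = N'PAGE';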

Can anyone help me find the issue with this table?

It is really important, as this is taking a lot of disk space.

Appreciate your help.

Best Answer

Still, I think that the table size is huge for the number of records.

Why? What makes you say this?

It just stores different versions of the same entries from another table.

So it holds versions, which could be large or small, especially given:

[Data] [varbinary](max) NOT NULL,

Have you checked DATALENGTH() on the rows to see whether some large values in that column are eating the space?
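If not, a sketch along these lines would show how much of that 1.4 TB sits in the varbinary(max) column (it scans the whole heap, so it won't be fast):

SELECT COUNT(*)                          AS row_count,
       AVG(DATALENGTH([Data]))           AS avg_bytes,      -- average blob size per row
       MAX(DATALENGTH([Data]))           AS max_bytes,      -- largest single blob
       SUM(DATALENGTH([Data])) / 1048576 AS total_blob_mb   -- total blob space in MB
FROM [dbo].[Entry_tracker];

Note, too, that ROW and PAGE compression do not compress LOB data stored off-row, which would explain the "no savings" estimate if most of the space lives in [Data].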

Based on the data it holds and the function it serves, the size seems reasonable to me. Versioning is rarely small unless you store only the changed bytes and use algorithms to track which bytes changed, how, and whether each version builds on a previous one.