I'm trying to better understand numeric types in SQL Server and have read that the decimal type always requires 17 bytes. However, the Microsoft Docs list a table indicating that the amount of space used depends on the decimal's precision. So I tried to test it using the datalength function.
create table tbl_TestDec (dec1 decimal(19,4), dec2 decimal(20,4), dec3 decimal(9,4));
insert into tbl_TestDec
select 1, 1, 1;
select datalength(dec1), datalength(dec2), datalength(dec3) from tbl_TestDec;
This outputs:
5 5 5
I was expecting either 9 13 5 or 17 17 17. I'm using SQL Server 2005. Are all decimals vardecimal, or am I misunderstanding the datalength function?
Best Answer
I think you will find quite a bit of interesting information about this topic in Reducing Database Size by Using Vardecimal Storage Format, but to answer your question directly: try inserting values with larger precision and your results will change.
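For example, assuming vardecimal storage is in effect on the table (so datalength reports the bytes each value actually occupies rather than the declared size), inserting wider values should produce larger lengths. This is a sketch; the exact byte counts depend on the values stored, so none are claimed here:

```sql
-- Insert values that actually use the declared precision
-- (these fit decimal(19,4), decimal(20,4) and decimal(9,4))
insert into tbl_TestDec
select 123456789012345.6789, 1234567890123456.7890, 12345.6789;

-- Compare per-row lengths: the row containing 1, 1, 1 should
-- report fewer bytes than the row of wide values
select datalength(dec1), datalength(dec2), datalength(dec3)
from tbl_TestDec;
```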
On another note:
You can enable vardecimal storage at the database and table level. Using the SSMS Object Explorer, you can look at your database options and table storage properties to verify whether vardecimal storage is enabled; you can also use sp_db_vardecimal_storage_format.
That said, Enterprise-only vardecimal storage was deprecated in SQL Server 2008 in favour of row compression, which is a superset of vardecimal's capabilities. Starting with SQL Server 2016 SP1, row and page compression are available in all editions and do not require a database configuration option to use.
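As a sketch of enabling this on SQL Server 2005 SP2 through 2008 R2 (the database name TestDb is hypothetical; the table name is the one from the question):

```sql
-- Enable vardecimal at the database level (no-op if already on)
exec sp_db_vardecimal_storage_format 'TestDb', 'ON';

-- Then opt the table in; datalength on its decimal columns
-- will afterwards reflect the bytes each value actually uses
exec sp_tableoption 'tbl_TestDec', 'vardecimal storage format', 1;
```

On SQL Server 2008 and later, ALTER TABLE tbl_TestDec REBUILD WITH (DATA_COMPRESSION = ROW) is the replacement, covering decimals plus other fixed-length types.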