I have a scenario where some inserts/updates are failing due to the maximum row size limitation of 8,060 bytes. This is happening on a table with many varchar(250) to varchar(500), numeric, and datetime columns. I tried to reproduce the same scenario in my test environment but have been unable to do so.
Do you have any idea how to reproduce it in a test environment? Any examples would be great.
My environment is SQL Server 2005 Enterprise Edition. My table has ~470 columns. I know it's a bad design, but it's not in my control.
I tried the following, but it doesn't reproduce the error:
create table test (col1 varchar(5000), col2 varchar(5000))
insert into test values (replicate('A', 5000), replicate('B', 5000))
Even though the record length is greater than 8,060 bytes, the insert succeeds.
Best Answer
SQL Server has a limit of 8,060 bytes per row. See "Bytes per row" in Maximum Capacity Specifications for SQL Server; the footnote for that line explains that rows containing varchar, nvarchar, varbinary, or sql_variant columns can exceed the limit, because such columns can be moved to row-overflow storage (see MSDN: Row-Overflow Data Exceeding 8 KB). When a variable-length value is pushed off-row, only a 24-byte pointer remains in the row itself.
So, when you insert two varchar values of 5,000 bytes each into one row, one or both of them are stored off-row and occupy only 24 bytes in-row instead of 5,000 + 2. But if you have 470 varchar columns in a table, you can fit only 8060 / 24 = 335 non-null varchar values in the same row. In fact, slightly fewer than 335, because other structures in the row also take some space.
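The pointer arithmetic above can be checked with a quick sketch (it ignores the row header and null bitmap, which is why the real ceiling is slightly lower):

```python
# Each off-row varchar value leaves a 24-byte pointer in the row, so the
# 8,060-byte row limit caps how many such pointers can fit.
ROW_LIMIT_BYTES = 8060
OFF_ROW_POINTER_BYTES = 24

# Maximum number of off-row pointers that fit in one row.
max_off_row_values = ROW_LIMIT_BYTES // OFF_ROW_POINTER_BYTES
print(max_off_row_values)  # 335

# One more pointer no longer fits: 336 * 24 = 8,064 bytes > 8,060.
print((max_off_row_values + 1) * OFF_ROW_POINTER_BYTES)  # 8064
```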
So, to reproduce it in your test environment, create a table with at least 336 varchar columns and insert 336 non-null values, each longer than 24 characters, into the same row.
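Writing out 336 column definitions by hand is tedious, so here is a small generator (a sketch; the table and column names are made up) that prints a T-SQL repro script you can paste into Management Studio:

```python
# Generates a T-SQL script intended to hit the 8,060-byte row-size error:
# 336 varchar columns, each populated with a value longer than 24 bytes, so
# that even with every value pushed off-row the 24-byte in-row pointers
# alone exceed the limit (336 * 24 = 8,064 bytes).
NUM_COLS = 336

cols = [f"col{i} varchar(500)" for i in range(1, NUM_COLS + 1)]
create_stmt = (
    "CREATE TABLE dbo.row_size_repro (\n    "
    + ",\n    ".join(cols)
    + "\n);"
)

# Each value is 30 'A' characters -- longer than the 24-byte pointer.
values = ", ".join(["REPLICATE('A', 30)"] * NUM_COLS)
insert_stmt = f"INSERT INTO dbo.row_size_repro VALUES ({values});"

print(create_stmt)
print(insert_stmt)
```

On SQL Server 2005 the generated INSERT should fail with the row-size error, since 336 off-row pointers alone need 8,064 bytes.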