PostgreSQL – JSON Field Size Limit Before or After Compression

json, postgresql

The size limit of a json field is 1GB (source: this StackOverflow answer):

json is the same as a text datatype but with JSON validation. The text datatype's maximum size is 1GB.
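To make the validation part concrete, a minimal sketch (not tied to any particular version): casting a string to text always succeeds, while casting it to json requires the input to parse as JSON.

select '{"a": 1}'::json;   -- ok: valid JSON
select 'not json'::text;   -- ok: text accepts any string
select 'not json'::json;   -- error: the input does not parse as JSON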

I ran some experiments inserting json values of varying sizes into a table in Postgres.
Taking pg_column_size of the stored values seems to indicate that Postgres compresses the json data.

My experiment:

create table json_size (n integer, j json);

For n taking each of the values 1, 10, 100, 1 000, 10 000, 100 000, and 1 000 000:

-- n below stands for each of the values listed above
with q as (select generate_series(1, n)::bigint as x)
insert into json_size (n, j)
select n, coalesce(json_agg(to_json(q.*)), '[]'::json) from q;

select n, pg_column_size(j) as size from json_size;
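As a usage note, one way to run the insert once per value is with a psql variable; this sketch assumes the statements are executed in psql, which interpolates :n before sending the query:

\set n 100000
with q as (select generate_series(1, :n)::bigint as x)
insert into json_size (n, j)
select :n, coalesce(json_agg(to_json(q.*)), '[]'::json) from q;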

Results

n       size
1       10
10      92
100     996
1000    3224
10000   33488
100000  335115
1000000 3351365
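One way to make the compression visible directly, as a sketch against the same table: pg_column_size reports the stored (TOAST-compressed, where applicable) size of each value, while octet_length of the text cast gives the uncompressed byte length.

select n,
       pg_column_size(j)     as stored_bytes,
       octet_length(j::text) as raw_bytes
from json_size
order by n;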

Is the size limit of 1GB before or after this compression is done?

Best Answer

The limit of 1GB on varlena data types like json is before compression.
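So compression does not raise the ceiling on what a single field can hold. If you want to confirm that a stored value was in fact compressed, PostgreSQL 14 and later provide pg_column_compression, which returns the TOAST compression method (pglz or lz4) or NULL for uncompressed values; a sketch against the table above:

select n,
       pg_column_compression(j) as compression,
       pg_column_size(j)        as stored_bytes
from json_size
order by n;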

Note that you will need a lot of RAM to read and write such large values. Storing values of that size in a database is usually a sign of a problem and of bad design.