I am testing the numeric type in Postgres. Based on the precision and scale, I declared the column as numeric(1000,900), which should give me 900 digits after the decimal point. But when I test it by repeatedly halving 1 and inserting the results, it stops after the 1023rd halving and throws this error:
ERROR: value out of range: overflow
SQL state: 22003
I use this statement to insert:
insert into numeric_test
select ((1 / power(2, i))::numeric) + 10000000,
       pg_column_size(((1 / power(2, i))::numeric) + 10000000)
from generate_series(1, 1024) as i;
And that is my table:
create table numeric_test (
pos numeric(1000,900),
sz integer
);
alter table numeric_test add column index_x integer;
Is there a limit to how big a numeric value can get? The size at the 1023rd iteration is 127 bytes, and the 1024th iteration throws the overflow error.
Best Answer
The numeric(1000,900) column has nothing to do with it; the select alone produces the overflow error.

Calling power() with integer arguments will default to the floating-point version of the function. power(2, 1024) is causing the overflow, as the maximum base-2 double-precision exponent is 1023.

There is also a version of power() which takes numeric arguments, and will handle 2^1024 without overflowing. If you cast your inputs to numeric, it will call this version instead:
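A sketch of the corrected statement, keeping the question's table and +10000000 offset but casting the base to numeric so PostgreSQL resolves to the numeric version of power():

```sql
-- Casting 2 to numeric makes power() use arbitrary-precision
-- arithmetic rather than double precision, which overflows
-- once the exponent reaches 1024.
insert into numeric_test (pos, sz)
select (1 / power(2::numeric, i)) + 10000000,
       pg_column_size((1 / power(2::numeric, i)) + 10000000)
from generate_series(1, 1024) as i;
```

With the numeric overload, all 1024 rows insert without error; the integer argument i is implicitly cast to numeric as well.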