I'm new to Postgres, running v9.6.3 on a Linux system.
From the docs and the book PostgreSQL: Up and Running, I learned that the timestamptz type subtracts the offset from a timestamp on insertion and stores only UTC on disk. The time zone information is 'lost', so to speak.
Now I'm wondering how the psql client can display the timezone correctly despite that:
The following row was created by something like:
insert into table values ('1999-12-31T00:00:00+0100'::timestamptz, ...);
and is displayed as
select time from table;
          time          | field1 | field2
------------------------+--------+--------
 1999-12-31 00:00:00+01 |      1 |      1
...
and
select time AT TIME ZONE 'Europe/Berlin' from table;
time
---------------------
1999-12-31 00:00:00
...
You can see in the first snippet that Postgres seems to know the correct time zone from the insertion ('Europe/Berlin', which is UTC+01 in winter). But how can that be?
I would have expected it to display UTC time.
Best Answer
I think it's a bit simpler than I assumed. The first query returns a timestamp with time zone, which psql renders in ISO 8601 with a UTC offset taken from the session's TimeZone setting. The second query, because it uses AT TIME ZONE, returns a timestamp without time zone expressed in the specified zone. The two values get stringified differently because the types differ, and the time zone portion has been lost in the second. Using IS OF verifies this.
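A quick way to confirm the type difference, assuming a table t with a timestamptz column ts (names hypothetical; pg_typeof() is an alternative to the IS OF predicate):

```sql
-- pg_typeof() reports the runtime type of each expression
SELECT pg_typeof(ts)                              AS plain,
       pg_typeof(ts AT TIME ZONE 'Europe/Berlin') AS shifted
FROM t;
-- plain:   timestamp with time zone
-- shifted: timestamp without time zone
```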
Thanks goes out to @johto on irc.freenode.net/#postgresql.
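For completeness, a sketch of how the printed offset follows the session's TimeZone setting rather than anything stored with the value (the table name t is hypothetical):

```sql
CREATE TABLE t (ts timestamptz);
INSERT INTO t VALUES ('1999-12-31T00:00:00+0100');

-- Same stored UTC instant, rendered per the session setting:
SET timezone TO 'Europe/Berlin';
SELECT ts FROM t;  -- 1999-12-31 00:00:00+01

SET timezone TO 'UTC';
SELECT ts FROM t;  -- 1999-12-30 23:00:00+00
```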