Try here for a basic intro to PostgreSQL and JSON.
Also, the PostgreSQL documentation is pretty good, so try it here. Check out the pretty_bool option.
Your original question was "Is there a way to export postgres table data as JSON". You wanted it in this format:
{'id':1,'name':'David'}
{'id':2,'name':'James'}
...
I didn't have a running instance of PostgreSQL, so I downloaded, compiled and installed 9.4. To answer this, I first CREATEd a table (fred):
CREATE TABLE fred (mary INT, jimmy INT, paulie VARCHAR(20));
INSERT INTO fred VALUES (2, 43, 'asfasfasfd' );
INSERT INTO fred VALUES (3, 435, 'ererere' );
INSERT INTO fred VALUES (6, 43343, 'eresdfssfsfasfae');
Then, to check:
test=# select * from fred;
 mary | jimmy |      paulie
------+-------+------------------
    2 |    43 | asfasfasfd
    3 |   435 | ererere
    6 | 43343 | eresdfssfsfasfae
(3 rows)
Then I issued this command (note that COPY ... TO 'file' writes a file on the server, so it requires appropriate privileges):
test=# COPY (SELECT ROW_TO_JSON(t)
test(# FROM (SELECT * FROM fred) t)
test-# TO '/paulstuff/sware/db/postgres/inst/myfile';
COPY 3
test=#
I then quit psql and listed the file myfile.
test=# \q
[pol@polhost inst]$ more myfile
{"mary":2,"jimmy":43,"paulie":"asfasfasfd"}
{"mary":3,"jimmy":435,"paulie":"ererere"}
{"mary":6,"jimmy":43343,"paulie":"eresdfssfsfasfae"}
[pol@polhost inst]$
(you can experiment with the output from
COPY (SELECT ROW_TO_JSON(t, TRUE) -- <-- note the addition of "TRUE" here, which pretty-prints the result!
at your leisure).
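Note that this output is one JSON object per line (sometimes called newline-delimited JSON) rather than a single JSON document. Each individual line is still valid JSON, though, and can be cast back, as this illustrative check (not part of the session above) shows:

```sql
-- Any single line of myfile is itself a valid JSON value:
SELECT '{"mary":2,"jimmy":43,"paulie":"asfasfasfd"}'::json ->> 'paulie';
-- returns: asfasfasfd
```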
@offby1 pointed out that the output (while matching the format the OP asked for) is not valid JSON, since a file of newline-separated objects is not a single JSON value. @EvanCarroll pointed out that \o is also a way of outputting to a file, so I combined the solutions to these two niggles in this statement (with help from here):
test=# \o out.json
test=# SELECT array_to_json(array_agg(fred), FALSE) AS ok_json FROM fred;
-- "TRUE" here would pretty-print the output, producing plus ("+") continuation
-- signs in psql's display. "FALSE" is the default anyway.
test=# \o
gives:
[pol@polhost inst]$ more out.json
ok_json
----------------------------------------------------------------------------------------------------------------------------------------------
[{"mary":2,"jimmy":43,"paulie":"asfasfasfd"},{"mary":3,"jimmy":435,"paulie":"ererere"},{"mary":6,"jimmy":43343,"paulie":"eresdfssfsfasfae"}]
(1 row)
[pol@polhost inst]$
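In contrast to the COPY output earlier, out.json now holds a single JSON array, so the whole value can be handled as one JSON datum. An abbreviated, illustrative check (not part of the original session):

```sql
-- A single array value can be inspected as one JSON datum:
SELECT json_array_length('[{"mary":2},{"mary":3},{"mary":6}]'::json);
-- returns: 3
```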
FINALLY, there is the backslash (\) problem alluded to by @AdamGent in his post. This was a bit tricky, but it is possible without resorting to post-query processing. Voilà:
INSERT INTO fred VALUES (35, 5, 'wrew\sdfsd');
INSERT INTO fred VALUES (3, 44545, '\sdfs\\\sfs\\gf');
And using REGEXP_REPLACE thus (note the cast ::TEXT) removes the excess backslashes:
test=# \o slash.json
test=# SELECT REGEXP_REPLACE(ROW_TO_JSON(t)::TEXT, '\\\\', '\\', 'g')
test-# FROM (SELECT * FROM fred) AS t; -- a CTE instead of this subquery can also help legibility
test=# \o
test=# \q
gives:
[pol@polhost inst]$ more slash.json
regexp_replace
------------------------------------------------------
{"mary":2,"jimmy":43,"paulie":"asfasfasfd"}
{"mary":3,"jimmy":435,"paulie":"ererere"}
{"mary":6,"jimmy":43343,"paulie":"eresdfssfsfasfae"}
{"mary":35,"jimmy":5,"paulie":"wrew\sdfsd"}
{"mary":3,"jimmy":44545,"paulie":"\sdfs\\\sfs\\gf"}
(5 rows)
[pol@polhost inst]$
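One caveat worth adding here (my note, not from the original answer): once the doubled backslashes are stripped, the lines are no longer strictly valid JSON, because \s is not a legal JSON escape sequence. PostgreSQL itself will reject such a string if you try to cast it back:

```sql
-- The properly escaped form casts back fine:
SELECT '{"paulie":"wrew\\sdfsd"}'::json;

-- The stripped form is rejected with an "invalid escape sequence" style error:
-- SELECT '{"paulie":"wrew\sdfsd"}'::json;
```

So this trick is only appropriate when a downstream consumer specifically wants literal, unescaped backslashes.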
(P.S. As for @Zoltán's comment: this may be a version thing; I was unable to reproduce it!)
There is no way around specifying all JSON attributes in the select list, whether implicitly through a "dummy type" or explicitly, e.g. using something like this:
select json_column ->> 'text1' as text1,
json_column ->> 'text2' as text2,
...
from the_table;
What you can do is make this simpler by automatically creating a view with all attributes, based on the distinct attributes found in the JSON documents.
The following code will re-create a view with all distinct keys from the JSON column:
do
$$
declare
  l_keys text;
begin
  drop view if exists v_json_view cascade;

  -- build one "json_column ->> 'key' as key" expression per distinct key;
  -- %L and %I safely quote the literal and the identifier respectively
  select string_agg(distinct format('json_column ->> %L as %I', jkey, jkey), ', ')
    into l_keys
  from the_table, json_object_keys(json_column) as t(jkey);

  execute 'create view v_json_view as select '||l_keys||' from the_table';
end;
$$
;
You will need to re-run the above every time the set of keys across all JSON documents changes. Theoretically this could be done in a trigger, but that's probably not a good idea if you run many updates on that table.
If the total number of JSON keys is somewhat "stable", you could schedule a cron job to re-create the view at regular intervals.
You are also limited by the maximum number of columns in a table or view. If you have more (distinct) keys than approx. 1600 (maybe fewer), the above will fail.
Best Answer
You can use the json_build_object function, which allows you to compute keys and values. Here is a sample fiddle showing the result.
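For instance (an illustrative sketch against the fred table from the first answer; the original fiddle is not reproduced here), both keys and values can be arbitrary expressions:

```sql
SELECT json_build_object(
         'id',   mary,            -- value taken straight from a column
         'name', upper(paulie),   -- value computed from an expression
         'big',  jimmy > 1000     -- boolean expression as a value
       ) AS obj
FROM fred;
```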