Consider a Postgres table containing a range column:
CREATE TABLE my_table (t_range tsrange, field1 text);
If I want to manually insert a row via psql, I can run:
INSERT INTO my_table (t_range, field1) VALUES ('[2014-06-02 07:00:00,infinity)'::tsrange, 'sometext');
Instead of inserting rows manually, I want to perform a bulk insert from a CSV file. I created a CSV file called my.csv
using a similar format (using pipe as the delimiter to avoid having to escape the comma in the tsrange
variable, and removing the explicit quote characters) with some content:
t_range|field1
'[2014-06-02 07:00:00,infinity)'::tsrange|'sometext1'
'[2014-06-02 07:00:00,infinity)'::tsrange|'sometext2'
Then from psql I ran:
\copy my_table FROM 'my.csv' DELIMITER '|' QUOTE '''' CSV HEADER;
but this failed with the error
ERROR: malformed range literal: "[2014-06-02 07:00:00,infinity)::tsrange"
DETAIL: Junk after right parenthesis or bracket.
CONTEXT: COPY my_table, line 2, column t_range: "[2014-06-02 07:00:00,infinity)::tsrange"
How should I format a CSV file with range values for use with \copy?
Best Answer
Removing the explicit cast to tsrange worked. The new CSV file looks like:
t_range|field1
'[2014-06-02 07:00:00,infinity)'|'sometext1'
'[2014-06-02 07:00:00,infinity)'|'sometext2'
It seems to me that \copy is fine with automatically attempting to cast values to the appropriate data types, but the explicit cast causes it to choke.
One can also simplify the CSV file a bit in this case by removing the quotes:
t_range|field1
[2014-06-02 07:00:00,infinity)|sometext1
[2014-06-02 07:00:00,infinity)|sometext2
and modifying the import command accordingly (dropping the QUOTE option):
\copy my_table FROM 'my.csv' DELIMITER '|' CSV HEADER;
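If the CSV is being generated programmatically rather than by hand, the working format can be emitted directly. A minimal sketch in Python (the filename and sample rows are illustrative, mirroring the examples above):

```python
import csv

# Sample rows: (tsrange literal, text field). Each range literal is the
# plain Postgres input form -- no ::tsrange cast and no surrounding quotes.
rows = [
    ("[2014-06-02 07:00:00,infinity)", "sometext1"),
    ("[2014-06-02 07:00:00,infinity)", "sometext2"),
]

with open("my.csv", "w", newline="") as f:
    # A pipe delimiter keeps the comma inside the range literal unambiguous,
    # so the csv module does not need to quote the field.
    writer = csv.writer(f, delimiter="|")
    writer.writerow(["t_range", "field1"])
    writer.writerows(rows)
```

The resulting file can then be loaded with the simplified \copy command, since \copy handles the cast from the text literal to tsrange itself.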