PostgreSQL – How to pass a table type with an array field to a function in PostgreSQL

array, datatypes, functions, postgresql, postgresql-9.1

I have a table called book:

CREATE TABLE book
(
  id smallint NOT NULL DEFAULT 0,
  bname text,
  btype text,
  bprices numeric(11,2)[],
  CONSTRAINT key PRIMARY KEY (id)
);

and a function save_book:

CREATE OR REPLACE FUNCTION save_book(thebook book)
  RETURNS text AS
$BODY$
DECLARE
    myoutput text := 'Nothing has occurred';
BEGIN
    UPDATE book SET
        bname   = thebook.bname,
        btype   = thebook.btype,
        bprices = thebook.bprices
    WHERE id = thebook.id;

    IF FOUND THEN
        myoutput := 'Record with PK[' || thebook.id || '] successfully updated';
        RETURN myoutput;
    END IF;

    INSERT INTO book VALUES (thebook.id, thebook.bname, thebook.btype,
                             thebook.bprices);
    myoutput := 'Record successfully added';

    RETURN myoutput;
END;
$BODY$
  LANGUAGE plpgsql VOLATILE
  COST 100;

Now when I call the function

SELECT save_book('(179,the art of war,fiction,{190,220})'::book);

I get the error:

ERROR: malformed array literal: "{190"
SQL state: 22P02
Character: 18

I don't understand, because I don't see any error in the format of the array. Any help?

Best Answer

This sort of thing gets complicated. I am working on some related projects right now. The basic issue is that PostgreSQL's tuple representation uses double quotes internally to delimit individual field values; without them, the comma inside {190,220} is read as a field separator, so the parser sees the truncated array literal {190 — which is exactly the error you got. Quoting the array field fixes it:

SELECT save_book('(179,the art of war,fiction,"{190,220}")'::book);

should work. In essence, a neat trick is to think of a tuple as a CSV line enclosed in parentheses (and an array as one enclosed in braces). The big issue is that you have to deal with escaping: quotes must be doubled at every level of nesting as needed. So the following is exactly equivalent:

SELECT save_book('(179,"the art of war","fiction","{""190"",""220""}")'::book);

The second approach is to use a row constructor:

SELECT save_book(row(179,'the art of war','fiction', array[190,220])::book);

The first solution has the obvious advantage that you can reuse existing programming frameworks for CSV generation and escaping. The second is cleanest in pure SQL. The two approaches can be mixed and matched.
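If you are ever unsure what the correct literal form looks like, you can ask PostgreSQL to render it for you by casting a row-constructor value to text (a quick sketch, assuming the book table from the question; the exact quoting in the output depends on the field contents):

```sql
-- Let PostgreSQL print the canonical tuple literal for a book value
SELECT row(179, 'the art of war', 'fiction', array[190, 220])::book::text;
-- Fields containing commas, braces or spaces come back double-quoted,
-- e.g. the bprices array is rendered like "{190.00,220.00}"
```

The text you get back is itself valid input for a ::book cast, so it doubles as a template for building literals in client code.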