Thread: pl/python composite type array as input parameter

pl/python composite type array as input parameter

From
Filipe Pina
Date:
Hello,

I'm building an app in Django and I want to have some functions directly in postgres. I'd prefer to use pl/python for the functions as it'd look better in Django migration files (python code within python code, instead of using PLPGSQL).

But one of the functions I need to create needs to accept an array of records.

The example of what I'm trying to do:

CREATE TABLE employee (
  name text,
  salary integer,
  age integer
);

CREATE OR REPLACE FUNCTION testp(e employee)
  RETURNS integer
AS $$
  plpy.notice('type', e.__class__)
$$ LANGUAGE plpythonu;

select testp(
        ('asd',10,10)::employee
        );

CREATE OR REPLACE FUNCTION testp2(es employee[])
  RETURNS integer
AS $$
  for e in es:
    plpy.notice('here', e.__class__)
$$ LANGUAGE plpythonu;

select testp2(
        ARRAY[
        ('asd',10,10)::employee
        ]::employee[]
        );

Running this .sql yields:

CREATE TABLE
CREATE FUNCTION
psql:nfun.sql:15: NOTICE: ('type', <type 'dict'>)
CONTEXT: PL/Python function "testp"
 testp 
-------
      
(1 row)

CREATE FUNCTION
psql:nfun.sql:28: NOTICE: ('here', <type 'str'>)
CONTEXT: PL/Python function "testp2"
 testp2 
--------
       
(1 row)

so testp(), which receives a single "employee", gets the correct cast to "employee" and yields a dict as expected.

testp2(), which receives an array of "employee", yields a str for each element of the array...

Should I declare the function some other way?
Is there any way I can force the cast on each element? I found some hits on Google for *_fromDatum() internal functions, but I didn't understand how/if I can call them explicitly.

Thanks,
Filipe

Re: pl/python composite type array as input parameter

From
Peter Eisentraut
Date:
On 5/18/15 10:52 AM, Filipe Pina wrote:
> But one of the functions I need to create needs to accept an array of
> records.

PL/Python doesn't support that.  Some more code needs to be written to
support that.  You did everything correctly.  I don't know of a good
workaround.



Re: pl/python composite type array as input parameter

From
Filipe Pina
Date:
Thanks for the reply anyway; it's a pity though, it'd be useful.

Another bump I've found along the pl/python road: inserting a ROWTYPE into a table.
Maybe you have some hint on that? :)

So, in PLPGSQL I can:

DECLARE
  my_var my_table;
BEGIN
  my_var.col1 := 'asd';
  INSERT INTO my_table VALUES(my_table.*);
END;

How would I do something like that in pl/python?

First, how to declare a ROW-TYPE variable, given that they're all Python mappings?

Is my_var = { 'col1': 'asd' } enough? It'd miss all the other columns...

Second, how to insert it?

plpy.prepare and .execute say they don't support composite types, so I cannot simply pass

pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])

Any workarounds for this? (meaning I wouldn't have to specify any columns in the insert statement)

Thanks

On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
On 5/18/15 10:52 AM, Filipe Pina wrote:
But one of the functions I need to create needs to accept an array of records.
PL/Python doesn't support that. Some more code needs to be written to support that. You did everything correctly. I don't know of a good workaround.

Re: pl/python composite type array as input parameter

From
Adrian Klaver
Date:
On 06/01/2015 07:42 AM, Filipe Pina wrote:
> Thanks for the reply anyway, it's a pity though, it'd be useful..
>
> Another bump I've found along the pl/python road: insert ROWTYPE in table..
> Maybe you have some hint on that? :)
>
> So, in PLPGSQL I can:
>
> DECLARE
>    my_var my_table;
> BEGIN
>    my_var.col1 := 'asd';
>    INSERT INTO my_table VALUES(my_table.*);
> END;
>
> How would I do something like that in pl/python?
>
> First, how to declare a ROW-TYPE variable, as they're all python mappings?
>
> my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...
>
> Second, how to insert it?
>
> plpy.prepare and .execute say they don't support composite types, so I
> cannot simply pass
>
> pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])
>
> Any workarounds for this? (meaning I wouldn't have to specify any
> columns in the insert statement)

http://www.postgresql.org/docs/9.4/interactive/sql-insert.html

pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')

>
> Thanks
>
> On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
>> On 5/18/15 10:52 AM, Filipe Pina wrote:
>>
>>     But one of the functions I need to create needs to accept an array
>>     of records.
>>
>> PL/Python doesn't support that. Some more code needs to be written to
>> support that. You did everything correctly. I don't know of a good
>> workaround.


--
Adrian Klaver
adrian.klaver@aklaver.com


Re: pl/python composite type array as input parameter

From
Rémi Cura
Date:
Hey,
the only straightforward workaround I know (which is pretty bad)
is to cast your record down to text.
Then you have an array of text, which is manageable.

For this you can either 'flatten' your record into a single text value,
or cast each part of your record to text and emulate an array of arrays (you need to know the length of the inner array in your function, though).

I used this to emulate a 2D vector (N*3) for numpy.

You'll need a custom aggregate, like this one.
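Rémi's cast-to-text workaround can be sketched outside the database: in PL/Python, an `employee` cast to `text` arrives as the composite's text form, e.g. `(asd,10,10)`, and a small parser can turn it back into a mapping. A minimal sketch in plain Python (the field names and the naive comma split are assumptions; Postgres quotes values that contain commas, which this does not handle):

```python
# Parse the text form of an employee composite, e.g. '(asd,10,10)',
# back into a dict. Assumes no quoted commas inside values.
FIELDS = ('name', 'salary', 'age')

def parse_employee(text):
    # Strip the surrounding parentheses and split on commas.
    parts = text.strip()[1:-1].split(',')
    rec = dict(zip(FIELDS, parts))
    # The composite text form yields strings; cast the numeric fields.
    rec['salary'] = int(rec['salary'])
    rec['age'] = int(rec['age'])
    return rec

# Inside testp2, an employee[] cast to text[] would show up as a
# list of such strings.
employees = ['(asd,10,10)', '(qwe,20,30)']
rows = [parse_employee(e) for e in employees]
```

The same loop would live inside the plpythonu function body, with `employees` being the `text[]` argument.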

The other, saner solution is to pass information identifying the rows you want, and retrieve them directly within Python.
For instance, here you would pass an array of ids of the employees you want to work with.
This is saner, but as a design I don't really like having specific SQL code inside a generic Python function.

I agree it is cumbersome, and I also badly miss more powerful input for Python functions (after all, pl/python can already return composite types, which is awesome)
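Another route, not raised in the thread and offered here only as a suggestion: serialize the rows to JSON on the SQL side (9.4 already had `array_to_json`, so something like `array_to_json(ARRAY[...]::employee[])` should produce this shape), pass a single `json`-typed argument, and decode it with the stdlib inside the function. A plain-Python sketch with an assumed payload:

```python
import json

# Assumed example of what array_to_json() would hand to the function
# as the text of a json argument.
payload = '[{"name":"asd","salary":10,"age":10},{"name":"qwe","salary":20,"age":30}]'

# json.loads gives a list of dicts with the field names preserved.
employees = json.loads(payload)
total_salary = sum(e['salary'] for e in employees)
```

This keeps one argument per logical parameter and survives schema changes, at the cost of a cast on the caller's side.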


Cheers,
Rémi-C

2015-06-02 2:44 GMT+02:00 Adrian Klaver <adrian.klaver@aklaver.com>:
On 06/01/2015 07:42 AM, Filipe Pina wrote:
Thanks for the reply anyway, it's a pity though, it'd be useful..

Another bump I've found along the pl/python road: insert ROWTYPE in table..
Maybe you have some hint on that? :)

So, in PLPGSQL I can:

DECLARE
   my_var my_table;
BEGIN
   my_var.col1 := 'asd';
   INSERT INTO my_table VALUES(my_table.*);
END;

How would I do something like that in pl/python?

First, how to declare a ROW-TYPE variable, as they're all python mappings?

my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...

Second, how to insert it?

plpy.prepare and .execute say they don't support composite types, so I
cannot simply pass

pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])

Any workarounds for this? (meaning I wouldn't have to specify any
columns in the insert statement)

http://www.postgresql.org/docs/9.4/interactive/sql-insert.html

pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')


Thanks

On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
On 5/18/15 10:52 AM, Filipe Pina wrote:

    But one of the functions I need to create needs to accept an array
    of records.

PL/Python doesn't support that. Some more code needs to be written to
support that. You did everything correctly. I don't know of a good
workaround.


--
Adrian Klaver
adrian.klaver@aklaver.com



--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general

Re: pl/python composite type array as input parameter

From
Rémi Cura
Date:
Oops,

I forgot another strategy I used:
instead of having
testp2(es employee[])

you can use
testp2( names text[], salaries integer[], ages integer[])

This might be the solution requiring the least work, but it is absolutely terrible practice:
it will be hard to change your record type later (schema evolution becomes difficult),
and having many columns will force you to create functions with many arguments,
which is often a bad idea.
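For what it's worth, inside the function body the parallel arrays can be zipped back into one mapping per record; a plain-Python sketch (field names assumed from the employee example earlier in the thread):

```python
def rows_from_arrays(names, salaries, ages):
    # Recombine the parallel arrays into one dict per employee.
    # Assumes all three arrays have the same length.
    return [
        {'name': n, 'salary': s, 'age': a}
        for n, s, a in zip(names, salaries, ages)
    ]

rows = rows_from_arrays(['asd', 'qwe'], [10, 20], [10, 30])
```

Within testp2, `names`, `salaries`, and `ages` would simply be the three array arguments.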

Cheers,
Rémi-C

2015-06-02 10:36 GMT+02:00 Rémi Cura <remi.cura@gmail.com>:
Hey,
the only straight workaround I know (which is pretty bad)
is to cast down your record to text.
Then you have an array of text, which is manageable.

For this you can either 'flatten' your record into a unique text,
or cast each part of your record to text, then emulate an array of array (you need to know the length of the inner array in your function though).

I used this to emulate a 2D numpy vector (N*3)(for numpy).

You'll need a custom aggregate, like this one.

The other more sane solution is to pass the information about the row you want to retrieve, and retrieve the row directly within the python.
For instance, here you would pass an array of id of the employee you want to work with.
This is saner, but as a design I don't really like to have specific SQL code into a generic python function.

I agree it is cumbersome, and I also badly miss more powerful input for python function (after all, plpython can already return composite types, which is awesome)


Cheers,
Rémi-C

2015-06-02 2:44 GMT+02:00 Adrian Klaver <adrian.klaver@aklaver.com>:
On 06/01/2015 07:42 AM, Filipe Pina wrote:
Thanks for the reply anyway, it's a pity though, it'd be useful..

Another bump I've found along the pl/python road: insert ROWTYPE in table..
Maybe you have some hint on that? :)

So, in PLPGSQL I can:

DECLARE
   my_var my_table;
BEGIN
   my_var.col1 := 'asd';
   INSERT INTO my_table VALUES(my_table.*);
END;

How would I do something like that in pl/python?

First, how to declare a ROW-TYPE variable, as they're all python mappings?

my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...

Second, how to insert it?

plpy.prepare and .execute say they don't support composite types, so I
cannot simply pass

pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])

Any workarounds for this? (meaning I wouldn't have to specify any
columns in the insert statement)

http://www.postgresql.org/docs/9.4/interactive/sql-insert.html

pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')


Thanks

On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
On 5/18/15 10:52 AM, Filipe Pina wrote:

    But one of the functions I need to create needs to accept an array
    of records.

PL/Python doesn't support that. Some more code needs to be written to
support that. You did everything correctly. I don't know of a good
workaround.


--
Adrian Klaver
adrian.klaver@aklaver.com





Re: pl/python composite type array as input parameter

From
Filipe Pina
Date:
Hi Adrian,

I had a typo in the email:

INSERT INTO my_table VALUES(my_table.*);

was actually

INSERT INTO my_table VALUES(my_var.*);

So I meant to insert the variable I had in memory (a dict representing a row), not the rows from the table.

> On 02/06/2015, at 01:44, Adrian Klaver <adrian.klaver@aklaver.com> wrote:
>
> On 06/01/2015 07:42 AM, Filipe Pina wrote:
>> Thanks for the reply anyway, it's a pity though, it'd be useful..
>>
>> Another bump I've found along the pl/python road: insert ROWTYPE in table..
>> Maybe you have some hint on that? :)
>>
>> So, in PLPGSQL I can:
>>
>> DECLARE
>>   my_var my_table;
>> BEGIN
>>   my_var.col1 := 'asd';
>>   INSERT INTO my_table VALUES(my_table.*);
>> END;
>>
>> How would I do something like that in pl/python?
>>
>> First, how to declare a ROW-TYPE variable, as they're all python mappings?
>>
>> my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...
>>
>> Second, how to insert it?
>>
>> plpy.prepare and .execute say they don't support composite types, so I
>> cannot simply pass
>>
>> pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])
>>
>> Any workarounds for this? (meaning I wouldn't have to specify any
>> columns in the insert statement)
>
> http://www.postgresql.org/docs/9.4/interactive/sql-insert.html
>
> pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')
>
>>
>> Thanks
>>
>>> On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
>>> On 5/18/15 10:52 AM, Filipe Pina wrote:
>>>
>>>    But one of the functions I need to create needs to accept an array
>>>    of records.
>>>
>>> PL/Python doesn't support that. Some more code needs to be written to
>>> support that. You did everything correctly. I don't know of a good
>>> workaround.
>
>
> --
> Adrian Klaver
> adrian.klaver@aklaver.com



Re: pl/python composite type array as input parameter

From
Filipe Pina
Date:
Thanks Rémi,

Indeed I needed something more generic and easier to maintain (regarding table schema evolution), so I ended up going back to PL/PGSQL (for that specific function) in the meantime.

On 02/06/2015, at 09:41, Rémi Cura <remi.cura@gmail.com> wrote:

OUps,

I forget another strategy I used :
instead of having
testp2(es employee[])

you can use
testp2( names text[], salaries integer[], ages integer[])

This might be the solution with the less work, but it is absolutely terrible practice,
because it will be hard to change you record type (evolution difficult)
, and having many columns will make you create function with many arguments,
which is often a bad idea.

Cheers,
Rémi-C

2015-06-02 10:36 GMT+02:00 Rémi Cura <remi.cura@gmail.com>:
Hey,
the only straight workaround I know (which is pretty bad)
is to cast down your record to text.
Then you have an array of text, which is manageable.

For this you can either 'flatten' your record into a unique text,
or cast each part of your record to text, then emulate an array of array (you need to know the length of the inner array in your function though).

I used this to emulate a 2D numpy vector (N*3)(for numpy).

You'll need a custom aggregate, like this one.

The other more sane solution is to pass the information about the row you want to retrieve, and retrieve the row directly within the python.
For instance, here you would pass an array of id of the employee you want to work with.
This is saner, but as a design I don't really like to have specific SQL code into a generic python function.

I agree it is cumbersome, and I also badly miss more powerful input for python function (after all, plpython can already return composite types, which is awesome)


Cheers,
Rémi-C

2015-06-02 2:44 GMT+02:00 Adrian Klaver <adrian.klaver@aklaver.com>:
On 06/01/2015 07:42 AM, Filipe Pina wrote:
Thanks for the reply anyway, it's a pity though, it'd be useful..

Another bump I've found along the pl/python road: insert ROWTYPE in table..
Maybe you have some hint on that? :)

So, in PLPGSQL I can:

DECLARE
   my_var my_table;
BEGIN
   my_var.col1 := 'asd';
   INSERT INTO my_table VALUES(my_table.*);
END;

How would I do something like that in pl/python?

First, how to declare a ROW-TYPE variable, as they're all python mappings?

my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...

Second, how to insert it?

plpy.prepare and .execute say they don't support composite types, so I
cannot simply pass

pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])

Any workarounds for this? (meaning I wouldn't have to specify any
columns in the insert statement)

http://www.postgresql.org/docs/9.4/interactive/sql-insert.html

pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')


Thanks

On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
On 5/18/15 10:52 AM, Filipe Pina wrote:

    But one of the functions I need to create needs to accept an array
    of records.

PL/Python doesn't support that. Some more code needs to be written to
support that. You did everything correctly. I don't know of a good
workaround.


--
Adrian Klaver
adrian.klaver@aklaver.com






Re: pl/python composite type array as input parameter

From
Adrian Klaver
Date:
On 06/02/2015 03:10 AM, Filipe Pina wrote:
> HI Adrian,
>
> I had a typo in the email:
>
> INSERT INTO my_table VALUES(my_table.*);
>
> was actually
>
> INSERT INTO my_table VALUES(my_var.*);

Aah, that is different:)

>
> So I meant to insert the variable I had in memory (dict representing a row), not the rows from the table..

So where is the variable getting its data?

Or can we see a simple example of what you are trying to do?

>
>> On 02/06/2015, at 01:44, Adrian Klaver <adrian.klaver@aklaver.com> wrote:
>>
>> On 06/01/2015 07:42 AM, Filipe Pina wrote:
>>> Thanks for the reply anyway, it's a pity though, it'd be useful..
>>>
>>> Another bump I've found along the pl/python road: insert ROWTYPE in table..
>>> Maybe you have some hint on that? :)
>>>
>>> So, in PLPGSQL I can:
>>>
>>> DECLARE
>>>    my_var my_table;
>>> BEGIN
>>>    my_var.col1 := 'asd';
>>>    INSERT INTO my_table VALUES(my_table.*);
>>> END;
>>>
>>> How would I do something like that in pl/python?
>>>
>>> First, how to declare a ROW-TYPE variable, as they're all python mappings?
>>>
>>> my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...
>>>
>>> Second, how to insert it?
>>>
>>> plpy.prepare and .execute say they don't support composite types, so I
>>> cannot simply pass
>>>
>>> pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])
>>>
>>> Any workarounds for this? (meaning I wouldn't have to specify any
>>> columns in the insert statement)
>>
>> http://www.postgresql.org/docs/9.4/interactive/sql-insert.html
>>
>> pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')
>>
>>>
>>> Thanks
>>>
>>> On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
>>>> On 5/18/15 10:52 AM, Filipe Pina wrote:
>>>>
>>>>     But one of the functions I need to create needs to accept an array
>>>>     of records.
>>>>
>>>> PL/Python doesn't support that. Some more code needs to be written to
>>>> support that. You did everything correctly. I don't know of a good
>>>> workaround.
>>
>>
>> --
>> Adrian Klaver
>> adrian.klaver@aklaver.com
>
>


--
Adrian Klaver
adrian.klaver@aklaver.com


Re: pl/python composite type array as input parameter

From
Filipe Pina
Date:
Basically, in a (maybe over-)simplified example:

CREATE OR REPLACE FUNCTION add_transaction(transaction core_transaction)
    RETURNS integer AS $$
DECLARE
    transaction2 core_transaction;
BEGIN
    transaction.field1 := 'lapse';
    transaction2.field2 := transaction.field2;
    transaction2.field1 := 'lapse2';
    INSERT INTO core_transaction VALUES(transaction.*);
    INSERT INTO core_transaction VALUES(transaction2.*);
    RETURN 1;
END
$$
LANGUAGE plpgsql;

So, I wanted to do the same in plpython…

CREATE OR REPLACE FUNCTION add_transaction(transaction core_transaction)
    RETURNS integer AS $$
    transaction['field1'] = 'lapse'
    transaction2 = { 'field1': 'lapse2', 'field2': transaction['field2'] }

    # not this, but something that would work without enumerating all columns/fields
    pl = plpy.execute('INSERT INTO core_transaction VALUES(transaction.*)')
    pl = plpy.execute('INSERT INTO core_transaction VALUES(transaction2.*)')
    return 1
$$
LANGUAGE plpythonu;



On 02/06/2015, at 15:51, Adrian Klaver <adrian.klaver@aklaver.com> wrote:

On 06/02/2015 03:10 AM, Filipe Pina wrote:
HI Adrian,

I had a typo in the email:

INSERT INTO my_table VALUES(my_table.*);

was actually

INSERT INTO my_table VALUES(my_var.*);

Aah, that is different:)


So I meant to insert the variable I had in memory (dict representing a row), not the rows from the table..

So where is the variable getting its data?

Or can we see a simple example of what you are trying to do?


On 02/06/2015, at 01:44, Adrian Klaver <adrian.klaver@aklaver.com> wrote:

On 06/01/2015 07:42 AM, Filipe Pina wrote:
Thanks for the reply anyway, it's a pity though, it'd be useful..

Another bump I've found along the pl/python road: insert ROWTYPE in table..
Maybe you have some hint on that? :)

So, in PLPGSQL I can:

DECLARE
  my_var my_table;
BEGIN
  my_var.col1 := 'asd';
  INSERT INTO my_table VALUES(my_table.*);
END;

How would I do something like that in pl/python?

First, how to declare a ROW-TYPE variable, as they're all python mappings?

my_var = { 'col1': 'asd' } enough? it'd would miss all the other columns...

Second, how to insert it?

plpy.prepare and .execute say they don't support composite types, so I
cannot simply pass

pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])

Any workarounds for this? (meaning I wouldn't have to specify any
columns in the insert statement)

http://www.postgresql.org/docs/9.4/interactive/sql-insert.html

pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')


Thanks

On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net> wrote:
On 5/18/15 10:52 AM, Filipe Pina wrote:

   But one of the functions I need to create needs to accept an array
   of records.

PL/Python doesn't support that. Some more code needs to be written to
support that. You did everything correctly. I don't know of a good
workaround.


--
Adrian Klaver
adrian.klaver@aklaver.com




-- 
Adrian Klaver
adrian.klaver@aklaver.com

Re: pl/python composite type array as input parameter

From
Adrian Klaver
Date:
On 06/02/2015 12:33 PM, Filipe Pina wrote:
> Basically, in an (maybe-over)simplified example:
>
> CREATE OR REPLACE FUNCTION add_transaction(transaction core_transaction)
>      RETURNS integer AS $$
> DECLARE
>      transaction2 core_transaction;
> BEGIN
>      transaction.field1 := 'lapse';
>      transaction2.field2 := transaction.field2;
>      transaction2.field1 := 'lapse2';
>      INSERT INTO core_transaction VALUES(transaction.*);
>      INSERT INTO core_transaction VALUES(transaction2.*);
>      RETURN 1;
> END
> $$
> LANGUAGE plpgsql;
>
> So, I wanted to do the same in plpython…
>
> CREATE OR REPLACE FUNCTION add_transaction(transaction core_transaction)
>      RETURNS integer AS $$
>      transaction['field1'] = 'lapse'
>      transaction2 = { 'field1': 'lapse2', 'field2': transaction['field2'] }
>
>      # not this, but something that would work without enumerating all
> columns/fields
>      pl = plpy.execute('INSERT INTO core_transaction VALUES(transaction.*)')
>      pl = plpy.execute('INSERT INTO core_transaction
> VALUES(transaction2.*)')
>      return 1
> $$
> LANGUAGE plpythonu;

Yea, I do not see a way of doing that. plpgsql is more tightly coupled
to Postgres than plpythonu, so you get a lot more shortcuts. This is why I
tend to use plpgsql even though I prefer programming in Python. That
being said, the feature set of plpythonu has been extended a good deal
over the last couple of Postgres versions, and I would expect that to
continue.
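One partial workaround, offered as a sketch rather than a tested plpython recipe: build the column and placeholder lists from the dict's own keys and hand the result to `plpy.prepare`. The helper below only constructs the SQL string, so it runs without a database; identifier quoting and the parameter type list are left to the caller:

```python
def build_insert(table, row):
    # Build "INSERT INTO t (a, b) VALUES ($1, $2)" plus the ordered
    # values, from any dict representing a row. Keys are assumed to be
    # safe identifiers (no quoting is attempted).
    cols = sorted(row)  # fixed order so values line up with placeholders
    placeholders = ', '.join('$%d' % i for i in range(1, len(cols) + 1))
    sql = 'INSERT INTO %s (%s) VALUES (%s)' % (table, ', '.join(cols), placeholders)
    return sql, [row[c] for c in cols]

sql, values = build_insert('core_transaction', {'field1': 'lapse', 'field2': 42})
# Inside plpythonu one would then (untested assumption) do:
#   plan = plpy.prepare(sql, ['text', 'integer'])
#   plpy.execute(plan, values)
```

This still requires supplying the parameter types, but it avoids hand-enumerating columns in the statement text.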

>
>
>
>> On 02/06/2015, at 15:51, Adrian Klaver <adrian.klaver@aklaver.com
>> <mailto:adrian.klaver@aklaver.com>> wrote:
>>
>> On 06/02/2015 03:10 AM, Filipe Pina wrote:
>>> HI Adrian,
>>>
>>> I had a typo in the email:
>>>
>>> INSERT INTO my_table VALUES(my_table.*);
>>>
>>> was actually
>>>
>>> INSERT INTO my_table VALUES(my_var.*);
>>
>> Aah, that is different:)
>>
>>>
>>> So I meant to insert the variable I had in memory (dict representing
>>> a row), not the rows from the table..
>>
>> So where is the variable getting its data?
>>
>> Or can we see a simple example of what you are trying to do?
>>
>>>
>>>> On 02/06/2015, at 01:44, Adrian Klaver <adrian.klaver@aklaver.com
>>>> <mailto:adrian.klaver@aklaver.com>> wrote:
>>>>
>>>> On 06/01/2015 07:42 AM, Filipe Pina wrote:
>>>>> Thanks for the reply anyway, it's a pity though, it'd be useful..
>>>>>
>>>>> Another bump I've found along the pl/python road: insert ROWTYPE in
>>>>> table..
>>>>> Maybe you have some hint on that? :)
>>>>>
>>>>> So, in PLPGSQL I can:
>>>>>
>>>>> DECLARE
>>>>>   my_var my_table;
>>>>> BEGIN
>>>>>   my_var.col1 := 'asd';
>>>>>   INSERT INTO my_table VALUES(my_table.*);
>>>>> END;
>>>>>
>>>>> How would I do something like that in pl/python?
>>>>>
>>>>> First, how to declare a ROW-TYPE variable, as they're all python
>>>>> mappings?
>>>>>
>>>>> my_var = { 'col1': 'asd' } enough? it'd would miss all the other
>>>>> columns...
>>>>>
>>>>> Second, how to insert it?
>>>>>
>>>>> plpy.prepare and .execute say they don't support composite types, so I
>>>>> cannot simply pass
>>>>>
>>>>> pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)',
>>>>> ['my_table'])
>>>>>
>>>>> Any workarounds for this? (meaning I wouldn't have to specify any
>>>>> columns in the insert statement)
>>>>
>>>> http://www.postgresql.org/docs/9.4/interactive/sql-insert.html
>>>>
>>>> pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')
>>>>
>>>>>
>>>>> Thanks
>>>>>
>>>>> On Fri, May 29, 2015 at 2:00, Peter Eisentraut <peter_e@gmx.net>
>>>>> wrote:
>>>>>> On 5/18/15 10:52 AM, Filipe Pina wrote:
>>>>>>
>>>>>>    But one of the functions I need to create needs to accept an array
>>>>>>    of records.
>>>>>>
>>>>>> PL/Python doesn't support that. Some more code needs to be written to
>>>>>> support that. You did everything correctly. I don't know of a good
>>>>>> workaround.
>>>>
>>>>
>>>> --
>>>> Adrian Klaver
>>>> adrian.klaver@aklaver.com
>>>
>>>
>>
>>
>> --
>> Adrian Klaver
>> adrian.klaver@aklaver.com <mailto:adrian.klaver@aklaver.com>
>


--
Adrian Klaver
adrian.klaver@aklaver.com