Thread: inserting many rows

inserting many rows

From
SunWuKung
Date:
I will need to insert multiple rows into a table from php.
The data will come in 'packages' of 50-500 rows (they are responses from
different questionnaires). As there will be many people sending their
results in at the same time I need an effective method for this.

What do you suggest is the most effective way to insert this type of
data into the db? Issuing multiple inserts from php seems to be a waste
of resources.

I was thinking of writing the responses into a pg array field with a
single insert and then exploding the contents of that field into rows
with a function.

Could you suggest an efficient approach?
Thanks for the help.

Balazs


Re: inserting many rows

From
Tyler MacDonald
Date:
Hi,
    You probably want something like "COPY table_name FROM STDIN";
here's an example from a table that defines flags:

COPY auth_flag (id, name, description) FROM stdin;
2    Admin    System Administrator Access
4    New Password    User must change password on next login
8    Super Admin    Allow this administrator to edit other administrators
\.

    (Those are real tabs between the fields, not spaces). See
http://www.postgresql.org/docs/8.1/interactive/sql-copy.html for more info.
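
	From PHP, the pgsql extension can drive the same mechanism via
pg_copy_from(), which takes an array of tab-delimited rows (the connection
parameters below are placeholders; the table matches the example above):

```php
<?php
// Sketch: bulk-load rows via COPY using pg_copy_from().
// Each array element is one row, fields separated by tabs.
// Connection string is a placeholder -- adjust for your setup.
$db = pg_connect('dbname=mydb user=myuser');

$rows = array(
    "2\tAdmin\tSystem Administrator Access\n",
    "4\tNew Password\tUser must change password on next login\n",
);

pg_copy_from($db, 'auth_flag', $rows);
?>
```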

	One caveat: if any of your columns are backed by sequences (e.g.
serial columns), COPY won't advance them, so you'll have to update the
sequence values manually after the load.
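
	For example, if the id column above is a serial column, you could
resynchronize its sequence after the COPY like this (the sequence name
assumes PostgreSQL's default serial naming convention):

```sql
-- Advance the sequence past the highest id loaded by COPY.
-- 'auth_flag_id_seq' assumes the default <table>_<column>_seq naming.
SELECT setval('auth_flag_id_seq', (SELECT max(id) FROM auth_flag));
```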

	Cheers,
		Tyler


SunWuKung <Balazs.Klein@axelero.hu> wrote:
> I will need to insert multiple rows into a table from php.
> The data will come in 'packages' of 50-500 rows (they are responses from
> different questionnaires). As there will be many people sending their
> results in at the same time I need an effective method for this.
>
> What do you suggest is the most effective way to insert this type of
> data into the db? Issuing multiple inserts from php seems to be a waste
> of resources.
>
> I was thinking of writing the responses into a pg array field with a
> single insert and then exploding the contents of that field into rows
> with a function.
>
> Could you suggest an efficient approach?
> Thanks for the help.
>
> Balazs
>
>

Re: inserting many rows

From
Sean Davis
Date:


On 1/2/06 5:34 PM, "SunWuKung" <Balazs.Klein@axelero.hu> wrote:

> I will need to insert multiple rows into a table from php.
> The data will come in 'packages' of 50-500 rows (they are responses from
> different questionnaires). As there will be many people sending their
> results in at the same time I need an effective method for this.
>
> What do you suggest is the most effective way to insert this type of
> data into the db? Issuing multiple inserts from php seems to be a waste
> of resources.
>
> I was thinking of writing the responses into a pg array field with a
> single insert and then exploding the contents of that field into rows
> with a function.
>
> Could you suggest an efficient approach?

You could look at using COPY to insert many records very quickly.  However,
inserting inside a transaction may be all that you need.  Have you tried
simulating your application under expected loads so that you are sure that
you are making the right choice?
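
As a rough sketch of the transaction approach (the table and column names
here are invented for illustration, and the connection string is a
placeholder), batching all 50-500 inserts inside one transaction from PHP
might look like this:

```php
<?php
// Sketch: insert one batch of questionnaire responses inside a single
// transaction, so the server syncs to disk once per batch rather than
// once per row. Table/column names are hypothetical.
$db = pg_connect('dbname=survey user=web');

pg_query($db, 'BEGIN');
foreach ($responses as $row) {
    // $row = array(questionnaire_id, question_id, answer)
    $q   = (int) $row[0];
    $qid = (int) $row[1];
    $ans = pg_escape_string($row[2]);
    pg_query($db,
        "INSERT INTO responses (questionnaire_id, question_id, answer)
         VALUES ($q, $qid, '$ans')");
}
pg_query($db, 'COMMIT');
?>
```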

Sean



Re: inserting many rows

From
John DeSoi
Date:
On Jan 2, 2006, at 5:34 PM, SunWuKung wrote:

> What do you suggest is the most effective way to insert this type of
> data into the db? Issuing multiple inserts from php seems to be a
> waste
> of resources.
>
> I was thinking of writing the responses into a pg array field with a
> single insert and then exploding the contents of that field into rows
> with
> a function.

This may help. But other than using some form of COPY, you'll
probably get the best performance increase using prepared statements.
Unfortunately, I think this is only available in PHP 5.1 and later.
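
For what it's worth, with the pgsql extension in PHP 5.1+ that would look
roughly like the sketch below (table and column names are invented for
illustration; combine with a transaction for the best effect):

```php
<?php
// Sketch: prepare the INSERT once, then execute it per row.
// Requires PHP 5.1+ for pg_prepare()/pg_execute().
// Table/column names are hypothetical.
$db = pg_connect('dbname=survey user=web');

pg_prepare($db, 'insert_response',
    'INSERT INTO responses (questionnaire_id, question_id, answer)
     VALUES ($1, $2, $3)');

pg_query($db, 'BEGIN');
foreach ($responses as $row) {
    // $row = array(questionnaire_id, question_id, answer)
    pg_execute($db, 'insert_response', $row);
}
pg_query($db, 'COMMIT');
?>
```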



John DeSoi, Ph.D.
http://pgedit.com/
Power Tools for PostgreSQL