Re: tracking scripts... - Mailing list pgsql-general

From Rémi Cura
Subject Re: tracking scripts...
Msg-id CAJvUf_vHUSTgsvS4v_6eH2W03G-1gkJxbDRvSUYOLM8xMSn1Rw@mail.gmail.com
In response to Re: tracking scripts...  (Albe Laurenz <laurenz.albe@wien.gv.at>)
List pgsql-general
I'm not an expert, but I would think that if you can get away with using only one transaction, it would be way, way faster to do it that way!

The system could then simply skip keeping the log it needs to be ready to roll back a 1-billion-row update!

Of course, it would be preferable to use psql to execute the statements one by one as separate transactions, and to do it with X psql sessions in parallel (splitting the big text file into X parts), yet Joey seemed reluctant to use the console =)
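
For instance, something along these lines (just an untested sketch; it assumes GNU split is available and that each statement in filename.sql sits on its own line):

    # split the big script into 4 parts without breaking lines (GNU split)
    split -n l/4 -d filename.sql part_
    # run one psql per part; in autocommit mode each statement is its own transaction
    for f in part_*; do
        psql -f "$f" dbname &
    done
    wait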


Cheers,
Rémi-C


2013/11/27 Albe Laurenz <laurenz.albe@wien.gv.at>
John R Pierce wrote:
> On 11/26/2013 9:24 AM, Joey Quinn wrote:
>> When I ran that command (select * from pg_stat_activity), it returned
>> the first six lines of the scripts. I'm fairly sure it has gotten a
>> bit beyond that (been running over 24 hours now, and the size has
>> increased about 300 GB). Am I missing something for it to tell me what
>> the last line processed was?
>
> that means your GUI lobbed the entire file at postgres in a single
> PQexec call, so it's all being executed as a single statement.
>
> psql -f "filename.sql" dbname   would have processed the queries one at
> a time.

Yes, but that would slow down processing considerably, which would
not help in this case.

I'd opt for
psql -1 -f "filename.sql" dbname
so it all runs in a single transaction.
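
Since psql still sends the statements one at a time even with -1, the statement currently running should show up in pg_stat_activity, so progress could be checked with something like this (assuming 9.2 or later, where the column is called "query"):

    psql -c "select pid, state, query from pg_stat_activity where state = 'active' and pid <> pg_backend_pid()" dbname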

Yours,
Laurenz Albe

