A little reluctant, yes, but not 100%. I'm new to Postgres, but if I end up using it enough, then I will also end up learning some command-line stuff. If it continues to look like a good/robust solution for this particular project (think ERIPP plus Shodan and whatever else I can come up with), then I'll be here a while...
I'm not an expert, but I would think that if you can afford to run it all as a single transaction, it would be way, way faster to do it that way!
The system could simply skip keeping the log it needs to be ready to roll back a one-billion-row update!
Of course it would be preferable to use psql to execute the file statement by statement, as separate transactions, and to do it with X psql sessions in parallel (splitting the big text file into X parts), yet Joey seemed reluctant to use the console =)
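A rough sketch of that split-and-parallelize idea, assuming GNU split's `-n l/N` option (which keeps line boundaries intact) and the hypothetical `filename.sql` / `dbname` from this thread; the psql commands are only echoed here, since no live server is assumed:

```shell
# Stand-in for the big script: 1,000 one-statement lines.
seq 1 1000 | sed 's/.*/UPDATE t SET x = 0;/' > filename.sql

# Split into 4 line-aligned parts: part_00 .. part_03 (GNU split).
split -n l/4 -d filename.sql part_

# Print the commands that would run the parts in parallel
# (drop the "echo" to actually launch them against a real database).
for part in part_*; do
    echo "psql -f $part dbname &"
done
echo "wait"
```

Each statement then commits on its own, so the parts can run concurrently without one giant rollback segment hanging over the whole job.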
John R Pierce wrote:
> On 11/26/2013 9:24 AM, Joey Quinn wrote:
>> When I ran that command ("select * from pg_stat_activity"), it returned
>> the first six lines of the scripts. I'm fairly sure it has gotten a
>> bit beyond that (been running over 24 hours now, and the size has
>> increased about 300 GB). Am I missing something for it to tell me what
>> the last line processed was?
>
> that means your GUI lobbed the entire file at postgres in a single
> PQexec call, so it's all being executed as a single statement.
>
> psql -f "filename.sql" dbname would have processed the queries one at
> a time.
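That also explains why pg_stat_activity only showed the first few lines: the whole file is one statement, so there is no "last line processed" to report. Had the file been run statement by statement, progress could be watched from a second session with something like the following (a sketch; column names assume PostgreSQL 9.2+, and `dbname` is the hypothetical database from this thread). It needs a live server, so the command is only printed here:

```shell
# What each non-idle backend is currently executing, and for how long.
QUERY="SELECT pid, state, now() - query_start AS running_for, query
FROM pg_stat_activity
WHERE state <> 'idle';"

# Remove the "echo" to run this against a real server.
echo "psql -c \"$QUERY\" dbname"
```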
Yes, but that would slow down processing considerably, which would not help in this case.
I'd opt for psql -1 -f "filename.sql" dbname so it all runs in a single transaction.
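For reference, `-1` (`--single-transaction`) just wraps the file in BEGIN/COMMIT. The sketch below builds the equivalent stream explicitly so it can be inspected, again using the hypothetical `filename.sql` / `dbname`; adding `-v ON_ERROR_STOP=1` makes psql abort on the first error rather than plowing on:

```shell
# Tiny stand-in for the big script.
printf 'UPDATE t SET x = 1;\nUPDATE t SET x = 2;\n' > filename.sql

# Build the stream that -1 would produce: BEGIN, the file, COMMIT.
( echo 'BEGIN;'; cat filename.sql; echo 'COMMIT;' ) > wrapped.sql

# To actually run it (requires a server), either:
#   psql -v ON_ERROR_STOP=1 -f wrapped.sql dbname
# or simply:
#   psql -1 -v ON_ERROR_STOP=1 -f "filename.sql" dbname
cat wrapped.sql
```

Either way, nothing becomes visible until the final COMMIT, so a failure partway through leaves the table untouched.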