Re: Load experimentation - Mailing list pgsql-performance

From Ben Brehmer
Subject Re: Load experimentation
Date
Msg-id 4B1DFEA2.3070804@gmail.com
In response to Re: Load experimentation  (Greg Smith <greg@2ndquadrant.com>)
Responses Re: Load experimentation  (Greg Smith <greg@2ndquadrant.com>)
Re: Load experimentation  (Scott Marlowe <scott.marlowe@gmail.com>)
List pgsql-performance
Thanks for all the responses. I have one more thought:

Since my input data is split into about 200 files (3GB each), I could potentially spawn one load command per file. What is the maximum number of input connections Postgres can handle without bogging down? By "input connection" I mean one instance of "psql -U postgres -d dbname -f one_of_many_sql_files".
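Roughly what I have in mind is something like this (an untested sketch; assumes GNU xargs, and the path, file glob, and the cap of 8 parallel jobs are just placeholders):

    # run at most 8 psql loaders at a time, one per input file
    ls /data/sql/*.sql | xargs -P 8 -I {} psql -U postgres -d dbname -f {}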

Thanks,
Ben



On 07/12/2009 12:59 PM, Greg Smith wrote:
Ben Brehmer wrote:
By "Loading data" I am implying: "psql -U postgres -d somedatabase -f sql_file.sql".  The sql_file.sql contains table creates and insert statements. There are no indexes present nor created during the load.
COPY command: Unfortunately I'm stuck with INSERTs due to the way this data was generated (Hadoop/MapReduce).
Your basic options here are to batch the INSERTs into bigger chunks, and/or to split your data file up so that it can be loaded by more than one process at a time.  There are some comments and links to more guidance at http://wiki.postgresql.org/wiki/Bulk_Loading_and_Restores
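For example, instead of one INSERT per row, combine many rows into a single multi-row INSERT inside one transaction (table and column names below are made up):

    BEGIN;
    INSERT INTO mytable (id, val) VALUES
        (1, 'a'),
        (2, 'b'),
        (3, 'c');   -- many more rows per statement
    COMMIT;

Each statement then costs one parse and one round trip for many rows, and wrapping the batch in a transaction avoids a commit per row.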

-- 
Greg Smith    2ndQuadrant   Baltimore, MD
PostgreSQL Training, Services and Support
greg@2ndQuadrant.com  www.2ndQuadrant.com 
