Re: huge price database question.. - Mailing list pgsql-general

From Michael Nolan
Subject Re: huge price database question..
Date
Msg-id CAOzAquKmRNcfiq_ubC7S211cP_+vOy3NZinpt4DmNj3uBTJcRg@mail.gmail.com
In response to huge price database question..  (Jim Green <student.northwestern@gmail.com>)
List pgsql-general


> Right now I have about 7,000 tables, one per stock, and I use Perl to
> do the inserts; it's very slow. I would like to use COPY or another
> bulk-loading tool to load the daily raw gz data, but I need to split
> the file into per-stock files first before I can bulk load. I consider
> this a bit messy.

Are you committing each insert separately or doing them in batches using 'begin transaction' and 'commit'?

I have a database that I load with inserts from a text file. Committing every 1,000 inserts, rather than after each one, cut the load time by over 90%.
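The batching pattern above can be sketched as follows. This is a minimal illustration, not the poster's actual script: the thread concerns PostgreSQL loaded from Perl, but the sketch uses Python's built-in sqlite3 module so it is self-contained; the `prices` table, column names, and batch size of 1,000 are assumptions for the example. The same idea applies to Perl DBI (set AutoCommit off and call commit periodically) or any PostgreSQL client.

```python
import sqlite3

def load_rows(conn, rows, batch_size=1000):
    """Insert rows one at a time, but commit only once per batch.

    Committing each insert individually forces a transaction (and a
    disk sync) per row; grouping inserts into batches is what produced
    the ~90% speedup described in the post.
    """
    cur = conn.cursor()
    for i, row in enumerate(rows, start=1):
        cur.execute("INSERT INTO prices (symbol, close) VALUES (?, ?)", row)
        if i % batch_size == 0:
            conn.commit()  # one commit per batch, not per row
    conn.commit()  # flush the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (symbol TEXT, close REAL)")
load_rows(conn, [("AAPL", 100.0 + i) for i in range(2500)])
print(conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0])  # 2500
```

With PostgreSQL the equivalent is wrapping each batch in an explicit BEGIN/COMMIT, or going further and using COPY for a single bulk load.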
--
Mike Nolan
