large numbers of inserts out of memory strategy - Mailing list pgsql-general

From: Ted Toth
Subject: large numbers of inserts out of memory strategy
Date:
Msg-id: CAFPpqQGux3uT=CNN-z=zXr4qSmfBw9tDCURimELqEWUBrBpL7A@mail.gmail.com
List: pgsql-general
I'm writing a migration utility to move data from a non-RDBMS data
source to a Postgres DB. Currently I'm generating SQL INSERT
statements involving 6 related tables for each 'thing'. With 100k or
more 'things' to migrate I'm generating a lot of statements, and when
I try to import them using psql, Postgres fails with 'out of memory'
when running on a Linux VM with 4G of memory. If I break the import
into smaller chunks of ~50K statements, it succeeds. I can change my
migration utility to generate multiple files, each with a limited
number of INSERTs, to get around this issue, but maybe there's
another/better way?
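
For concreteness, here's roughly what I mean by chunking (a Python
sketch; `statements` stands in for whatever stream of INSERTs my
utility actually generates, and the file naming is just illustrative):

    import itertools

    CHUNK_SIZE = 50_000  # ~50K statements per file succeeded for me

    def write_chunks(statements, prefix="migration"):
        """Write INSERT statements to numbered .sql files, CHUNK_SIZE per file."""
        it = iter(statements)
        paths = []
        for n in itertools.count():
            batch = list(itertools.islice(it, CHUNK_SIZE))
            if not batch:
                break
            path = f"{prefix}_{n:04d}.sql"
            with open(path, "w") as f:
                # One transaction per file, so a failed chunk can be
                # re-run on its own without touching the others.
                f.write("BEGIN;\n")
                f.write("\n".join(batch))
                f.write("\nCOMMIT;\n")
            paths.append(path)
        return paths

Each file would then be loaded separately with psql -f migration_0000.sql,
and so on.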

Ted

