Re: large numbers of inserts out of memory strategy - Mailing list pgsql-general

From Rob Sargent
Subject Re: large numbers of inserts out of memory strategy
Date
Msg-id 74C96E30-97C3-45D1-887F-F876602355DE@gmail.com
Whole thread Raw
In response to large numbers of inserts out of memory strategy  (Ted Toth <txtoth@gmail.com>)
Responses Re: large numbers of inserts out of memory strategy  (Ted Toth <txtoth@gmail.com>)
List pgsql-general
> On Nov 28, 2017, at 10:17 AM, Ted Toth <txtoth@gmail.com> wrote:
> 
> I'm writing a migration utility to move data from non-rdbms data
> source to a postgres db. Currently I'm generating SQL INSERT
> statements involving 6 related tables for each 'thing'. With 100k or
> more 'things' to migrate I'm generating a lot of statements and when I
> try to import using psql, postgres fails with 'out of memory' when
> running on a Linux VM with 4G of memory. If I break it into smaller
> chunks of ~50K statements, the import succeeds. I can change my
> migration utility to generate multiple files, each with a limited
> number of INSERTs, to get around this issue, but maybe there's
> another/better way?
> 
> Ted
> 
What tools / languages are you using?
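For reference, the chunking workaround described in the original message can be sketched roughly as follows. This is a minimal illustration in Python (assuming the migration utility is something script-like); the function and file names are hypothetical, and wrapping each chunk in a single transaction is one common choice, not the only one:

```python
# Hypothetical sketch: split generated INSERT statements into numbered
# files of bounded size, so each can be fed to psql separately without
# exhausting memory. All names here are illustrative.
import os

def write_chunked_sql(statements, out_dir, chunk_size=50_000):
    """Write statements into files of at most chunk_size statements each,
    wrapping each file in a single BEGIN/COMMIT transaction."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for i in range(0, len(statements), chunk_size):
        path = os.path.join(out_dir, f"chunk_{i // chunk_size:04d}.sql")
        with open(path, "w") as f:
            f.write("BEGIN;\n")
            f.writelines(s + "\n" for s in statements[i:i + chunk_size])
            f.write("COMMIT;\n")
        paths.append(path)
    return paths
```

Each resulting file could then be loaded with `psql -f chunk_0000.sql`. For bulk loads of this size, PostgreSQL's `COPY` (or psql's `\copy`) is generally faster than individual INSERT statements.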

