Re: extreme memory use when loading in a lot of data - Mailing list pgsql-general

From Stephan Szabo
Subject Re: extreme memory use when loading in a lot of data
Date Fri, 21 May 2004
Msg-id 20040521144734.V56577@megazone.bigpanda.com
In response to extreme memory use when loading in a lot of data  (Vivek Khera <khera@kcilink.com>)
Responses Re: extreme memory use when loading in a lot of data
List pgsql-general
On Fri, 21 May 2004, Vivek Khera wrote:

> I have some historic data that I want to analyze.  To do this I set up
> postgres on a spare box I picked up for cheap, which just lucked into
> having tons of RAM (1.5G).   I set up postgres to use 10000 buffers,
> and recompiled the kernel to allow 2Gb data size limit per process.
>
> Since this is historical data, I'm actually merging a couple of dumps
> that span the time range.  I've dealt with eliminating any conflicting
> data (ie, clashing unique keys) but I'm not 100% sure that the foreign
> key constraints are all met.  Thus, when loading the data from the
> second dump, I am leaving the FK triggers on.

I'd suggest dropping the constraints, loading the data, and then adding
the constraints back; a rough sketch follows below. If you're using 7.4,
the single check done when the constraint is re-added will be faster than
firing a trigger per row. If the constraint turns out not to be satisfied,
you'll need to remove the offending rows and recreate the constraint, but
that's still better than having to reimport.
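
Something along these lines, with hypothetical table, column, and
constraint names standing in for yours:

  -- Drop the FK check before the bulk load (names here are hypothetical).
  ALTER TABLE child_table DROP CONSTRAINT child_parent_fkey;

  -- ... load the second dump ...

  -- Re-add the constraint; 7.4 validates the existing rows in one pass.
  ALTER TABLE child_table ADD CONSTRAINT child_parent_fkey
    FOREIGN KEY (parent_id) REFERENCES parent_table (id);

  -- If the ADD CONSTRAINT fails, the offending rows are the ones with
  -- no matching parent:
  SELECT c.* FROM child_table c
    LEFT JOIN parent_table p ON p.id = c.parent_id
    WHERE p.id IS NULL;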

> Now, this is where my trouble has begun... On importing row 29,796,801
> for the first big table, I get this (after 27 hours!):

I'd wonder if some large portion of the memory is the deferred trigger
queue, which doesn't yet spill to disk when it grows too large. If you do
keep the triggers enabled, one workaround is sketched below.
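
Since that queue only lives for the length of a transaction, committing in
batches keeps it bounded. A rough psql sketch, assuming the dump can be
split into pieces (the file names are hypothetical):

  -- Each COMMIT frees the trigger event queue accumulated so far.
  BEGIN;
  \i dump_part1.sql
  COMMIT;

  BEGIN;
  \i dump_part2.sql
  COMMIT;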
