Thread: Postgres using up all my memory
I'm trying to fill a table with several million rows that are obtained directly from a complex query.

For whatever reason, Postgres at one point starts using several gigabytes of memory, which eventually slows down the system until it no longer responds.

At first I assumed I had unintentionally assigned too much memory to Postgres, but I observe the same behavior even with the default postgresql.conf.

Then I thought there might be a problem with the system itself, but it has passed several load tests, and I observed the same problem on a second system.

I was using 7.4, and now 8.0, on a machine running Fedora Core 2.

Any ideas? Is this a known problem, or should Postgres be able to handle this? It may be tricky to reproduce the problem, as a lot of data is required, but I can post the DDL/DML statements I am using if this helps.
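For illustration, the statement is essentially of this shape; the table and column names below are made up and heavily simplified, the real query is considerably more involved:

  INSERT INTO entry_feature (entry_id, feature_type, feature_value)
  SELECT e.id, f.type, f.value
  FROM entry e
  JOIN raw_feature f ON f.entry_name = e.name
  WHERE f.value IS NOT NULL;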
On Fri, 4 Feb 2005, Eric Jain wrote:

> I'm trying to fill a table with several million rows that are obtained
> directly from a complex query.
>
> For whatever reason, Postgres at one point starts using several
> gigabytes of memory, which eventually slows down the system until it no
> longer responds.
>
> At first I assumed I had unintentionally assigned too much memory to
> Postgres, but I observe the same behavior even with the default
> postgresql.conf.
>
> Then I thought there might be a problem with the system itself, but it
> has passed several load tests, and I observed the same problem on a
> second system.
>
> I was using 7.4, and now 8.0, on a machine running Fedora Core 2.
>
> Any ideas? Is this a known problem, or should Postgres be able to handle
> this? It may be tricky to reproduce the problem, as a lot of data is
> required, but I can post the DDL/DML statements I am using if this helps.

Explain output would also be useful. I would wonder if it's a problem with a hash that misestimated the necessary size; you might see if analyzing the tables involved changes its behavior.
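As a rough sketch (reusing the invented table names from the example above; substitute your own), something along these lines would show whether the planner's row estimates are badly off:

  ANALYZE entry;
  ANALYZE raw_feature;

  -- Compare the estimated row counts in the plan with what you expect;
  -- a hash join built from a badly underestimated input is a common
  -- cause of runaway memory use.
  EXPLAIN
  SELECT e.id, f.type, f.value
  FROM entry e
  JOIN raw_feature f ON f.entry_name = e.name
  WHERE f.value IS NOT NULL;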
On Fri, Feb 04, 2005 at 05:59:26 -0800, Stephan Szabo <sszabo@megazone.bigpanda.com> wrote:

> On Fri, 4 Feb 2005, Eric Jain wrote:
>
> > I'm trying to fill a table with several million rows that are obtained
> > directly from a complex query.
> >
> > For whatever reason, Postgres at one point starts using several
> > gigabytes of memory, which eventually slows down the system until it no
> > longer responds.
> >
> > Any ideas? Is this a known problem, or should Postgres be able to handle
> > this? It may be tricky to reproduce the problem, as a lot of data is
> > required, but I can post the DDL/DML statements I am using if this helps.
>
> Explain output would also be useful. I would wonder if it's a problem
> with a hash that misestimated the necessary size; you might see if
> analyzing the tables involved changes its behavior.

I think deferred triggers can also use a lot of memory.
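Roughly what happens (the table names here are invented for illustration): every row inserted into a table with a deferred foreign key adds a check event to the deferred trigger queue, and that queue is held in backend memory until commit.

  CREATE TABLE entry (
      id integer PRIMARY KEY
  );

  CREATE TABLE entry_feature (
      entry_id integer REFERENCES entry (id)
          DEFERRABLE INITIALLY DEFERRED,
      feature_value text
  );

  -- Each row inserted here queues one deferred foreign-key check;
  -- with millions of rows in a single transaction, the queue itself
  -- can grow to gigabytes.
  INSERT INTO entry_feature (entry_id, feature_value)
  SELECT id, 'x' FROM entry;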
Stephan Szabo wrote:

> Explain output would also be useful. I would wonder if it's a problem
> with a hash that misestimated the necessary size; you might see if
> analyzing the tables involved changes its behavior.

I executed ANALYZE just before running the problematic statement. I will post the output of EXPLAIN if no other explanation comes up.
Bruno Wolff III wrote:

> I think deferred triggers can also use a lot of memory.

I do indeed have several columns with REFERENCES x DEFERRABLE INITIALLY DEFERRED... Next time I run the procedure, I will try dropping the foreign key constraints first.

Incidentally, it would be nice if Postgres had something like Oracle's NOVALIDATE clause, as I can be certain that the foreign keys are valid!
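Something along these lines is what I have in mind; the table and constraint names are placeholders (the real constraint names can be looked up with \d in psql):

  -- Drop the foreign key before the bulk load...
  ALTER TABLE entry_feature DROP CONSTRAINT entry_feature_entry_id_fkey;

  -- ...run the big INSERT ... SELECT, then re-add the constraint.
  -- Re-adding still checks every existing row, but without keeping
  -- millions of deferred trigger events queued in memory.
  ALTER TABLE entry_feature
      ADD CONSTRAINT entry_feature_entry_id_fkey
      FOREIGN KEY (entry_id) REFERENCES entry (id)
      DEFERRABLE INITIALLY DEFERRED;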