I'm trying to fill a table with several million rows that are obtained
directly from a complex query.
For whatever reason, Postgres at one point starts using several
gigabytes of memory, which eventually slows down the system until it no
longer responds.
At first I assumed I had unintentionally assigned too much memory to
Postgres, but I observe the same behavior even with the default
postgresql.conf.
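
For reference, the memory-related settings in question are these (the
values shown are the 8.0 defaults as far as I remember them; work_mem
was still called sort_mem in 7.4):

    shared_buffers = 1000            # 8 kB buffers, so roughly 8 MB
    work_mem = 1024                  # kB, per sort/hash operation
    maintenance_work_mem = 16384     # kB, for VACUUM, CREATE INDEX, etc.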
Then I thought there might be a problem with the system itself, but it
has passed several load tests, and I observed the same problem on a
second system.
I was using 7.4, and am now using 8.0, on a machine running Fedora Core 2.
Any ideas? Is this a known problem, or should Postgres be able to handle
this? It may be tricky to reproduce the problem, as a lot of data is
required, but I can post the DDL/DML statements I am using if that helps.
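
In the meantime, for illustration, the statements have roughly this
shape (table and column names here are placeholders invented for this
post, not my actual schema):

    CREATE TABLE summary (
        id        integer,
        category  text,
        total     numeric
    );

    -- the INSERT ... SELECT that triggers the memory growth;
    -- the real SELECT is considerably more complex
    INSERT INTO summary (id, category, total)
    SELECT a.id, a.category, sum(b.amount)
    FROM items a
    JOIN payments b ON b.item_id = a.id
    GROUP BY a.id, a.category;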