Justin Foster <jfoster@corder-eng.com> writes:
> I am running a test which performs 1000 transactions of 1000 updates
> of a single column in a single table, i.e. (1 transaction = 1000 updates)
> * 1000. I have no indexes on any of the columns, and the table has 3
> columns and 200 records. I do a VACUUM ANALYZE after every
> transaction. A single transaction takes about 3-6 seconds.
> It appears that free RAM decreases at about 10K to 100K per second
> until it is all gone.
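[For readers following along: the workload described above can be sketched roughly as below. The table name, column names, and population step are assumptions for illustration; the poster's actual schema and queries were not shown (and are requested later in this message). This only prints the SQL for one iteration; pipe it to psql against a scratch database to actually run it.]

```shell
# Hypothetical reconstruction of one iteration of the reported test.
# Setup: 3 columns, 200 rows, no indexes. Each transaction repeats a
# single-column UPDATE 1000 times, then VACUUM ANALYZE runs.
script="
CREATE TABLE t (id integer, val integer, note text);
INSERT INTO t SELECT g, 0, 'x' FROM generate_series(1, 200) g;
BEGIN;
-- one of the 1000 single-column updates in each transaction
UPDATE t SET val = val + 1 WHERE id = 1;
COMMIT;
VACUUM ANALYZE t;
"
echo "$script"
```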
When you say "RAM decreases", do you mean that the process size of the
backend is growing?
We have some known problems with memory leakage during a query
(hopefully 7.1 will solve this), but I'm not aware of any problems
that would cause leakage that accumulates across queries --- at least
not for such a simple case as you describe. Normally, all memory used
during a query is freed at query end, so the test you describe ought
to run in a static backend process size.
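[One quick way to answer the process-size question is to sample the backend's resident set size with ps while the test runs. The snippet below samples this shell's own pid ($$) purely as a self-contained demonstration; substitute the PostgreSQL backend's PID, e.g. from "ps aux | grep postgres".]

```shell
# Minimal sketch: sample a process's resident set size (RSS) in kB.
# Run it repeatedly during the test; a steadily climbing RSS would
# indicate the backend process itself is growing.
pid=$$
rss_kb=$(ps -o rss= -p "$pid" | tr -d ' ')
echo "pid $pid RSS: ${rss_kb} kB"
```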
Could we see the exact query/queries you are running, and the full
definition of the table?
regards, tom lane