Large data sets and FOR LOOP's - Mailing list pgsql-sql

From: Bo Lorentsen
Subject: Large data sets and FOR LOOP's
Date:
Msg-id: 1016012241.1094.90.camel@netgroup
Responses: Re: Large data sets and FOR LOOP's (Tom Lane <tgl@sss.pgh.pa.us>)
           Re: Large data sets and FOR LOOP's (Jan Wieck <janwieck@yahoo.com>)
List: pgsql-sql
Hi ...

I have been working on some server-side scripts in PL/pgSQL and I'm quite
happy with them (a nice and fast way to work with large data sets) :-)

But if my code runs a "for loop" over a huge select result (about
1.7 million rows), my function begins to consume a lot of memory. The code
looks like this:

FOR trec IN SELECT f1, f2 FROM large_table LOOP
    ...
END LOOP;

When selecting the result with a small LIMIT (about 1000) the function
still consumes a fair amount of memory, but when the function completes
successfully the memory is freed nicely again. When using the full select
I run out of memory (I have 2G in this machine, plus swap :-)).

Is there a way to release this memory, or would a cursor be the answer?
Does PL/pgSQL not use normal scope rules in FOR loops?
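Something like the following is what I had in mind with a cursor — just an
untested sketch (the function name process_large_table and the RETURNS type
are made up; the table and columns are the ones from my example above):

```sql
-- Sketch: walk the rows through an explicit cursor instead of a FOR loop,
-- fetching one row at a time, in the hope of keeping the working set small.
CREATE FUNCTION process_large_table() RETURNS integer AS '
DECLARE
    curs CURSOR FOR SELECT f1, f2 FROM large_table;
    trec RECORD;
    n    integer := 0;
BEGIN
    OPEN curs;
    LOOP
        FETCH curs INTO trec;
        EXIT WHEN NOT FOUND;
        -- ... per-row work on trec.f1, trec.f2 here ...
        n := n + 1;
    END LOOP;
    CLOSE curs;
    RETURN n;
END;
' LANGUAGE 'plpgsql';
```

But I don't know whether the memory behaviour would actually be any
different from the FOR loop.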

Anyway, I worked around the problem by removing the topmost FOR loop and
then calling the function from a SELECT, multiple times, like this:

SELECT myfunc( f1, f2 ) FROM large_table;
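For reference, myfunc is just an ordinary PL/pgSQL function taking the two
columns as arguments — here is a minimal skeleton (the argument types are
assumed; the real body does the per-row work that used to sit inside the
FOR loop):

```sql
-- Skeleton only: called once per row by the SELECT above, so no single
-- PL/pgSQL FOR loop ever holds the whole result set.
CREATE FUNCTION myfunc(integer, text) RETURNS integer AS '
BEGIN
    -- ... work on the single row ($1, $2) ...
    RETURN 0;
END;
' LANGUAGE 'plpgsql';
```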

Is this a known issue, is it a bug, or is there something I have
misunderstood?

Who maintains this part of PostgreSQL, anyway?

/BL 
