I have a function that takes a comma-separated string from another table and
parses it into rows in another table. When I run it on a string with 130 ids
it works fine; when I run it on a string with 50,000 ids (each about 6
characters long) I get the error: Memory exhausted in AllocSetAlloc(269039).
The system is a dual-CPU Intel box with 1 GB of RAM and 1 GB of swap space.
Before I run the function I start top, which shows about 800 MB of RAM in
use. Usage quickly accelerates until all RAM is full, then the swap space
fills, and then postgres pukes the error message above. It is running
PostgreSQL 7.1.
The questions I have are: Is my script so inefficient that it is using up all
of my RAM? (I know that keeping the string of ids in another table is
horribly inefficient, but it is a historical necessity I would rather avoid
changing.) What is taking up all of this RAM, and how can I even trace the
problem down?
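For reference, the tables involved look roughly like this (a sketch from
memory, based on what the function touches, so the exact column types may be
off):

    -- one row per result set, ids packed into one comma-separated string
    listresultmembids(listresdataid INTEGER, membids TEXT)
    -- what the function fills in: one row per id
    listmembids(resdataid INTEGER, membid TEXT)

Here is the function: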
CREATE FUNCTION membidsintotable(INTEGER) RETURNS boolean AS '
DECLARE
    string_rec RECORD;
    resdataid ALIAS FOR $1;
BEGIN
    SELECT INTO string_rec membids FROM listresultmembids
        WHERE listresdataid = resdataid;
    IF NOT FOUND THEN
        RETURN FALSE;
    END IF;
    WHILE char_length(string_rec.membids) > 0 LOOP
        -- insert the id in front of the first comma
        INSERT INTO listmembids VALUES (resdataid,
            substring(string_rec.membids FROM 0
                FOR position('','' in string_rec.membids)));
        -- strip that id, then the leading comma, off the working string
        string_rec.membids := trim(leading
            substring(string_rec.membids FROM 0
                FOR position('','' in string_rec.membids))
            FROM string_rec.membids);
        string_rec.membids := trim(leading '','' FROM string_rec.membids);
    END LOOP;
    RETURN TRUE;
END;
' LANGUAGE 'plpgsql';
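I invoke it like this (listresdataid = 1 is just an example value):

    SELECT membidsintotable(1);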
Thanks for the help,
Jeff Barrett