Hello,
I've fixed and tested a memory leak bug in dblink. Could you review and
commit this? I'll add this to the CommitFest shortly.
[Problem]
A user reported on the pgsql-jp ML that he encountered an "out of
memory" error when he ran the following function on 32-bit
PostgreSQL 9.3:
CREATE OR REPLACE FUNCTION aaa(
character varying)
RETURNS character varying AS
$BODY$
DECLARE
...
BEGIN
PERFORM (SELECT DBLINK_CONNECT('conn','dbname=DB-B user=postgres'));
DELETE FROM tbl0010 dba
WHERE EXISTS
(
SELECT tbl0010_cd FROM tbl0010
INNER JOIN (
SELECT * FROM DBLINK
('conn','
SELECT tbl0411_cd FROM tbl0411
INNER JOIN(
...
The above query calls dblink() hundreds of thousands of times. You can
reproduce the problem with a simpler query like this:
CREATE TABLE mytable (col int);
INSERT INTO mytable ...; /* insert many rows */
SELECT dblink_connect('con', 'dbname=' || current_database()); /* establish the connection used below; adjust the connection string as needed */
SELECT *
FROM mytable
WHERE EXISTS
  (SELECT *
   FROM dblink(
     'con',
     'SELECT * FROM mytable WHERE col = ' || col)
   t(col int));
[Cause]
When the error occurred, the following line appeared hundreds of thousands of
times in the server log (PostgreSQL dumps these per-context memory statistics
when it reports an out-of-memory error):
dblink temporary context: 8192 total in 1 blocks; 8176 free (0 chunks); 16 used
Each dblink() call creates an instance of this memory context but fails to
delete it, so the contexts accumulate until memory is exhausted. This bug seems
to have been introduced in 9.2.0 by this performance improvement (from the 9.2
release notes):
Improve efficiency of dblink by using libpq's new single-row
processing mode (Kyotaro Horiguchi, Marko Kreen)
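
For reference, below is a minimal sketch (illustrative only, not the actual
dblink.c code or the attached patch) of the memory context lifecycle the
per-call code is expected to follow. The leak corresponds to the final
MemoryContextDelete() never being reached for the "dblink temporary context",
which is consistent with the roughly 8 kB lost per call seen in the log line
above.

#include "postgres.h"
#include "utils/memutils.h"

/*
 * Illustrative per-call pattern: create a short-lived context for data
 * conversion work, use it, then delete it before returning.  In the buggy
 * code path the context is created but never deleted, so about 8 kB is
 * leaked on every dblink() invocation.
 */
static void
per_call_work(void)
{
    MemoryContext tmpcontext;
    MemoryContext oldcontext;

    tmpcontext = AllocSetContextCreate(CurrentMemoryContext,
                                       "dblink temporary context",
                                       ALLOCSET_DEFAULT_MINSIZE,
                                       ALLOCSET_DEFAULT_INITSIZE,
                                       ALLOCSET_DEFAULT_MAXSIZE);

    oldcontext = MemoryContextSwitchTo(tmpcontext);

    /* ... per-row conversion allocations would happen here ... */

    MemoryContextSwitchTo(oldcontext);

    /* The missing cleanup: delete the context once the call is finished. */
    MemoryContextDelete(tmpcontext);
}
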
Regards
MauMau