Java Out-of-memory errors on attempts to read tables with millions of rows - Mailing list pgsql-performance

From Rich Cullingford
Subject Java Out-of-memory errors on attempts to read tables with millions of rows
Msg-id 3F12DFFE.3090207@sysd.com
Responses Re: Java Out-of-memory errors on attempts to read tables with millions of rows  (Evil Azrael <evilazrael@evilazrael.de>)
List pgsql-performance
Greetings,
We have several tables (in a PG 7.3.3 database on RH Linux 7.3) with 2M+
rows (each row 300-400 bytes in length) that we SELECT into a JDBC
ResultSet for display to the user. We expected that the driver would not
actually transmit data from the database until the application began
issuing getXXX() calls. (IIRC, this is the way the Oracle driver works,
and we had created a buffering mechanism to use it.) Instead, the driver
appears to be attempting to create the whole rowset in Java memory
before returning, and the application runs out of memory. (Java has been
configured to use up to 1.5G on the machine this occurs on.)
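(For reference, the PostgreSQL JDBC driver can be told to fetch rows incrementally via a server-side cursor by disabling autocommit and setting a non-zero fetch size on the Statement. A minimal sketch follows; the connection URL, table name, and fetch size are illustrative placeholders, and behavior with a driver of this vintage should be verified.)

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamRows {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details, for illustration only
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "pass");

        // Cursor-based fetching requires an open transaction,
        // so autocommit must be off.
        conn.setAutoCommit(false);

        Statement st = conn.createStatement();
        // Ask the driver to pull rows in batches of 500 instead of
        // materializing the whole result set in Java memory.
        st.setFetchSize(500);

        ResultSet rs = st.executeQuery("SELECT * FROM big_table");
        while (rs.next()) {
            // Process one row at a time; only one batch of rows
            // is held in memory by the driver.
        }
        rs.close();
        st.close();
        conn.commit();
        conn.close();
    }
}
```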

Now the SELECT is preceded by a COUNT of the rows that the same query
would return, so perhaps that's what's causing the problem. But the
question is, is this the way a ResultSet is supposed to work? Are there
any configuration options available that modify this behavior? Are there
commercial implementations of PG JDBC that don't have this problem?
(Shame on me, but I have to ask. :)

Any help will be greatly appreciated!

                                 Rich Cullingford
                                 rculling@sysd.com

