Mailing list pgsql-jdbc

From gurkan@resolution.com
Subject ...
Date
Msg-id 1125617538.43178f82aaebf@www.resolution.com
Whole thread Raw
Responses Re: setting large bytea values  (Oliver Jowett <oliver@opencloud.com>)
List pgsql-jdbc
Hi,
I have been working on a conversion program from Informix to PostgreSQL, and I
have one table with a large bytea column.

I need to be able to copy one row (the docdata column) whose data is close to
32 MB. The program seems to read the data but cannot copy it to Postgres
(PreparedStatement); it throws OutOfMemoryError. I ran the program with a larger
heap too:
java -Xms200m -Xmx700m AddDoc
I do not want to use the LargeObject API (I partition the data by date, going
from one table to many tables, and large-object data seems to be stored in
pg_largeobject). Here is the relevant part of the code; any help is appreciated.
How do I copy large data like 32 MB or greater?
Thanks

// Read the Informix column one byte at a time into a ByteArrayOutputStream.
wbin = informixRs.getBinaryStream("docdata");
while ((wbyte = wbin.read()) != -1) {
    wbout.write(wbyte);
}
// Note: each toByteArray() call allocates a fresh copy of the whole buffer.
outln("Size in KB is: " + wbout.toByteArray().length / 1024);
size += wbout.toByteArray().length;
inp = new ByteArrayInputStream(wbout.toByteArray());
wbin = null;

postgresStmt = postgresConn.prepareStatement(
    "INSERT INTO " + tableName + " (id,docdata) VALUES (?,?)");
postgresStmt.setInt(1, id);
if (docdef_id.compareTo("12720") == 0) {
    outln("\n\nbefore out of memory");
    postgresStmt.setBinaryStream(2, inp, inp.available());
    outln("\n\nafter out of memory"); // never reached
} else {
    postgresStmt.setBinaryStream(2, inp, inp.available());
}
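For reference, the copy loop above can be sketched with a chunked buffer so that
the stream is read in blocks and toByteArray() is called exactly once; the class
and helper names (CopyBytes, copyToBytes) are mine for illustration, not from
any driver:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class CopyBytes {
    // Copy an InputStream into a byte[] using an 8 KB buffer instead of
    // single-byte reads; the data is materialized only once at the end.
    static byte[] copyToBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[1 << 20]; // 1 MB of stand-in data
        byte[] copy = copyToBytes(new ByteArrayInputStream(src));
        System.out.println(copy.length == src.length);
    }
}
```

With the bytes in hand you could call setBytes(2, data) instead of wrapping them
in another ByteArrayInputStream, or, if the source length is known up front,
pass the Informix stream straight to setBinaryStream(2, stream, length) and skip
the in-memory buffer entirely.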

