Re: Inserting large BLOBs via JDBC - OutOfMemoryError - Mailing list pgsql-jdbc

From hhaag@gmx.de
Subject Re: Inserting large BLOBs via JDBC - OutOfMemoryError
Date
Msg-id 19217.1029485671@www9.gmx.net
In response to Inserting large BLOBs via JDBC - OutOfMemoryError  (hhaag@gmx.de)
List pgsql-jdbc
There might be a workaround:

From JDBC 2.0 on, the interface java.sql.Blob allows BLOBs to be manipulated.
AbstractJdbc2Statement.setBlob() (code posted below) appears to create the BLOB
up front, streaming it into the database as a large object. The actual INSERT
statement then contains only the OID, avoiding the memory problems entirely.

I'll let you know how it goes.



    public void setBlob(int i, Blob x) throws SQLException
    {
        InputStream l_inStream = x.getBinaryStream();
        int l_length = (int) x.length();
        LargeObjectManager lom = connection.getLargeObjectAPI();
        int oid = lom.create();
        LargeObject lob = lom.open(oid);
        OutputStream los = lob.getOutputStream();
        try
        {
            // the copy could be buffered, but the OutputStream returned by
            // LargeObject is already buffered internally, so there would be no
            // performance gain - if anything it would be worse!
            int c = l_inStream.read();
            int p = 0;
            while (c > -1 && p < l_length)
            {
                los.write(c);
                c = l_inStream.read();
                p++;
            }
            los.close();
        }
        catch (IOException se)
        {
            throw new PSQLException("postgresql.unusual", se);
        }
        // lob is closed by the stream so don't call lob.close()
        setInt(i, oid);
    }
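The byte-at-a-time loop above is the part I would watch: each read() call fetches a single byte. A minimal sketch of the same length-limited copy done with a fixed-size buffer instead (hypothetical helper, shown against in-memory streams so it runs standalone rather than against a LargeObject stream):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobCopy {

    // Copy at most 'length' bytes from in to out using a fixed-size buffer,
    // returning the number of bytes actually written.
    static long copy(InputStream in, OutputStream out, long length)
            throws IOException {
        byte[] buf = new byte[8192];
        long written = 0;
        while (written < length) {
            // never ask for more than the declared BLOB length
            int want = (int) Math.min(buf.length, length - written);
            int n = in.read(buf, 0, want);
            if (n == -1) {
                break; // stream ended before 'length' bytes - stop copying
            }
            out.write(buf, 0, n);
            written += n;
        }
        return written;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100000];
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) i;
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long n = copy(new ByteArrayInputStream(data), out, data.length);
        System.out.println(n);
        System.out.println(out.size() == data.length);
    }
}
```

Whether buffering actually helps here depends on how much buffering the LargeObject output stream already does internally, as the comment in the driver source notes.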



--
GMX - The communication platform on the Internet.
http://www.gmx.net

