Thread: JDBC Blob API bug?

JDBC Blob API bug?

From: "David Wall"
Date:
It's hard to fault the PG JDBC library for this, but it does appear to be a
problem with the java.sql.Blob API (or at least it's not documented well).

I'm running 7.2.2.

If you retrieve a Blob and then use Blob.getBytes(1, (int)blob.length()) to
read the entire blob into a byte array, there is no mechanism to "close" the
Blob.  With PG JDBC, that call does a seek and read against the underlying
LargeObject, but there's no way to close it, so the stream stays open.  This
results in strange errors in subsequent calls, such as the SQLException "No
results were returned by the query."
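
A related caveat for the stream-based workaround below: InputStream.read(byte[])
is not guaranteed to fill the buffer in a single call.  A minimal, database-free
sketch (the ChunkedStream class and its 4-byte limit are invented here purely to
simulate a network-backed stream) showing the short read and the loop that
handles it:

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ShortReadDemo
{
    // Returns at most 4 bytes per read() call, mimicking a
    // network-backed stream such as a LargeObject's.
    static class ChunkedStream extends FilterInputStream
    {
        ChunkedStream(InputStream in) { super(in); }
        public int read(byte[] b, int off, int len) throws IOException
        {
            return super.read(b, off, Math.min(len, 4));
        }
    }

    public static void main(String[] args) throws IOException
    {
        byte[] data = new byte[10];
        InputStream is = new ChunkedStream(new ByteArrayInputStream(data));

        byte[] buf = new byte[10];
        int n = is.read(buf);      // a single read() may stop short
        System.out.println(n);     // prints 4, not 10

        // Loop until the buffer is full or the stream ends.
        int off = n;
        while (off < buf.length)
        {
            int r = is.read(buf, off, buf.length - off);
            if (r < 0)
                break;
            off += r;
        }
        System.out.println(off);   // prints 10
        is.close();
    }
}
```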

The only workaround I've seen is to use Blob.getBinaryStream(), read in the
data, and then close the stream, which closes the underlying LargeObject.

Here's a utility routine I used for converting a Blob into a byte[] when
doing a SELECT:

    public byte[] blobToBytes(java.sql.Blob b)
    {
        java.io.InputStream is = null;
        try
        {
            is = b.getBinaryStream();
            byte[] bytes = new byte[(int)b.length()];
            // InputStream.read(byte[]) may return fewer bytes than
            // requested, so loop until the buffer is full or the
            // stream ends.
            int off = 0;
            while ( off < bytes.length )
            {
                int n = is.read(bytes, off, bytes.length - off);
                if ( n < 0 )
                    break;      // unexpected end of stream
                off += n;
            }
            return bytes;
        }
        catch( java.sql.SQLException e )
        {
            return null;
        }
        catch( java.io.IOException e )
        {
            return null;
        }
        finally
        {
            // Closing the stream closes the underlying LargeObject.
            try
            {
                if ( is != null )
                    is.close();
            }
            catch( Exception e ) {}
        }
    }
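
For anyone who wants to exercise a routine like this without a live database,
the standard JDK's javax.sql.rowset.serial.SerialBlob implements java.sql.Blob
over an in-memory byte array.  A self-contained sketch (it restates the
routine's logic so the file compiles on its own; the sample text is arbitrary):

```java
import java.io.InputStream;
import javax.sql.rowset.serial.SerialBlob;

public class BlobToBytesDemo
{
    // Same logic as blobToBytes above, reproduced so this compiles alone.
    public static byte[] blobToBytes(java.sql.Blob b)
    {
        InputStream is = null;
        try
        {
            is = b.getBinaryStream();
            byte[] bytes = new byte[(int)b.length()];
            // Loop: a single read() may return fewer bytes than requested.
            int off = 0;
            while (off < bytes.length)
            {
                int n = is.read(bytes, off, bytes.length - off);
                if (n < 0)
                    break;
                off += n;
            }
            return bytes;
        }
        catch (Exception e)
        {
            return null;
        }
        finally
        {
            try { if (is != null) is.close(); } catch (Exception e) {}
        }
    }

    public static void main(String[] args) throws Exception
    {
        byte[] original = "hello, large object".getBytes("US-ASCII");
        java.sql.Blob blob = new SerialBlob(original);
        byte[] copy = blobToBytes(blob);
        System.out.println(new String(copy, "US-ASCII"));  // prints the original text
    }
}
```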


David


Re: JDBC Blob API bug?

From: Barry Lind
Date:
David,

I think this is fixed in current sources.  The stream isn't closed any
sooner, but you should no longer receive the error you were seeing.  Can
you try the development build from jdbc.postgresql.org?
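
For readers on later drivers: JDBC 4.0 eventually addressed this at the API
level by adding Blob.free() to release a blob's resources explicitly, and
Java 7's try-with-resources makes closing the stream hard to forget.  A
database-free sketch of that pattern, again using the JDK's in-memory
SerialBlob (the 3-byte payload is arbitrary):

```java
import java.io.InputStream;
import javax.sql.rowset.serial.SerialBlob;

public class BlobStreamDemo
{
    public static void main(String[] args) throws Exception
    {
        java.sql.Blob blob = new SerialBlob(new byte[]{1, 2, 3});

        // try-with-resources (Java 7+) guarantees the stream -- and,
        // with the PG driver, the underlying LargeObject -- is closed.
        int total = 0;
        try (InputStream is = blob.getBinaryStream())
        {
            while (is.read() != -1)
                total++;
        }
        System.out.println(total);  // prints 3
    }
}
```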

thanks,
--Barry
