Thread: JDBC - large objects

JDBC - large objects

From: Rolland Crunk
Date:

I am having some problems getting the JDBC driver to work properly with
large objects using standard JDBC interfaces.  The tables are pretty much
standard relational tables, except for one column into which I serialize
implementations of java.security.acl.Acl as objects.
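For reference, my write path goes roughly like this (the table, column,
and method names here are placeholders, not my actual code; with an oid
column, the PostgreSQL driver handles the bytes via a large object):

    import java.io.ByteArrayOutputStream;
    import java.io.ObjectOutputStream;
    import java.security.acl.Acl;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    // Serialize the Acl to a byte array and bind it through plain JDBC.
    // "acl_store" and its columns are placeholder names.
    static void storeAcl(Connection conn, String name, Acl acl) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(buf);
        out.writeObject(acl);       // the Acl implementation must be Serializable
        out.close();

        PreparedStatement stmt = conn.prepareStatement(
            "INSERT INTO acl_store (name, acl) VALUES (?, ?)");
        stmt.setString(1, name);
        stmt.setBytes(2, buf.toByteArray());
        stmt.executeUpdate();
        stmt.close();
    }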

The error I get is: ERROR:  lo_write: invalid large obj descriptor (0)

This is the same error I got running the blobtest until I applied Tatsuo
Ishii's patch, which I found in the mailing list archives. I tried the same
thing in my code (turning on explicit transactions when storing a blob),
but it doesn't seem to have any effect.

I have tried defining the acl field in my create table statement as both:

    :    :
    acl    oid,
    :    :

and

    :    :
    acl    char[]
    :    :

and I see the same thing.
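Spelled out through JDBC, the oid variant looks roughly like this (again,
the table name and the other column are placeholders):

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Placeholder DDL for the oid variant; only the acl column matters here.
    static void createAclTable(Connection conn) throws SQLException {
        Statement ddl = conn.createStatement();
        ddl.executeUpdate("CREATE TABLE acl_store (name varchar(64), acl oid)");
        ddl.close();
    }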

The same (Java) code runs fine using Oracle 8 and their thin driver.

I guess what I need to know is: is what I am trying to do possible with
PostgreSQL/JDBC without using the PostgreSQL extensions (using them is not
an option for me)?  If so, what type should I use for serialized columns in
the create table statement?  And can it be done without turning off autocommit?

Thanks in advance for any help anyone can provide.

Cordially,

rc

ps: My environment is:

    Solaris 2.7 (intel)
    jdk 1.2 (jdk 1.1 fares no better)
    PostgreSQL 6.5


Re: [INTERFACES] JDBC - large objects

From: Rolland Crunk
Date:
Update: this particular problem is resolved. Here is what it was:

The failing code, as per my original posting, went something like this:

    PreparedStatement statement = conn.prepareStatement("...");
        :
        :
    //  add values to the prepared statement
        :
        :
    conn.setAutoCommit(false);
    statement.executeUpdate();
    conn.commit();

When I changed it to go something like this:

    conn.setAutoCommit(false);
    PreparedStatement statement = conn.prepareStatement("...");
        :
        :
    //  add values to the prepared statement
        :
        :
    statement.executeUpdate();
    conn.commit();

The "ERROR:  lo_write: invalid large obj descriptor (0)" problem went away.

To summarize: autocommit needs to be turned off prior to statement creation
(at least for prepared statements).
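For anyone hitting the same error, here is a minimal, self-contained sketch
of the working ordering (table and column names are placeholders, as before):

    import java.io.ByteArrayOutputStream;
    import java.io.ObjectOutputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    // Working order: disable autocommit BEFORE preparing the statement, so
    // the large object is opened and written within a single transaction.
    static void storeSerialized(Connection conn, Object obj) throws Exception {
        conn.setAutoCommit(false);               // must happen first

        PreparedStatement stmt = conn.prepareStatement(
            "INSERT INTO acl_store (acl) VALUES (?)");

        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(buf);
        out.writeObject(obj);
        out.close();

        stmt.setBytes(1, buf.toByteArray());
        stmt.executeUpdate();
        stmt.close();

        conn.commit();                           // end the explicit transaction
    }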

rc
