Thread: JDBC + large objects problem

JDBC + large objects problem

From
Patrick Goodwill
Date:
I'm trying to use JDBC and BLOBS to store large amounts of text in a
database.  I get a strange error when I try to use the conventional JDBC
interface... it comes out with an SQLException of:

"InputStream as parameter not supported"

for the code:
    Connection conn = pool.getConnection();
    PreparedStatement pstmt =
        conn.prepareStatement("INSERT INTO t" + book
            + "_data (author_id, title, text, type) VALUES (?, ?, ?, ?)");
    pstmt.setInt(1, userId);
    pstmt.setString(2, title);
    InputStream textStream = stringToStream(text);
    pstmt.setBinaryStream(3, textStream, text.length());
    pstmt.setInt(4, type);
    pstmt.executeUpdate();
    pstmt.close();


... with some helper functions....
    private InputStream stringToStream(String string) {
        byte[] bytes = string.getBytes();
        ByteArrayInputStream stream = new ByteArrayInputStream(bytes);
        return (InputStream) stream;
    }

    private String streamToString(InputStream stream) {
        try {
            int length = stream.available();
            byte[] bytes = new byte[length];
            stream.read(bytes);
            return new String(bytes);
        } catch (IOException e) {
            System.out.println("No Stream");
        }
        return null;
    }

with an abbreviated schema of....

>> \d t1_data
                               Table "t1_data"
 Attribute |  Type   |                       Modifier
-----------+---------+-------------------------------------------------------
 data_id   | integer | not null default
                               nextval('t1_data_data_id_seq'::text)
 author_id | integer |
 title     | text    |
 text      | oid     |
 type      | integer |
 time      | time    |
Index: t1_data_pkey
.... using postgresql 7.0 and the newest JDBC driver from retep.org.uk


if y'all have any ideas why it does what it does, I just might kiss your
feet.  =)

-Patrick.




Re: JDBC + large objects problem

From
Rob Judd
Date:
The problem is that the JDBC driver has not yet implemented a lot of the
interface's functionality, so you get these "not supported" exceptions.  On
the bright side, for every problem like this there seems to be enough
functionality to achieve what you need.

In this case it is a little simpler:

Instead of:

InputStream textStream = stringToStream(text);
pstmt.setBinaryStream(3, textStream, text.length());

you can use:

pstmt.setBytes(3, text.getBytes());

and to retrieve you can use ResultSet.getBytes() to return the byte array.
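A sketch of that round trip (the class name and the explicit UTF-8 charset are my additions, not from the thread; the actual setBytes()/getBytes() calls are shown as comments since they need a live connection):

```java
public class BytesRoundTrip {
    // Encode a String into the byte[] that PreparedStatement.setBytes() takes.
    static byte[] encode(String s) throws java.io.UnsupportedEncodingException {
        return s.getBytes("UTF-8");
    }

    // Decode the byte[] that ResultSet.getBytes() returns back into a String.
    static String decode(byte[] b) throws java.io.UnsupportedEncodingException {
        return new String(b, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String text = "large amounts of text";
        byte[] bytes = encode(text);
        // With a real connection, this is where the driver calls go:
        //   pstmt.setBytes(3, bytes);                   // on insert
        //   String back = decode(rs.getBytes("text"));  // on retrieval
        System.out.println(decode(bytes).equals(text)); // prints "true"
    }
}
```

Naming the charset explicitly matters: String.getBytes() with no argument uses the platform default encoding, so text written on one machine may not decode identically on another.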

Hope this helps, Rob

> I'm trying to use JDBC and BLOBS to store large amounts of text in a
> database.  I get a strange error when I try to use the conventional JDBC
> interface... it comes out with a SQL Exceptions of:
> 
> "InputStream as parameter not supported"
> 
> for the code:
> 
>         Connection conn = pool.getConnection();        
>         PreparedStatement pstmt = 
>         conn.prepareStatement("INSERT INTO t" + book 
>                       + "_data (author_id, title, text,
>                                       type) VALUES ( ?, ?, ?, ?)");
>         pstmt.setInt(1, userId);
>         pstmt.setString(2, title);
>         InputStream textStream = stringToStream(text);
>         pstmt.setBinaryStream(3, textStream, text.length());
>         pstmt.setInt(4, type);
>         pstmt.executeUpdate();        
>         pstmt.close();
> 
> 
> ... with some helper functions....
> 
>     private InputStream stringToStream(String string) {
>     byte[] bytes = string.getBytes();
>     ByteArrayInputStream stream = new ByteArrayInputStream(bytes);
>     return (InputStream) stream;
>     }
> 
>     private String      streamToString(InputStream stream) {
>     try {
>         int length = stream.available();
>         byte[] bytes = new byte[length];
>         stream.read(bytes);
>         return new String(bytes);
>     } catch (IOException e) {
>         System.out.println("No Stream");
>     }
>     return null;
>     }
> 
> with an abbreviated schema of....
> 
> >> \d t1_data
>                                Table "t1_data"
>  Attribute |  Type   |                       Modifier
> -----------+---------+-------------------------------------------------------
>  data_id   | integer | not null default
>                                nextval('t1_data_data_id_seq'::text)
>  author_id | integer |
>  title     | text    |
>  text      | oid     |
>  type      | integer |
>  time      | time    |
> Index: t1_data_pkey   
> 
>     
> .... using postgresql 7.0 and the newest JDBC driver from retep.org.uk
> 
> 
> if ya'll have any ideas why it does what it does, i just might kiss your
> feet.  =)
> 
> -Patrick.
> 
> 



Re: JDBC + large objects problem

From
Joachim Achtzehnter
Date:
Today, in a message to pgsql-interfaces, Patrick Goodwill wrote:
>
> "InputStream as parameter not supported"
> 
>         PreparedStatement pstmt = ...
>           ...
>         pstmt.setBinaryStream(3, textStream, text.length());

As the error message says, streams are not yet supported for blobs.
Instead, read the contents of the stream into a byte array and use the
setBytes() method of PreparedStatement.
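If the data really does start life as an InputStream, it can be drained into a byte array with a loop like this (a sketch; the class name is mine). Note that the streamToString() helper in the original post sizes its buffer with available(), which only reports how many bytes can be read without blocking and is not a reliable stream length:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamToBytes {
    // Read the stream to EOF, however many bytes each read() call returns.
    static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello".getBytes("UTF-8"));
        byte[] bytes = toByteArray(in);
        // The result can then go straight into pstmt.setBytes(3, bytes).
        System.out.println(new String(bytes, "UTF-8")); // prints "hello"
    }
}
```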

> if ya'll have any ideas why it does what it does, i just might kiss your
> feet.  =)

Better yet, you could consult the archives of this mailing list. This
question seems to come up every couple of weeks. You can find an archive
at http://www.geocrawler.com/lists/3/Databases/105/0/.

Joachim

-- 
work:     joachima@realtimeint.com  (http://www.realtimeint.com)
private:  joachim@kraut.bc.ca       (http://www.kraut.bc.ca)



Re: JDBC + large objects problem

From
"Peter Mount"
Date:
Streams are not supported yet, hence the message.

There are various ways around this, all of which are discussed recently in
the list archives.

Peter

--
Peter T Mount  peter@retep.org.uk, me@petermount.com
Homepage: http://www.retep.org.uk Contact details: http://petermount.com
PostgreSQL JDBC: http://www.retep.org.uk/postgres/
Java PDF Generator: http://www.retep.org.uk/pdf/
----- Original Message -----
From: "Patrick Goodwill" <goodwill@cheese.stanford.edu>
To: <pgsql-interfaces@postgresql.org>
Sent: Friday, August 25, 2000 10:25 PM
Subject: [INTERFACES] JDBC + large objects problem


> I'm trying to use JDBC and BLOBS to store large amounts of text in a
> database.  I get a strange error when I try to use the conventional JDBC
> interface... it comes out with a SQL Exceptions of:
>
> "InputStream as parameter not supported"
>
> for the code:
>
>     Connection conn = pool.getConnection();
>     PreparedStatement pstmt =
> conn.prepareStatement("INSERT INTO t" + book
>       + "_data (author_id, title, text,
>                                       type) VALUES ( ?, ?, ?, ?)");
>     pstmt.setInt(1, userId);
>     pstmt.setString(2, title);
>     InputStream textStream = stringToStream(text);
>     pstmt.setBinaryStream(3, textStream, text.length());
>     pstmt.setInt(4, type);
>     pstmt.executeUpdate();
>     pstmt.close();
>
>
> ... with some helper functions....
>
>     private InputStream stringToStream(String string) {
> byte[] bytes = string.getBytes();
> ByteArrayInputStream stream = new ByteArrayInputStream(bytes);
> return (InputStream) stream;
>     }
>
>     private String      streamToString(InputStream stream) {
> try {
>     int length = stream.available();
>     byte[] bytes = new byte[length];
>     stream.read(bytes);
>     return new String(bytes);
> } catch (IOException e) {
>     System.out.println("No Stream");
> }
> return null;
>     }
>
> with an abbreviated schema of....
>
> >> \d t1_data
>                                Table "t1_data"
>  Attribute |  Type   |                       Modifier
> -----------+---------+-------------------------------------------------------
>  data_id   | integer | not null default
>                                nextval('t1_data_data_id_seq'::text)
>  author_id | integer |
>  title     | text    |
>  text      | oid     |
>  type      | integer |
>  time      | time    |
> Index: t1_data_pkey
>
>
> .... using postgresql 7.0 and the newest JDBC driver from retep.org.uk
>
>
> if ya'll have any ideas why it does what it does, i just might kiss your
> feet.  =)
>
> -Patrick.
>
>