Thread: Is there a size limit on setBinaryStream?

Is there a size limit on setBinaryStream?

From
jonathan.lister@vaisala.com
Date:

I am using pg74.215.jdbc3.jar and server version() is: PostgreSQL 7.4.6 on i486-pc-linux-gnu, compiled by GCC gcc (GCC) 3.3.4 20040623 (Gentoo Linux 3.3.4-r1, ssp-3.3.2-2, pie-8.7.6). Server setup is "out of the box" apart from PGOPTS="-N 16 -B 2048 -i" (N was reduced from 1024 to get the server to start .. host machine has 128 MB RAM)

My code is very similar to the example given in the documentation. It seems to work OK for files up to around 780k, but when I try a larger file (e.g. 5,498k) it seems to just "hang" .. no exception, no error report.

 
          c.setAutoCommit(false);
          try
          {
            File f2 = new File(fullPath);
            FileInputStream is = new FileInputStream(f2);
            int image_type_id = 1;
            log.info("uploading file " + fullPath + " size " + f2.length());
            PreparedStatement ps = c.prepareStatement(
                "insert into photo (catalog_id, image_type_id, width, height, picture) values (?, ?, ?, ?, ?)");
            ps.setInt(1, catalog_id);
            ps.setInt(2, image_type_id);
            ps.setInt(3, width);
            ps.setInt(4, height);
            // setBinaryStream seems to work ok with small files, not with larger files
            ps.setBinaryStream(5, is, (int) f2.length());
            ps.executeUpdate();
            ps.close();
            is.close();
            c.commit();
          }
          catch (Exception e)
          {
            c.rollback();
            log.error("upload failed", e);
          }

and the definition of the photo table is:
create table photo
(
  catalog_id int,
  image_type_id int,
  width int,
  height int,
  picture bytea,
    foreign key (catalog_id) references catalog(id),
    foreign key (image_type_id) references image_type(id)
);
create index photo_cat_id_idx on photo (catalog_id);
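
For reference, reading the image back out follows the same pattern in reverse; a minimal sketch against the same connection c (fragment only, like the insert code above):

          PreparedStatement q = c.prepareStatement(
              "select picture from photo where catalog_id = ?");
          q.setInt(1, catalog_id);
          ResultSet rs = q.executeQuery();
          if (rs.next())
          {
              // getBinaryStream exposes the bytea column as an InputStream
              InputStream img = rs.getBinaryStream(1);
              // ... consume the stream, then close it ...
          }
          rs.close();
          q.close();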

Re: Is there a size limit on setBinaryStream?

From
jonathan.lister@vaisala.com
Date:

Thanks for the good suggestion, Éric; code modified to:

          if (f2.length() < Integer.MAX_VALUE)
          {
             .. do insert
          }

but my files are smaller than Integer.MAX_VALUE, e.g.

FileSize: 5629061 Integer.MAX_VALUE: 2147483647

and still setBinaryStream just goes to sleep. Nothing obvious in the postgres server log either.
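
One way to see where it is stuck: the JVM prints a full thread dump to stdout when it receives SIGQUIT, so from another shell (pid placeholder to be filled in):

          kill -QUIT <jvm-pid>

The dump shows whether the insert thread is blocked inside the driver or the JVM is busy elsewhere.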


Re: Is there a size limit on setBinaryStream?

From
"Éric Paré"
Date:
Check to see if f2.length() is larger than Integer.MAX_VALUE with your
big file.

--
Éric Paré for/pour LexUM
Université de Montréal
Centre de recherche en droit public
C.P. 6128, succ. Centre-ville
Montréal (Qc) Canada H3C 3J7
+1 514-343-6111 #0873
paree@lexum.umontreal.ca

Re: Is there a size limit on setBinaryStream?

From
Oliver Jowett
Date:
jonathan.lister@vaisala.com wrote:
> I am using pg74.215.jdbc3.jar [...]

> My code is very similar to the example given in the documentation. It
> seems to work OK for files up to around 780k, but when I try a larger
> file (e.g. 5,498k it seems to just "hang" .. no exception, no error report).

I suspect continuous GC is the cause, especially if you're running with
a small heap. Try turning on -verbose:gc and see what you get.
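
A minimal invocation with GC logging and a bigger heap might look like this (the main class name PhotoUploader is hypothetical):

          java -verbose:gc -Xmx256m PhotoUploader

Back-to-back full collections in the output while the insert runs would confirm the heap is the bottleneck.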

You will have more success with large binary objects if you use the
development driver. The stable driver creates a lot of intermediate
representations when you use setBytes() or setBinaryStream(). This means
you need perhaps 10-20 times the stream's length in temporary heap space.

The development driver streams the data directly, and should only need a
small constant-sized intermediate buffer regardless of the stream length.
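
For illustration, streaming directly amounts to copying through one fixed-size buffer rather than materialising the whole value; a rough sketch, with in standing in for the FileInputStream and out for the driver's socket stream:

          byte[] buf = new byte[8192];   // one small buffer, reused for every chunk
          int n;
          while ((n = in.read(buf)) != -1)
          {
              out.write(buf, 0, n);      // each chunk is sent as it is read
          }

Memory use is bounded by the buffer size, not the file size.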

-O

Re: Is there a size limit on setBinaryStream?

From
jonathan.lister@vaisala.com
Date:

Excellent diagnostics, Oliver - I downloaded pg80b1.308.jdbc.jar and the calls to setBytes() started to work with files in the 4 - 7 Meg range without any other changes to my code.

Many thanks!


Re: Is there a size limit on setBinaryStream?

From
jonathan.lister@vaisala.com
Date:

oops, I meant *setBinaryStream()* of course!
