Check whether f2.length() is larger than Integer.MAX_VALUE for your
big file.
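A minimal sketch of that guard (the class name and the fallback file name are hypothetical; setBinaryStream(int, InputStream, int) only accepts an int length):

```java
import java.io.File;

public class LengthCheck {
    // True when the file length fits in the int that
    // PreparedStatement.setBinaryStream(int, InputStream, int) accepts.
    static boolean fitsInInt(long length) {
        return length <= Integer.MAX_VALUE;
    }

    public static void main(String[] args) {
        File f2 = new File(args.length > 0 ? args[0] : "photo.jpg");
        long len = f2.length(); // 0 if the file does not exist
        if (!fitsInInt(len)) {
            System.err.println("file too large for setBinaryStream: " + len);
        }
    }
}
```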
--
Éric Paré for/pour LexUM
Université de Montréal
Centre de recherche en droit public
C.P. 6128, succ. Centre-ville
Montréal (Qc) Canada H3C 3J7
+1 514-343-6111 #0873
paree@lexum.umontreal.ca
I am using pg74.215.jdbc3.jar, and server version() reports: PostgreSQL 7.4.6 on
i486-pc-linux-gnu, compiled by GCC gcc (GCC) 3.3.4 20040623 (Gentoo Linux
3.3.4-r1, ssp-3.3.2-2, pie-8.7.6). The server setup is "out of the box" apart
from PGOPTS="-N 16 -B 2048 -i" (N was reduced from 1024 to get the server to
start; the host machine has 128 MB of RAM).
My code is very similar to the example given in the documentation. It seems
to work OK for files up to around 780k, but with a larger file (e.g.
5,498k) it seems to just "hang": no exception, no error report.
c.setAutoCommit(false);
try
{
    File f2 = new File(fullPath);
    FileInputStream is = new FileInputStream(f2);
    int image_type_id = 1;
    log.info("uploading file " + fullPath + " size " + f2.length());
    PreparedStatement ps = c.prepareStatement(
        "insert into photo (catalog_id, image_type_id, width, height, picture)"
        + " values (?, ?, ?, ?, ?)");
    ps.setInt(1, catalog_id);
    ps.setInt(2, image_type_id);
    ps.setInt(3, width);
    ps.setInt(4, height);
    // setBinaryStream seems to work OK with small files, not with larger files
    ps.setBinaryStream(5, is, (int) f2.length());
    ps.executeUpdate();
    ps.close();
    is.close();
    c.commit();
}
and the definition of the photo table is:
create table photo
(
    catalog_id int,
    image_type_id int,
    width int,
    height int,
    picture bytea,
    foreign key (catalog_id) references catalog(id),
    foreign key (image_type_id) references image_type(id)
);
create index photo_cat_id_idx on photo (catalog_id);