Yes. I figured lowering the batch size would help, and it did. But why
the new driver should behave differently is still a mystery. Thanks
anyway.
PS: I'm not using bytea in my system, so I'm not sure I can help with
your testing. However, if you need me to set up some testing, let me
know.
Thanks and Regards,
Sharib Anis
Senior Research Engineer
Wireless Intellect Labs Pte Ltd
A MobileOne Company
http://www.wilabs.com
sharib.anis@wilabs.com
DID: +65-6843 8672; Fax: +65-6560-4950 (TZ: +0800 GMT)
There are 10 kinds of people in the world, those who understand binary
and those who don't.
DISCLAIMER: This email (including any attachments) is intended for the
recipient(s) named above and may contain information that is confidential
to Wireless Intellect Labs Pte Ltd. Any use of the information (including,
but not limited to, total or partial reproduction, distribution or
dissemination in any form) by persons other than the intended recipient(s)
is prohibited. If you are not an intended recipient of this email, please
notify the sender immediately and delete it. Any views expressed in this
message are those of the individual sender, except where the sender states
them, with requisite authority, to be those of Wireless Intellect Labs
Pte Ltd.
-----Original Message-----
From: Oliver Jowett [mailto:oliver@opencloud.com]
Sent: 01 June 2004 08:39
To: Sharib Anis
Cc: pgsql-jdbc@postgresql.org
Subject: Re: [JDBC] Postgresql 7.4.2 and OutOfMemoryError
Sharib Anis wrote:
> Hello All,
>
> I'm using Postgresql for our Java application. I started with Postgres
> 7.2.1 and everything worked fine. Then, however, I decided to upgrade
> to 7.4.2.
>
> Since then I'm facing this frustrating problem where I'm always
> getting OutOfMemoryError. Basically, there is an operation which
> involves inserts/updates in batches of 20K records. With the earlier
> version of Postgresql and driver, it worked smoothly, but not now. Any
> ideas why? What should I do to overcome this? I'm using the latest 7.4
> driver too. I suspect it's an issue with the driver.
20k sounds high for batch updates. The driver has to keep the batched
statements in memory until executeBatch(). Lowering the batch size may
help. I find that, over a 100mb network, there's little-to-no benefit
from batch sizes above 100 (@ 1k data per insert), even when the driver
is really doing batching (which the current driver doesn't do).
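The fix suggested above, flushing smaller sub-batches rather than
accumulating all 20K rows before a single executeBatch(), can be
sketched roughly as follows. This is only an illustration: the class
name, the table and column names in the commented usage, and the
FLUSH_SIZE of 100 (taken from the observation above) are all my own
assumptions, not anything from the actual application.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch: flush a JDBC batch every FLUSH_SIZE rows instead of
 * accumulating everything before executeBatch(). The driver buffers
 * batched statements in memory until executeBatch(), so bounding the
 * batch size bounds heap use.
 */
public class ChunkedBatchInsert {

    // Assumed value, based on the "little benefit above 100" note above.
    static final int FLUSH_SIZE = 100;

    // Pure helper: given a total row count, return the 1-based row
    // indices after which executeBatch() should be called.
    public static List<Integer> flushPoints(int totalRows) {
        List<Integer> points = new ArrayList<>();
        for (int i = FLUSH_SIZE; i < totalRows; i += FLUSH_SIZE) {
            points.add(i);
        }
        if (totalRows > 0) {
            points.add(totalRows); // final, possibly partial, batch
        }
        return points;
    }

    /* Usage with a real connection (hypothetical table "t"; not run here):
     *
     * try (PreparedStatement ps =
     *         conn.prepareStatement("INSERT INTO t (id, data) VALUES (?, ?)")) {
     *     int pending = 0;
     *     for (Row r : rows) {
     *         ps.setInt(1, r.id);
     *         ps.setBytes(2, r.data);
     *         ps.addBatch();
     *         if (++pending == FLUSH_SIZE) {
     *             ps.executeBatch(); // releases the buffered statements
     *             pending = 0;
     *         }
     *     }
     *     if (pending > 0) ps.executeBatch();
     * }
     */
}
```

The only memory cost with this pattern is FLUSH_SIZE buffered rows at a
time, at the price of one extra network round trip per flush.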
If you don't mind trying a very experimental driver, I have a version
with substantial improvements to bytea support and batch updates waiting
in the wings. I've seen a 5-fold speedup (and a drastic reduction in
memory use) in a microbenchmark that does batch inserts of bytea data.
Let me know if you'd like to try that driver.
(I'm looking for testers if any other adventurous people want to try it
out..)
-O