Thread: java.lang.OutOfMemoryError with a large number of inserts
Hi.

At one point in my program, I was inserting a large number of rows (~600,000) using a prepared statement. Here are some things I noticed.

1. It's very slow! :-)
2. Initially, the CPU monitor shows a high level of CPU utilisation, but this slowly drops off, with a blip every second or so. Basically, things slow down as the process chugs along.
3. I eventually get a java.lang.OutOfMemoryError after approx 1:00-1:30 hrs.

I tried running the JVM with the heap size set to 50 Mbytes. I still get the out-of-memory error.

I can't use the 'load' command instead of individual inserts.

Is there a faster/better way? Any ideas on how I can get rid of the out-of-memory errors?

Thanks in advance

--Arsalan
-------------------------------------------------
WAR IS PEACE
FREEDOM IS SLAVERY
IGNORANCE IS STRENGTH
--1984
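The usual JDBC remedy for slow row-at-a-time inserts is batching: turn off autocommit, add rows to a batch on the prepared statement, and flush every few thousand rows. The sketch below assumes a hypothetical `items (id, name)` table and an already-open `Connection`; the batch size is an arbitrary illustration, not a recommendation from the thread.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsert {
    // Sketch: batched inserts with periodic commits. Table/column names
    // are hypothetical; tune BATCH_SIZE for your own workload.
    static void insertRows(Connection conn, int totalRows) throws SQLException {
        final int BATCH_SIZE = 1000;
        conn.setAutoCommit(false); // commit per batch, not per row
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO items (id, name) VALUES (?, ?)")) {
            for (int i = 0; i < totalRows; i++) {
                ps.setInt(1, i);
                ps.setString(2, "row-" + i);
                ps.addBatch();
                if ((i + 1) % BATCH_SIZE == 0) {
                    ps.executeBatch();
                    ps.clearWarnings();   // drop any accumulated SQLWarning chain
                    conn.commit();
                }
            }
            ps.executeBatch();            // flush the final (possibly partial) batch
            conn.commit();
        }
    }

    // Pure helper: how many executeBatch() calls the loop above makes
    // (one per full batch, plus the final flush).
    static int batchCalls(int totalRows, int batchSize) {
        return totalRows / batchSize + 1;
    }
}
```

Whether batching helps the memory leak itself depends on the driver, but committing in chunks at least bounds the per-transaction state.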
On Wed, 3 Oct 2001 12:43:12 +0530, you wrote:
>At one point in my program, I was inserting a large number of rows
>(~600,000) using a prepared statement. Here are some things I noticed.

Can you post a code snippet that reproduces the problem?

Regards,
René Pijlman <rene@lab.applinet.nl>
At 12:13 AM 10/3/2001, Arsalan Zaidi wrote:
>At one point in my program, I was inserting a large number of rows
>(~600,000) using a prepared statement. Here are some things I noticed.
>
>1. It's very slow! :-)
>2. Initially, the CPU monitor shows a high level of CPU utilisation, but
>this slowly drops off with a blip every second or so. Basically, things
>slow down as the process chugs along.
>3. I eventually get a java.lang.OutOfMemoryError after approx 1:00-1:30 hrs.

This is a total guess, but is it possible that the PreparedStatement is gathering warnings on every execution? Try doing

    stmt.clearWarnings();

inside the loop. I assume that setting parameter values replaces the old values immediately, so there's no real reason to do

    stmt.clearParameters();

each time as well, but it probably couldn't hurt. :)

Peace,
Dave
You wrote:
> Here's the offending loop. BTW, I messed up with the
> total number of rows, it's ~6,000,000, not 600,000.
[...]
> while(dbResult.next())

Unfortunately, PostgreSQL clients always retrieve the entire result set from the server before it is processed by the .next() loop. In your case this means that the 6,000,000 rows are first read by the Java client into one huge ResultSet. This will no doubt require much more than 50M of memory.

Can't you perform this operation in one or more SQL statements, or procedurally in PL/pgSQL on the server, instead of retrieving all the data to the client?

Cheers,
René Pijlman
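If retrieving the rows to the client is unavoidable, later versions of the PostgreSQL JDBC driver can fetch a result set through a server-side cursor instead of buffering it whole: set a fetch size on the statement and run with autocommit off. A minimal sketch, assuming such a driver and a hypothetical `items` table:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingRead {
    // Sketch: cursor-based fetching so the driver keeps only about
    // `fetchSize` rows in memory at a time. Assumes a driver that
    // honours setFetchSize(); cursors require an open transaction,
    // hence autocommit off.
    static void scanLargeTable(Connection conn) throws SQLException {
        conn.setAutoCommit(false);
        try (Statement st = conn.createStatement()) {
            st.setFetchSize(1000);            // stream in 1000-row windows
            try (ResultSet rs = st.executeQuery("SELECT id, name FROM items")) {
                while (rs.next()) {
                    process(rs.getInt(1), rs.getString(2)); // one row at a time
                }
            }
        }
        conn.commit();
    }

    static void process(int id, String name) { /* application logic */ }

    // Pure helper: server round trips needed for `rows` at a given fetch size.
    static int fetchTrips(int rows, int fetchSize) {
        return (rows + fetchSize - 1) / fetchSize;
    }
}
```

With a fetch size of 1000, the 6,000,000-row scan becomes many small fetches rather than one 6,000,000-row buffer.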
Re: [INTERFACES] java.lang.OutOfMemoryError with a large number of inserts
From: "Arsalan Zaidi"
Would it help if I moved this particular bit of code on to the server side as a server-side function? Would there be a significant increase in speed/efficiency?

Thanks in advance.

--Arsalan

> Break up your insert into several chunks.
>
> Arsalan Zaidi wrote:
> > Hi.
> >
> > At one point in my program, I was inserting a large number of rows
> > (~600,000) using a prepared statement. Here are some things I noticed.
> >
> > 1. It's very slow! :-)
> > 2. Initially, the CPU monitor shows a high level of CPU utilisation, but
> > this slowly drops off with a blip every second or so. Basically, things
> > slow down as the process chugs along.
> > 3. I eventually get a java.lang.OutOfMemoryError after approx 1:00-1:30 hrs.
> >
> > I tried running the JVM with heap size set to 50 Mbytes. I still get the
> > out of mem error.
> >
> > I can't use the 'load' command instead of individual inserts.
> >
> > Is there a faster/better way? Any ideas on how I can get rid of the out
> > of mem errors?
> >
> > Thanks in advance
> >
> > --Arsalan
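Moving the work server-side often means no Java loop at all: if the target rows can be derived from existing tables, a single `INSERT ... SELECT` keeps every row on the server. A sketch under that assumption, with hypothetical `items` and `summary` tables:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class ServerSideInsert {
    // Pure helper: builds the single-statement, server-side form of the copy.
    // Table and column names here are hypothetical placeholders.
    static String serverSideInsertSql(String srcTable, String dstTable) {
        return "INSERT INTO " + dstTable
             + " (id, name) SELECT id, upper(name) FROM " + srcTable;
    }

    // Runs the whole transform on the server; no rows cross the wire,
    // so client heap size no longer matters. Returns the row count.
    static int copyAll(Connection conn) throws SQLException {
        try (Statement st = conn.createStatement()) {
            return st.executeUpdate(serverSideInsertSql("items", "summary"));
        }
    }
}
```

When the logic is too involved for one statement, the same idea applies via a server-side function called from JDBC, which still avoids shipping 6,000,000 rows to the client.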