Hi,
I've upgraded from
SLES9 with pg73b1jdbc3.jar, IBM Java 1.4.2, PostgreSQL 7.4.3
to
SLES10 with postgresql-8.1-404.jdbc3.jar, Sun Java 1.4.2, PostgreSQL 8.1.4,
and a program that previously handled 10,000,000+ rows without problems now fails with:
Exception in thread "main" java.lang.OutOfMemoryError
The error is thrown at the ResultSet.insertRow() call in my application after 100,000+ rows have been inserted from Oracle into the Postgres table.
With java -Xmx128M I can insert 200,000+ rows,
but I need 10,000,000+.
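Simplified, the copy loop looks roughly like this (class and variable names are placeholders; the real code copies specific columns):

import java.sql.ResultSet;
import java.sql.SQLException;

public class CopyLoopSketch {
    // oracleRs is the ResultSet over the Oracle source table,
    // outRs is the updatable ResultSet over the Postgres output table.
    static void copyRows(ResultSet oracleRs, ResultSet outRs) throws SQLException {
        int cols = oracleRs.getMetaData().getColumnCount();
        while (oracleRs.next()) {
            outRs.moveToInsertRow();
            // copying by position here; the real code copies named columns
            for (int col = 1; col <= cols; col++) {
                outRs.updateObject(col, oracleRs.getObject(col));
            }
            outRs.insertRow();   // <-- the OutOfMemoryError is thrown here
        }
    }
}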
I then supposed it was a Java memory management issue, so I tried closing and reopening the output ResultSet every 100,000 inserted records,
which got me to 1,000,000+ inserted records.
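The only change is the periodic close/reopen, roughly like this (outStmt and the class name are placeholders for whatever opens the output table):

import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class CopyWithReopen {
    // Same loop, but the output ResultSet is closed and reopened every
    // 100,000 inserted rows in the hope of releasing the accumulated rows.
    static void copyRows(ResultSet oracleRs, Statement outStmt) throws SQLException {
        ResultSet outRs = outStmt.executeQuery("SELECT * FROM outputTable WHERE oid=0");
        int cols = oracleRs.getMetaData().getColumnCount();
        int inserted = 0;
        while (oracleRs.next()) {
            outRs.moveToInsertRow();
            for (int col = 1; col <= cols; col++) {
                outRs.updateObject(col, oracleRs.getObject(col));
            }
            outRs.insertRow();
            if (++inserted % 100000 == 0) {
                outRs.close();   // drop whatever the ResultSet has accumulated
                outRs = outStmt.executeQuery("SELECT * FROM outputTable WHERE oid=0");
            }
        }
        outRs.close();
    }
}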
But I need 10,000,000+, and I can't get there just by raising java -Xmx.
Next I checked with the old pg73b1jdbc3.jar and got exactly the same result.
The output ResultSet is opened with SELECT * FROM outputTable WHERE oid=0, because I only need to append rows.
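For reference, that is roughly how the output ResultSet gets opened (connection setup omitted; the scrollable/updatable statement flags are my understanding of what insertRow() requires):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class OpenOutput {
    // WHERE oid=0 matches no existing rows, so the ResultSet starts empty
    // and is only used to append rows via moveToInsertRow()/insertRow().
    static ResultSet openOutput(Connection pgConnection) throws SQLException {
        Statement outStmt = pgConnection.createStatement(
                ResultSet.TYPE_SCROLL_INSENSITIVE,   // scrollable result set
                ResultSet.CONCUR_UPDATABLE);         // updatable, so insertRow() is allowed
        return outStmt.executeQuery("SELECT * FROM outputTable WHERE oid=0");
    }
}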
Google turned up the advice to "Reduce the amount of memory you need during processing of the ResultSets",
but setFetchSize(1) has no effect.
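I set it roughly like this (the exact placement is illustrative):

import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchSizeTry {
    // setFetchSize(1) as a hint to the driver to fetch one row at a time;
    // it can also be set on the ResultSet itself.
    static ResultSet openWithFetchHint(Statement outStmt) throws SQLException {
        outStmt.setFetchSize(1);
        return outStmt.executeQuery("SELECT * FROM outputTable WHERE oid=0");
    }
}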
How is this supposed to be done?
Next, should I remove Sun Java and install IBM Java?
Why did my program work before with the default configuration?
What should I try next?
I can't believe I'm the only one with this problem.
Maybe I'm too new to Java and am missing some requirement.
Any advice is welcome.
Vidas