Thread: Memory exception
Hi

I have a problem in processing large numbers of rows in ResultSets. Here's the code fragment where the problem occurs:

...
  ResultSet rset = stmt.executeQuery( "select ... from ... where ... order by ..." ) ;
  while(rset.next() )
  { // Process the row }
...

The query addresses some 100'000 rows. When stmt.executeQuery() is executed suddenly an exception occurs. The same program works fine with other JDBC drivers.

Any ideas ?

Greetings
Peter
> The query addresses some 100'000 rows. When stmt.executeQuery() is
> executed suddenly an exception occurs.
> The same program works fine with other JDBC drivers.

Any other clues? Does it just say "memory exception", or is there more?

I've had terrible performance occur when retrieving huge ResultSets (presumably because I had caused swapping to occur at the OS level), but not an exception.

-Nick

--------------------------------------------------------------------------
Nick Fankhauser    nickf@ontko.com    Phone 1.765.935.4283    Fax 1.765.962.9788
Ray Ontko & Co.    Software Consulting Services    http://www.ontko.com/
----- Original Message -----
From: "Peter Wasem" <peter.wasem@itag.ch>
To: <pgsql-jdbc@postgresql.org>
Sent: Wednesday, November 07, 2001 11:18 AM
Subject: [JDBC] Memory exception

> Hi
>
> I have a problem in processing large numbers of rows in ResultSets.
> Here's the code fragment where the problem occurs:
>
> ...
>   ResultSet rset = stmt.executeQuery( "select ... from ... where ... order by ..." ) ;
>   while(rset.next() )
>   { // Process the row }
> ...
>
> The query addresses some 100'000 rows. When stmt.executeQuery() is
> executed suddenly an exception occurs.
> The same program works fine with other JDBC drivers.

You haven't been clear on what exact exception is thrown, but a wild guess is that executeQuery throws an exception because the SQL of the query is illegal for PostgreSQL. Of course, without having seen your query and the exception, this is all speculation. Could you post it?

Regards
Per-Olof Norén
----- Original Message -----
From: "Nick Fankhauser" <nickf@ontko.com>
To: "Peter Wasem" <peter.wasem@itag.ch>; <pgsql-jdbc@postgresql.org>
Sent: Wednesday, November 07, 2001 2:52 PM
Subject: Re: [JDBC] Memory exception

> > The query addresses some 100'000 rows. When stmt.executeQuery() is
> > executed suddenly an exception occurs.
> > The same program works fine with other JDBC drivers.
>
> Any other clues? Does it just say "memory exception", or is there more?
>
> I've had terrible performance occur when retrieving huge ResultSets
> (presumably because I had caused swapping to occur at the OS level), but
> not an exception.

I am using viennaSQL (a Swing-based JDBC GUI client) to retrieve huge ResultSets (800'000 rows). The Java VM needs to be launched with appropriate memory settings, for example:

  java -Xmx512m MyClass

which allows this VM 512 megabytes of heap memory. For this to be the reason, though, the exception should have been an OutOfMemoryError rather than a generic "memory exception". Anyway, try starting your app with more memory available.

ps: disregard my last posting, I wasn't reading the subject line too well :-)

ds
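A quick way to confirm that the -Xmx flag actually reached the VM is to print the heap figures at startup. This is only a minimal sketch; the class name is made up for illustration, and Runtime.maxMemory() needs a 1.4-or-later JDK:

  public class HeapCheck {
      public static void main(String[] args) {
          Runtime rt = Runtime.getRuntime();
          // maxMemory() reports the -Xmx ceiling the VM is allowed to grow to.
          System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
          // totalMemory()/freeMemory() show what is currently allocated and unused.
          System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
          System.out.println("free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
      }
  }

Running it as "java -Xmx512m HeapCheck" should report a max heap of roughly 512 MB; if it doesn't, the flag isn't being passed to the VM that actually runs the application.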
Peter,

Postgres will return the entire result into memory first before processing the first row of data. If you don't want this behavior you should use explicit cursors and fetch statements to get data a few rows at a time. See the postgres documentation on the 'cursor' and 'fetch' sql statements.

thanks,
--Barry

Peter Wasem wrote:

> Hi
>
> I have a problem in processing large numbers of rows in ResultSets.
> Here's the code fragment where the problem occurs:
>
> ...
>   ResultSet rset = stmt.executeQuery( "select ... from ... where ... order by ..." ) ;
>   while(rset.next() )
>   { // Process the row }
> ...
>
> The query addresses some 100'000 rows. When stmt.executeQuery() is
> executed suddenly an exception occurs.
> The same program works fine with other JDBC drivers.
>
> Any ideas ?
>
> Greetings
> Peter
Hi Barry

This is exactly what I observed. The physical memory dropped down to almost zero (but no swap was used) before the exception was thrown. I'm not sure if I can use cursors (other than ResultSet) since my application is Java-based.

Thanks

Barry Lind wrote:

> Peter,
>
> Postgres will return the entire result into memory first before
> processing the first row of data. If you don't want this behavior you
> should use explicit cursors and fetch statements to get data a few rows
> at a time. See the postgres documentation on the 'cursor' and 'fetch'
> sql statements.
>
> thanks,
> --Barry
Is it possible to take this approach with a query that has an "order by" in it?

-Nick

> -----Original Message-----
>
> Hi Barry
>
> This is exactly what I observed. The physical memory dropped down to almost
> zero (but no swap was used) before the exception was thrown. I'm not sure if I
> can use cursors (other than ResultSet) since my application is Java-based.
>
> Thanks
Peter,

You certainly can do this with java. I do in my code in many places. Consider the following example that processes all rows in the table 'bar' by fetching 10 rows at a time.

thanks,
--Barry

PreparedStatement l_cursorstmt = connection.prepareStatement(
    "declare foo_cursor cursor for select foo from bar");
PreparedStatement l_fetchstmt = connection.prepareStatement(
    "fetch forward 10 from foo_cursor");

//open the cursor
l_cursorstmt.execute();

ResultSet l_rset;
int l_count;
do {
    //continue looping until you get no rows returned in a fetch
    l_count = 0;
    //fetch rows from the cursor using the fetch statement
    l_rset = l_fetchstmt.executeQuery();
    while (l_rset.next()) {
        l_count++;
        //process the rows here
    }
} while (l_count > 0);

//don't forget to close the cursor
PreparedStatement l_closestmt = connection.prepareStatement("close foo_cursor");
l_closestmt.execute();

Peter Wasem wrote:

> Hi Barry
>
> This is exactly what I observed. The physical memory dropped down to almost
> zero (but no swap was used) before the exception was thrown. I'm not sure if I
> can use cursors (other than ResultSet) since my application is Java-based.
>
> Thanks
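One caveat the example above doesn't show: PostgreSQL cursors only live for the duration of a transaction, and the server rejects DECLARE CURSOR outside a transaction block, so with JDBC's default auto-commit mode the declare/fetch loop won't work as-is. A minimal sketch of the wrapping, with error handling kept deliberately simple:

  // Cursors exist only inside a transaction, so switch off auto-commit first;
  // with auto-commit on, each statement runs in its own transaction and the
  // cursor would be gone before the first fetch.
  connection.setAutoCommit(false);
  try {
      // ... declare foo_cursor, loop over "fetch forward 10", and close
      //     foo_cursor exactly as in the example above ...
      connection.commit();
  } catch (SQLException e) {
      connection.rollback();
      throw e;
  } finally {
      connection.setAutoCommit(true);
  }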
Nick,

If by 'this approach' you mean using explicit cursors and fetch statements, the answer is yes. The query in the cursor is run in the same way as without a cursor. Each fetch statement allows you to get an additional set of rows. There is also a MOVE statement that lets you position the cursor at a specific row without returning it, and FETCH lets you fetch backwards or forwards through the cursor.

You can think of cursors as simply having the result set maintained on the server instead of the client. Fetch and move allow you to access the rows.

thanks,
--Barry

Nick Fankhauser wrote:

> Is it possible to take this approach with a query that has an "order by" in
> it?
>
> -Nick
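To make that concrete, here is a rough sketch of MOVE from JDBC, reusing the foo_cursor declared in the earlier example; the skip count of 1000 is arbitrary, and a backward fetch may require declaring the cursor with SCROLL on newer servers:

  // Skip the first 1000 rows on the server without transferring them,
  // then read the next 10 rows of the ordered result.
  Statement l_stmt = connection.createStatement();
  l_stmt.execute("move forward 1000 in foo_cursor");
  ResultSet l_rset = l_stmt.executeQuery("fetch forward 10 from foo_cursor");
  while (l_rset.next()) {
      // rows 1001..1010 are processed here
  }
  // "fetch backward 10 from foo_cursor" steps back through the same result.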
Hi Barry

Looks good! I thought cursors were only available in embedded SQL. So this approach seems to work for me. Thanks for your help.

Peter

Barry Lind wrote:

> Peter,
>
> You certainly can do this with java. I do in my code in many places.
> Consider the following example that processes all rows in the table
> 'bar' by fetching 10 rows at a time.