Thread: CURSOR OR OFFSET/LIMIT

CURSOR OR OFFSET/LIMIT

From: Kaloyan Iliev
Hi,

I am working on a web-based application using Perl and Apache.
I have to show users some query results in pages.
Sometimes the result can be over 1000 rows (and sometimes much more).
The question is how best to do this.

One way is to use OFFSET and LIMIT. That's OK, but the whole query
must be parsed and executed on every request.
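For example, something like this (the table name, ordering column and
page size are only placeholders):

    -- page 4 with 50 rows per page: skip 3 * 50 rows, return the next 50
    SELECT *
    FROM some_table
    ORDER BY id
    LIMIT 50 OFFSET 150;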

Using cursors would be better, but my problem is that cursors live only in
the current transaction.
So when the web server request finishes, I've lost the transaction and the
cursor.

Some software written by my colleagues opens a transaction and a cursor on
every server request, moves to the requested page, and shows the result
(after that the script finishes, and so does the transaction).
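Roughly, what it does on each request is something like this (simplified,
with placeholder names):

    BEGIN;
    DECLARE page_cur CURSOR FOR
        SELECT * FROM some_table ORDER BY id;
    MOVE 150 IN page_cur;      -- skip to the start of the requested page
    FETCH 50 FROM page_cur;    -- the rows to show on this page
    COMMIT;                    -- script ends; the cursor goes away with the transaction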
So my question is: should I rewrite this to use OFFSET/LIMIT, or is it
better to create the cursor every time and use it to fetch the rows?
Is there a way to save the cursor between separate browser requests (and to
give it a time to live)? Or is it OFFSET and LIMIT after all?

Thanks in advance.

Kaloyan Iliev


Re: CURSOR OR OFFSET/LIMIT

From: John DeSoi
On Apr 6, 2006, at 10:48 AM, Kaloyan Iliev wrote:

> Using cursors would be better, but my problem is that cursors live
> only in the current transaction.
> So when the web server request finishes, I've lost the transaction
> and the cursor.


Cursors can live outside the transaction if you declare them with WITH
HOLD specified. But that still may not help you in a web environment
if you want to break the results into pages served on separate
requests (and possibly different connections).

http://www.postgresql.org/docs/8.1/interactive/sql-declare.html
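
For example (the cursor name and query are only illustrative):

    BEGIN;
    DECLARE results_cur CURSOR WITH HOLD FOR
        SELECT * FROM some_table ORDER BY id;
    COMMIT;                      -- a WITH HOLD cursor survives the commit
    FETCH 50 FROM results_cur;   -- still works on the same connection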

> Is there a way to save the cursor between separate browser requests
> (and to give it a time to live)?

Sure, but you need to add a lot of connection management to do this.
You would need to keep track of the cursors and make sure a
subsequent request uses the right connection.
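
Very roughly, the flow would have to look something like this, with every
request for that result set routed back to the same backend connection
(continuing the illustrative names from above):

    -- request 1 (connection A): DECLARE the WITH HOLD cursor as above, then
    FETCH 50 FROM results_cur;   -- page 1

    -- request 2: must come back to connection A to see the cursor
    FETCH 50 FROM results_cur;   -- page 2

    -- when the user is done (or after a timeout you manage yourself)
    CLOSE results_cur;           -- otherwise it lives until the connection closes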






John DeSoi, Ph.D.
http://pgedit.com/
Power Tools for PostgreSQL