Matthew Hagerty wrote:
>
> Greetings,
>
> I keep seeing references to an 8K limit on query results. So how do you deal
> with query results greater than 8K? Am I under the wrong impression that
> when you run a query from, say, libpq, each fetched row is returned only
> when you execute a fetchrow? Or is the whole query result stuffed into a
> buffer which has an 8K limit?
The current 8K limit is on the query text itself (and, separately, on
single rows), not on query _results_.
> I have to deal with some queries that create a huge result, and/or reports
> that require a large result set. How can I handle these larger results?
If you don't want to digest a huge result in one go, you can do a
DECLARE CURSOR - FETCH - FETCH - FETCH - CLOSE sequence to get it in
pieces
(it needs to be inside BEGIN TRANSACTION - END TRANSACTION).
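Something like the following (bigtable and bigcur are just placeholder
names for this sketch):

    BEGIN TRANSACTION;
    DECLARE bigcur CURSOR FOR
        SELECT * FROM bigtable;
    FETCH FORWARD 100 IN bigcur;   -- first 100 rows
    FETCH FORWARD 100 IN bigcur;   -- next 100; repeat until nothing comes back
    CLOSE bigcur;
    END TRANSACTION;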
> I'll probably get flamed for asking this one, but how can I step through a
> table? Hey, sometimes you just have to do it.
See above.
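The same cursor trick steps through a table one row at a time if you
FETCH with a count of 1 (again, the table and cursor names are made up):

    BEGIN TRANSACTION;
    DECLARE rowcur CURSOR FOR
        SELECT * FROM sometable;
    FETCH FORWARD 1 IN rowcur;     -- exactly one row per FETCH
    FETCH FORWARD 1 IN rowcur;     -- the next row, and so on
    CLOSE rowcur;
    END TRANSACTION;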
Actually, Postgres is quite happy to return gigabytes of data, if you have
the resources (disk space, memory, time ;)
--------------
Hannu