Thread: Application takes a long time to fetch a large number of records from PostgreSQL 9.0.3

Hi,
My application takes a long time (we even see timeouts) when the data to be fetched from PostgreSQL 9.0.3 is around 1,900,000 records. I am making improvements at the application level, but is there any performance tuning I need to do on the database side?

I have not changed any parameters in postgresql.conf, so I am using the default values.

Any recommendations to improve performance?

My PC has 4 GB of RAM (Windows Server 2003 R2, 64-bit).

Regards
On 10/16/2012 12:24 PM, Deven Thaker wrote:
> Hi,
> My application takes a long time (we even see timeouts) when the data to
> be fetched from PostgreSQL 9.0.3 is around 1,900,000 records. I am making
> improvements at the application level, but is there any performance
> tuning I need to do on the database side?
>
> I have not changed any parameters in postgresql.conf, so I am using the
> default values.
>
> Any recommendations to improve performance?

I'm pretty sure I sent you a link that would help you answer that
yourself, or write a better question that would make it practical for
others to help you.

Allow me to repeat myself.

Please read this:

     https://wiki.postgresql.org/wiki/Slow_Query_Questions

then follow up with a complete question including exact query text,
EXPLAIN (ANALYZE, BUFFERS) results, etc.
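
For example, something along these lines (the table name here is just a
placeholder; substitute your real query):

    EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM your_table;

Paste the complete output into your reply, not just the total runtime.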

--
Craig Ringer


On 10/15/12 9:24 PM, Deven Thaker wrote:
>
> I have not changed any parameters in postgresql.conf, so I am using the
> default values.
>
> Any recommendations to improve performance?

Yes. The default values are suitable for a VERY small computer.

Increasing shared_buffers and work_mem will give a significant performance
increase.
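
As a rough starting point on a box like that, something in this ballpark
in postgresql.conf might be reasonable (these are guesses to tune from,
not tested values):

    shared_buffers = 512MB   # typically only 32MB by default; needs a server restart
    work_mem = 16MB          # applies per sort/hash operation, per connection

Keep work_mem modest if you have many concurrent connections, since each
one can use that much for every sort or hash in a query.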


You've got a 64-bit Windows server with just 4GB of RAM? That's rather
limiting in this day and age, where 16GB of RAM is like $70. Many x86
systems with 4GB can actually only use about 3.2GB, give or take, and
then Windows Server is likely using on the order of 1GB of that, leaving
maybe 2GB for your database server and any other services and
applications you're running on the same machine.

You say your query is returning almost 2 million records? How large are
these records? If they average 500 bytes each, that's about 1GB right
there just to hold a copy of the result set from that query.
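
Roughly: 1,900,000 rows * 500 bytes/row is about 950 MB, before whatever
per-row overhead your client driver adds on top of that.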





--
john r pierce                            N 37, W 122
santa cruz ca                         mid-left coast





On Mon, Oct 15, 2012 at 9:24 PM, Deven Thaker <devenmthaker@gmail.com> wrote:
> Hi,
> My application takes a long time (we even see timeouts) when the data to
> be fetched from PostgreSQL 9.0.3 is around 1,900,000 records. I am making
> improvements at the application level, but is there any performance
> tuning I need to do on the database side?

When you say "fetched", do you mean the SELECT query itself takes a long time, or that FETCH operations on a cursor get slow after a certain point?

Best Wishes,
Chris Travers


On 10/16/2012 12:40 PM, Craig Ringer wrote:
> On 10/16/2012 12:24 PM, Deven Thaker wrote:
>> Hi,
>> My application takes a long time (we even see timeouts) when the data
>> to be fetched from PostgreSQL 9.0.3 is around 1,900,000 records. I am
>> making improvements at the application level, but is there any
>> performance tuning I need to do on the database side?
>>
>> I have not changed any parameters in postgresql.conf, so I am using the
>> default values.
>>
>> Any recommendations to improve performance?

Hi

My earlier reply was a tad grumpy; my apologies. The point stands, but
the wording could've been nicer.

It isn't really clear *where* the slowness is. That's why I'm asking for
EXPLAIN (BUFFERS, ANALYZE) results. If that's fast then it tells you the
problem is somewhere else.

What is the client application? What database driver does it use -
PgJDBC? libpq? psqlODBC? npgsql? Something else? What language is it
written in? Does it read the whole result set into memory at once, or
does it use a cursor?

If you're reading the whole result set into memory at once, you might
want to consider using DECLARE and FETCH:

   http://www.postgresql.org/docs/current/static/sql-declare.html

so you can process the result in batches rather than holding it all in
memory at once.
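
A minimal sketch of the cursor approach (the table name and batch size
are just placeholders):

   BEGIN;
   DECLARE big_cur CURSOR FOR SELECT * FROM your_table;
   FETCH 10000 FROM big_cur;   -- repeat until no rows come back
   CLOSE big_cur;
   COMMIT;

Each FETCH returns the next batch, so the client only ever has to hold a
few thousand rows at a time.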

--
Craig Ringer