Re: Query killed with Out of memory - Mailing list pgsql-general

From Chris Mair
Subject Re: Query killed with Out of memory
Date
Msg-id 182da2374c6b79edce4f5baabc05afdb@smtp.hushmail.com
Whole thread Raw
In response to Query killed with Out of memory  (Job <Job@colliniconsulting.it>)
List pgsql-general
> With a heavy query, when the number of result rows rises over 600k, the query hangs with out of memory.
>
> Here is the explain analyze:
> [...]
> work_mem is 512MB, shared_buffers 3084MB and system RAM 10GB. Postgres version is 8.4.8 and for some months I cannot upgrade.
>
> Is there a way to solve the problem?

Hi,

a few ideas:

  - what goes out of memory? The client? If you query from a programming language, set the fetch/batch size
    to some value so that the driver won't pull all 600k rows into memory at once... for the psql client you can do
    \set FETCH_COUNT 1000

  - work_mem = 512 MB is high-ish unless you have a small value for max_connections... note that each sort or
    hash node in a query can use up to work_mem, so a single complex query may allocate it several times over

  - 8.4.8 was released in 2011; the latest 8.4 release is 8.4.22, so you're missing lots of patches (and 8.4 was EOLed
    more than two years ago)
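
The first point can be sketched in code: a small helper that drains a DB-API cursor in fixed-size chunks so the client never buffers the full result set. This is an illustrative sketch, not from the original mail; the psycopg2 usage shown in the comment (connection string, table name) is a placeholder.

```python
def fetch_in_batches(cursor, size=1000):
    """Yield rows from a DB-API cursor in chunks of `size` to bound client memory."""
    while True:
        rows = cursor.fetchmany(size)
        if not rows:
            break
        for row in rows:
            yield row

# With psycopg2 you can go one step further and use a *named* (server-side)
# cursor, so the server holds the result set and the client only buffers
# `itersize` rows at a time (DSN and table are hypothetical):
#
#   import psycopg2
#   conn = psycopg2.connect("dbname=test")
#   cur = conn.cursor(name="big_query")   # named => server-side cursor
#   cur.itersize = 1000
#   cur.execute("SELECT * FROM big_table")
#   for row in cur:
#       process(row)
```

Without a named cursor, most drivers fetch the entire result set on execute(), which is exactly what blows up the client at 600k rows.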

Bye,
Chris.
