Out of Memory and Configuration Problems (Big Computer) - Mailing list pgsql-general

From Tom Wilcox
Subject Out of Memory and Configuration Problems (Big Computer)
Date
Msg-id 4BFFBE3C.9020204@gmail.com
Responses Re: Out of Memory and Configuration Problems (Big Computer)  (Stephen Frost <sfrost@snowman.net>)
Re: Out of Memory and Configuration Problems (Big Computer)  (Bill Moran <wmoran@potentialtech.com>)
List pgsql-general
Hi,

I am fairly new to Postgres and have been using it with Python to develop a set of functions that operate on some moderately large tables (30 million rows with 13-15 columns).

I have spent some time messing with the configuration file to get PG to use indexes when planning queries, etc.

At the moment, I have a fairly fundamental, recurring problem: a query has run out of memory. This particular query is a sequential-scan UPDATE operating on a table with approximately 27,778,000 rows:

UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'');

ERROR: out of memory
DETAIL: Failed on request of size 36
Run time (est.): 6,000,000 ms

EXPLAIN shows the query plan as:

Seq Scan on tbl (cost=0.00..2088542.83 rows=59868855 width=128)

I have noticed that "rows=59xxxxxx" suggests the plan estimates roughly twice the number of rows actually in the table. Perhaps I am writing poor SQL?
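
In case that estimate is just stale planner statistics, I assume refreshing them would correct it, though I am not certain:

ANALYZE tbl;
-- the planner's stored row estimate can then be inspected with:
SELECT reltuples FROM pg_class WHERE relname = 'tbl';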

Can anyone suggest reasons why I might be running out of memory on such a simple query? Is it possible that, because it executes as a single transaction, the changes are not committed until the query completes, and the whole table is therefore being held in memory?
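
If the whole-table UPDATE is simply too much for a single transaction, I assume I could fall back to batching it by ranges of the primary key, roughly like this (assuming a hypothetical integer primary key column named id):

UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'')
WHERE id >= 0 AND id < 1000000;
-- ...then repeat for each successive range of id, committing between batches.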

Also, can anyone give me any pointers for configuring Postgres to use ALL 96GB of RAM in my new machine? I would like to know it is using everything available, especially when it is possible to load an entire 30 million row table into memory! I am currently using the default configuration from the standard installation.
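
For what it is worth, my reading of the docs suggests starting points along these lines in postgresql.conf; the figures below are my own guesses for a 96GB machine, not tested values:

shared_buffers = 8GB             # docs recommend a fraction of RAM, not all of it
effective_cache_size = 64GB      # planner hint for how much the OS is likely to cache
work_mem = 64MB                  # per sort/hash node, per backend, so keep it modest
maintenance_work_mem = 2GB       # used by VACUUM, CREATE INDEX, etc.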

Any help/suggestions are very much appreciated.

Cheers,
Tom
