Re: Tuning Postgres for single user manipulating large amounts of data - Mailing list pgsql-general

From Andy Colson
Subject Re: Tuning Postgres for single user manipulating large amounts of data
Date
Msg-id 4D00EEE8.70408@squeakycode.net
In response to Re: Tuning Postgres for single user manipulating large amounts of data  (Andy Colson <andy@squeakycode.net>)
Responses Re: Tuning Postgres for single user manipulating large amounts of data  (Reid Thompson <Reid.Thompson@ateb.com>)
List pgsql-general
On 12/9/2010 8:50 AM, Andy Colson wrote:
> On 12/9/2010 6:25 AM, Paul Taylor wrote:
>> Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
>> I'm using the database with just one db connection to build a lucene
>> search index from some of the data, and I'm trying to improve
>> performance. The key thing is that I'm only a single user but
>> manipulating large amounts of data, i.e. processing tables with up to 10
>> million rows in them, so I think I want to configure Postgres so that it
>> can create large temporary tables in memory.
>>
>> I've tried changing various parameters such as shared_buffers, work_mem
>> and checkpoint_segments, but I don't really understand what those values
>> mean, and the documentation seems to be aimed at configuring for
>> multiple users, and my changes make things worse. For example, my machine
>> has 2GB of memory, and I read that for a dedicated server you should set
>> shared_buffers to around 40% of total memory, but when I increase it to
>> more than 30MB Postgres will not start, complaining about my SHMMAX limit.
>>
>> Paul
>>
>
> You need to bump up your SHMMAX is your OS.

sorry: SHMMAX _in_ your OS.

It's an OS setting, not a PG one.
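
For reference, here's roughly what that looks like on OS X. This is just a
sketch: the sysctl keys come from the kernel-resources section of the
PostgreSQL manual, and the numbers are made-up examples sized for a 2GB
machine, so double-check them for your setup. The manual notes that on
recent OS X all five keys have to be set together, and shmmax has to be a
multiple of 4096. Put them in /etc/sysctl.conf and reboot:

    # largest single segment; must be a multiple of 4096 (~768MB here)
    kern.sysv.shmmax=805306368
    kern.sysv.shmmin=1
    kern.sysv.shmmni=32
    kern.sysv.shmseg=8
    # total shared memory, in 4kB pages (shmmax / 4096)
    kern.sysv.shmall=196608

Then you can raise the settings in postgresql.conf (again, example numbers
only) and restart Postgres:

    shared_buffers = 512MB       # fits under the new shmmax
    work_mem = 32MB              # per sort/hash, per connection; with one
                                 # user you can afford to be generous
    checkpoint_segments = 16     # fewer checkpoints during big bulk loads
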

-Andy

