Tuning Postgres for single user manipulating large amounts of data - Mailing list pgsql-general

From: Paul Taylor
Subject: Tuning Postgres for single user manipulating large amounts of data
Msg-id: 4D00B9F6.2060409@fastmail.fm
Responses: Re: Tuning Postgres for single user manipulating large amounts of data (tv@fuzzy.cz)
List: pgsql-general

Hi, I'm using Postgres 8.3 on a MacBook Pro laptop. I'm using the
database over just one connection to build a Lucene search index from
some of the data, and I'm trying to improve performance. The key thing
is that I'm only a single user, but I'm manipulating large amounts of
data, i.e. processing tables with up to 10 million rows in them, so I
think I want to configure Postgres so that it can create large
temporary tables in memory.
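
To give a concrete idea of what I mean, the postgresql.conf settings
I've been looking at are along these lines (the values are just
examples I've pulled from various articles, not ones I know to be
right for this machine):

    work_mem = 256MB              # memory per sort/hash operation; raising it seems
                                  # safer with only one connection open
    temp_buffers = 256MB          # per-session memory for temporary tables
    maintenance_work_mem = 512MB  # used by CREATE INDEX, VACUUM, etc.

As I understand it, work_mem applies per sort or hash operation, so a
single complex query can consume several multiples of it.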

I've tried changing various parameters such as shared_buffers, work_mem
and checkpoint_segments, but I don't really understand what the values
mean, the documentation seems to be aimed at configuring for multiple
users, and my changes have made things worse. For example, my machine
has 2GB of memory, and I've read that on a dedicated server you should
set shared memory to 40% of total memory, but when I increase it to
more than 30MB Postgres will not start, complaining about my SHMMAX
limit.
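
From what I've read, on OS X the limit can be raised by putting
something like the following in /etc/sysctl.conf and rebooting, though
I haven't verified these particular values myself (apparently all five
settings have to be given together, and shmmax must be a multiple of
4096):

    kern.sysv.shmmax=1073741824   # 1GB of shared memory; must be a multiple of 4096
    kern.sysv.shmmin=1
    kern.sysv.shmmni=32
    kern.sysv.shmseg=8
    kern.sysv.shmall=262144       # measured in 4KB pages: 262144 * 4096 = 1GB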

Paul
