Out of memory on vacuum analyze - Mailing list pgsql-general

From John Cole
Subject Out of memory on vacuum analyze
Date
Msg-id 76758090F8686C47A44B6FF52514A1D307909AE1@hermes.uai.int
Responses Re: Out of memory on vacuum analyze  (Jeff Davis <pgsql@j-davis.com>)
List pgsql-general
I have a large table (~55 million rows) and I'm trying to create an index
and vacuum analyze it.  The index has now been created, but the vacuum
analyze is failing with the following error:

ERROR:  out of memory
DETAIL:  Failed on request of size 943718400.
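
Roughly, what I'm running is the following (table and column names are simplified placeholders here):

CREATE INDEX idx_bigtable_col ON bigtable (some_column);  -- this step completed
VACUUM ANALYZE bigtable;                                  -- this is the step that fails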

I've played with several settings, but I'm not sure which ones I need to change
to get this to complete.  I'm running PostgreSQL 8.2.3 on a dual quad-core
system with 4GB of memory under Windows Server 2003 R2 (32-bit).

maintenance_work_mem is 900MB
max_stack_depth is 3MB
shared_buffers is 900MB
temp_buffers is 32MB
work_mem is 16MB
max_fsm_pages is 204800
max_connections is 50
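
I notice the failed request of 943718400 bytes is exactly 900MB, the same value
I have for maintenance_work_mem. For completeness, these are set in
postgresql.conf roughly as:

maintenance_work_mem = 900MB
max_stack_depth = 3MB
shared_buffers = 900MB
temp_buffers = 32MB
work_mem = 16MB
max_fsm_pages = 204800
max_connections = 50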

Any help would be greatly appreciated.

Thanks,

John Cole

