On Nov 15, 2007 9:51 PM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> "Sean Davis" <sdavis2@mail.nih.gov> writes:
> > I am trying to build a full-text index (gin(to_tsvector('english',
> > title || abstract))) on about 18 million abstracts and titles from
> > medical literature. However, I keep getting out-of-memory errors. (I
> > am on a 32GB Linux system with maintenance_work_mem set to 20GB and
> > shared_buffers at 4GB; Postgres 8.3beta). Does creation of a
> > full-text index require that the entire index fit into memory?
>
> I can't reproduce any memory-leak issue here. I wonder whether your
> maintenance_work_mem setting is optimistically large (like, higher
> than the ulimit restriction on the postmaster).
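For reference, the build Sean describes would look roughly like the sketch below (the table and index names are made up for illustration). Two incidental points worth noting: concatenating the columns without a separator can fuse the last word of the title to the first word of the abstract, and a NULL abstract makes the whole expression NULL, so coalesce() is the usual guard.

```sql
-- Set a more conservative maintenance_work_mem for this session only;
-- the 1GB value here is illustrative, not a recommendation.
SET maintenance_work_mem = '1GB';

-- Hypothetical table/index names. coalesce() guards against NULLs and
-- the ' ' separator keeps title and abstract words from running together.
CREATE INDEX abstracts_fts_idx ON abstracts
    USING gin (to_tsvector('english',
               coalesce(title, '') || ' ' || coalesce(abstract, '')));
```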
Thanks, Tom. ulimit -a shows unlimited, but there may be something
else going on. I'll try setting it lower and see what that does for
me.
Sean
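One caveat on that check: ulimit -a in an interactive shell reports that shell's limits, which can differ from the limits the postmaster was actually started under (for example, when it is launched from an init script). On a modern Linux kernel you can inspect the running server process directly; the data directory path below is an assumption, so adjust it for your install:

```shell
# The postmaster's PID is the first line of postmaster.pid
# (data directory path is an assumption; adjust to your install).
PGPID=$(head -n 1 /usr/local/pgsql/data/postmaster.pid)

# /proc/<pid>/limits shows the limits actually in effect
# for that specific process, independent of your shell.
cat "/proc/${PGPID}/limits"
```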