Re: Table partition for very large table - Mailing list pgsql-general

From: Scott Marlowe
Subject: Re: Table partition for very large table
Date:
Msg-id: 1112032374.12450.28.camel@state.g2switchworks.com
In response to: Table partition for very large table (Yudie Gunawan <yudiepg@gmail.com>)
List: pgsql-general
On Mon, 2005-03-28 at 11:32, Yudie Gunawan wrote:
> I have a table with more than 4 million records, and when I do a select
> query it gives me an "out of memory" error.
> Does Postgres have a feature like table partitioning to handle tables
> with very many records?
> Just wondering what you guys do to deal with very large tables?

Is this a straight "select * from table", or is there more being done to
the data?

If it's a straight select, you are likely running out of memory trying to
hold the entire result set at once, and need to look at using a cursor to
fetch the result in pieces.
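A minimal sketch of the cursor approach (the table name `mytable` and the
batch size are placeholders, not from the original thread):

```sql
BEGIN;

-- Declare a cursor over the large table instead of materializing all rows.
DECLARE big_cur CURSOR FOR SELECT * FROM mytable;

-- Fetch the result in manageable chunks; repeat until FETCH returns no rows.
FETCH 1000 FROM big_cur;
FETCH 1000 FROM big_cur;

CLOSE big_cur;
COMMIT;
```

Cursors must be used inside a transaction block, and each FETCH pulls only
the requested number of rows from the server, keeping client memory bounded.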
