Re: work_mem greater than 2GB issue - Mailing list pgsql-general

From Henry
Subject Re: work_mem greater than 2GB issue
Date
Msg-id 20090514193103.13743hjv1b5nj0ow@zenmail.co.za
In response to work_mem greater than 2GB issue  (wickro <robwickert@gmail.com>)
List pgsql-general
Quoting wickro <robwickert@gmail.com>:
> I have a largish table (> 8GB). I'm doing a very simple single group
> by on.

This doesn't answer your question, but you might want to take
advantage of table partitioning:
http://www.postgresql.org/docs/8.3/interactive/ddl-partitioning.html
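
Very roughly, following that page (table and column names here are made up; in 8.3 partitioning is inheritance-based, so you need CHECK constraints on the children plus something to route inserts):

-- Parent table: holds no rows itself, just defines the schema.
CREATE TABLE measurements (
    id       bigserial,
    logdate  date NOT NULL,
    value    numeric
);

-- One child per month, each with a CHECK constraint the planner
-- can use for constraint exclusion.
CREATE TABLE measurements_2009_04 (
    CHECK (logdate >= DATE '2009-04-01' AND logdate < DATE '2009-05-01')
) INHERITS (measurements);

CREATE TABLE measurements_2009_05 (
    CHECK (logdate >= DATE '2009-05-01' AND logdate < DATE '2009-06-01')
) INHERITS (measurements);

-- Route inserts on the parent to the right child.
CREATE OR REPLACE FUNCTION measurements_insert_trigger()
RETURNS trigger AS $$
BEGIN
    IF NEW.logdate >= DATE '2009-04-01' AND NEW.logdate < DATE '2009-05-01' THEN
        INSERT INTO measurements_2009_04 VALUES (NEW.*);
    ELSIF NEW.logdate >= DATE '2009-05-01' AND NEW.logdate < DATE '2009-06-01' THEN
        INSERT INTO measurements_2009_05 VALUES (NEW.*);
    ELSE
        RAISE EXCEPTION 'logdate out of range';
    END IF;
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER measurements_insert
    BEFORE INSERT ON measurements
    FOR EACH ROW EXECUTE PROCEDURE measurements_insert_trigger();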

I've recently gone through this exercise (several tables were 10GB+,
some almost 30GB), and if your WHERE clauses hit the partitioning key
you can expect significant performance gains with /much/ lower memory
consumption.
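
For instance, with the made-up tables from the sketch above,
constraint exclusion (off by default in 8.3) lets the planner skip the
children that can't match:

SET constraint_exclusion = on;

EXPLAIN SELECT value, count(*)
  FROM measurements
 WHERE logdate >= DATE '2009-05-01'
   AND logdate <  DATE '2009-06-01'
 GROUP BY value;

-- the plan should scan only measurements_2009_05 (plus the empty parent)
-- rather than the whole table.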

You only have one large table, so partitioning it should be painless
and not take too long (unlike our scenario).

Cheers
Henry


