Re: Are 50 million rows a problem for postgres ? - Mailing list pgsql-admin

From Ron Mayer
Subject Re: Are 50 million rows a problem for postgres ?
Date
Msg-id POEDIPIPKGJJLDNIEMBEAEDFDJAA.ron@intervideo.com
In response to Are 50 million rows a problem for postgres ?  (Vasilis Ventirozos <vendi@cosmoline.com>)
List pgsql-admin
> Hi all, I work in a telco and I have a huge amount of data (50 million rows),
> but I see a lack of performance with huge tables in postgres.
> Are 50 million rows the "limit" of postgres (with good performance)?

I have worked on a data warehouse (PostgreSQL 7.3) with a
pretty standard star schema: over 250 million rows in the
central 'fact' table, and anywhere from 100 to over 10 million
records in the surrounding 'dimension' tables.
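
To make that concrete, here is a rough sketch of the kind of schema
I mean (table and column names are made up, not our actual tables):

    -- Hypothetical dimension tables (small: hundreds to millions of rows).
    CREATE TABLE dim_customer (
        customer_id integer PRIMARY KEY,
        name        text
    );

    CREATE TABLE dim_date (
        date_id     integer PRIMARY KEY,
        day         date
    );

    -- Hypothetical central fact table (~250 million rows).  Indexes on
    -- the id columns are what keep joins against it fast.
    CREATE TABLE fact_call (
        customer_id integer REFERENCES dim_customer,
        date_id     integer REFERENCES dim_date,
        duration    integer
    );
    CREATE INDEX fact_call_customer_idx ON fact_call (customer_id);
    CREATE INDEX fact_call_date_idx     ON fact_call (date_id);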

The most common queries were simple joins across three tables,
filtering on one of the ids.  These took between 1 and 60 seconds.
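
A typical query of that shape, against the made-up schema above,
looks something like:

    -- Join the fact table to two dimensions, filtering on one id.
    -- With the index on fact_call.customer_id this is an index scan,
    -- not a scan of all 250 million rows.
    SELECT d.day, c.name, sum(f.duration) AS total_duration
    FROM fact_call f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_date d ON d.date_id = f.date_id
    WHERE f.customer_id = 12345
    GROUP BY d.day, c.name;
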
About 500,000 new records were loaded each night, and the ETL
processing plus building some aggregates took about 11 hours/night
with 7.3, and about 9 hours/night with 7.4beta.
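
A load of that size is usually done with COPY rather than
row-at-a-time INSERTs; roughly (file path made up):

    -- Bulk-load the nightly extract; COPY is far faster than
    -- individual INSERTs at this volume.
    COPY fact_call (customer_id, date_id, duration)
        FROM '/data/etl/tonight.dat' WITH DELIMITER ',';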

Hope this helps.

