Re: Are 50 million rows a problem for postgres ? - Mailing list pgsql-admin

From Sam Barnett-Cormack
Subject Re: Are 50 million rows a problem for postgres ?
Date
Msg-id Pine.LNX.4.50.0309081014440.22692-100000@short.lancs.ac.uk
In response to Are 50 million rows a problem for postgres ?  (Vasilis Ventirozos <vendi@cosmoline.com>)
List pgsql-admin
On Mon, 8 Sep 2003, Vasilis Ventirozos wrote:

> Hi all, I work in a telco and I have a huge amount of data (50 million rows),
> but I am seeing poor performance on huge tables with postgres.
> Are 50 million rows the "limit" of postgres (with good performance)?
> I am expecting 2 billion records in 2004, so I have to do something.
> Does anyone have a huge database I could ask about some issues?
>
> My hardware is good and my indexes are good, so please don't just answer
> with something like "use vacuum" :)

I have a similarly huge number of records, as I process our web, ftp,
and rsync logs together using postgres. It works like a charm. You do
have to accept that queries are going to take a long time. I use about
six queries to summarise a quarter's data; each is run once per month,
so 18 queries in total. These run in a little over 24 hours, and there
are many, many records per month.
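
For illustration, here is a minimal sketch of the kind of monthly
summary query I mean, assuming a hypothetical access_log table; the
column and index names are made up for the example, not our actual
schema:

  -- Hypothetical table: access_log(logged_at timestamp, service text,
  --                                host text, bytes bigint)
  -- An index on the timestamp keeps a one-month scan from having to
  -- read the whole table:
  CREATE INDEX access_log_logged_at_idx ON access_log (logged_at);

  -- Summarise one month (here September 2003) by service and host:
  SELECT service,
         host,
         count(*)   AS requests,
         sum(bytes) AS bytes_served
  FROM   access_log
  WHERE  logged_at >= '2003-09-01'
    AND  logged_at <  '2003-10-01'
  GROUP  BY service, host;

Change the date range and run it once per month; half a dozen queries
like this per month cover a quarter's summary.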

--

Sam Barnett-Cormack
Software Developer                           |  Student of Physics & Maths
UK Mirror Service (http://www.mirror.ac.uk)  |  Lancaster University
