Re: Capacity Planning - Mailing list pgsql-admin

From Jeff Keller
Subject Re: Capacity Planning
Date
Msg-id 001d01c45d64$264cd9e0$7601c0c0@JKELLER
In response to Capacity Planning  (Jeff Keller <jeff.keller@clarksecurity.com>)
Responses Re: Capacity Planning
List pgsql-admin
I had a typo in the first post.  The Record Reads per day should be
50,000,000, not 500 million.  My mistake; one decimal place makes a huge
difference.  Our current app is Progress-based, running on an IBM p650
with 4 processors, and we suspect a similar load if we were to change apps
and databases.
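
For a rough sense of scale, here is a minimal back-of-envelope conversion of
the corrected daily totals to per-second rates (a sketch only, assuming the
load is spread evenly over the 12-hour day; the arithmetic is illustrative
and not from the original posts):

    # Back-of-envelope per-second rates from the corrected daily totals
    # (illustrative only; assumes an even spread over the 12-hour day)
    seconds_per_day = 12 * 60 * 60                   # 43,200 s in a 12-hour business day
    reads_per_sec   = 50_000_000 / seconds_per_day   # ~1,157 reads/s
    creates_per_sec =    200_000 / seconds_per_day   # ~4.6 creates/s
    updates_per_sec =  1_500_000 / seconds_per_day   # ~35 updates/s
    print(f"{reads_per_sec:.0f} reads/s, {creates_per_sec:.1f} creates/s, "
          f"{updates_per_sec:.0f} updates/s")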

Thanks,
Jeff



Jeff Keller wrote:

> Hi All -
>
> We are reviewing possible database and operating solutions for our company
> and we are looking at running PostgreSQL on Linux.
>
> Does PostgreSQL have the capability to handle the following requirements?
> Is anyone successfully running an application with similar characteristics?
> 100 Gig Database with 600 concurrent users.
> 500,000,000 Record Reads per 12 Hour Business Day
> 200,000 Record Creates per 12 Hour Business Day
> 1,500,000 Record Updates per 12 Hour Business Day

Well, those are big numbers. What you need is, for sure,
big iron.

Tell us what you are planning to buy in order to support that load.

My actual experience is (roughly):
100 concurrent users
2,000,000 reads per 12 h
1,000,000 updates per 12 h
50,000 new records each day

As you can see, this scenario is far away from your needs,
but we are using only a two-processor Intel Xeon 2.8 GHz in
hyperthreading mode with a not-so-tuned RAID system and only
1 GB of RAM.

I think that with 8 processors, good Fibre Channel access to your
RAID, and a good amount of memory, you can easily reach those numbers.
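
(As a rough back-of-envelope comparison, not figures from either post:
2,000,000 reads and 1,000,000 updates over a 12-hour day is roughly 46 reads/s
and 23 updates/s, while the corrected requirement of 50,000,000 reads and
1,500,000 updates works out to roughly 1,160 reads/s and 35 updates/s, so the
read rate is about 25 times higher while the write rates are in the same
ballpark.)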


This is a challenging task to accomplish. Do you need any help out there? ;-)


Regards
Gaetano Mendola



