Thread: Re: Postgresql system requirements to support large databases

Re: Postgresql system requirements to support large databases

From: Bob.Henkel@hartfordlife.com
Date: Tue, 20 Apr 2004




I just want a general idea of what Postgresql can handle. I know the gurus
will say it depends on many different things, but in general, what can this
bad boy handle?

50GB to 100GB is by no means small.  But how does Postgresql 7.4 handle a
database of 900GB, 1 terabyte, or greater?
How does Postgresql handle a table with 100 columns of integers and
varchar2(400) data types at 1 million rows, 10 million, 100 million, 500
million, or more than 1 billion, joined to a small lookup table of 50,000
rows with both tables indexed properly?  Can this database handle enterprise
quantities of data, or is it geared towards small to medium datasets?
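
To make the question concrete, here is roughly the kind of schema and join
I have in mind (table and column names are made up for illustration, and
I've written varchar since Postgresql has no varchar2):

    -- wide, hypothetical fact table joined to a small lookup table
    CREATE TABLE big_fact (
        id        integer PRIMARY KEY,
        lookup_id integer NOT NULL,
        col1      integer,
        -- ... roughly 100 more integer / varchar(400) columns ...
        note      varchar(400)
    );

    CREATE TABLE lookup (
        lookup_id   integer PRIMARY KEY,   -- ~50,000 rows
        description varchar(400)
    );

    CREATE INDEX big_fact_lookup_idx ON big_fact (lookup_id);

    -- the join in question, at 1 million to 1 billion+ rows
    SELECT f.id, l.description
      FROM big_fact f
      JOIN lookup l ON l.lookup_id = f.lookup_id;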



From: wdouma@zootweb.com (wilbur douma)
Sent by: pgsql-general-owner@postgresql.org
Date: 04/15/2004 03:14 PM
To: pgsql-general@postgresql.org
Subject: [GENERAL] Postgresql system requirements to support large databases.




We are looking at implementing a large Postgresql database (50GB -
100GB) and are wondering if there are any limitations or problems for
a database of this size running on 32-bit architecture.  I have seen
some older posts suggesting that Postgresql had performance problems
once a database reached 5GB, and that 64-bit architecture was
recommended.  Is this still true with Postgresql version 7.4?  This
will be our first experience with Postgresql, and we need to get some
idea of what system requirements a database of this size will have.
Since the machines that we have are all 32-bit, we would like to know
whether we will need to go to 64-bit.  Any comments or suggestions?

Thanks in advance for any help.



Re: Postgresql system requirements to support large databases

From: "scott.marlowe"
Date:
On Tue, 20 Apr 2004 Bob.Henkel@hartfordlife.com wrote:

>
> I just want a general idea of what Postgresql can handle. I know the gurus
> will say it depends on many different things, but in general, what can this
> bad boy handle?

A lot.  There are terabyte databases running on PostgreSQL.

> 50GB to 100GB is by no means small.  But how does Postgresql 7.4 handle a
> database of 900GB, 1 terabyte, or greater?
> How does Postgresql handle a table with 100 columns of integers and
> varchar2(400) data types at 1 million rows, 10 million, 100 million, 500
> million, or more than 1 billion, joined to a small lookup table of 50,000
> rows with both tables indexed properly?  Can this database handle enterprise
> quantities of data, or is it geared towards small to medium datasets?

Databases designed with 100 columns of integers and varchar2(400) fields
are generally not well enough designed to deserve the moniker "enterprise
class".

Given a properly normalized and indexed schema, PostgreSQL is capable of
handling most loads quite well.
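
For example, a wide table with repeating columns usually factors into a
narrower child table keyed back to its parent.  A minimal sketch, with
made-up names:

    -- factor the repeating attribute columns into a child table; the
    -- composite primary key doubles as the index for joins on fact_id
    CREATE TABLE fact (
        fact_id integer PRIMARY KEY,
        note    varchar(400)
    );

    CREATE TABLE fact_attr (
        fact_id integer NOT NULL REFERENCES fact,
        attr_no integer NOT NULL,
        value   integer,
        PRIMARY KEY (fact_id, attr_no)
    );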

If you want guarantees, you'll get none.  It's your job to test it under
the load you will be putting on it to see if it works.
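
One cheap way to start that testing is to load synthetic rows and read the
query plans.  A sketch against the hypothetical tables from the original
post (generate_series is built in from 8.0 on; under 7.4 you would populate
the tables from a script instead):

    -- load synthetic rows, refresh planner statistics, inspect the plan
    INSERT INTO lookup (lookup_id, description)
    SELECT i, 'lookup row ' || i::text
      FROM generate_series(1, 50000) AS s(i);

    INSERT INTO big_fact (id, lookup_id, note)
    SELECT i, (i % 50000) + 1, 'filler'
      FROM generate_series(1, 1000000) AS s(i);

    ANALYZE;

    EXPLAIN ANALYZE
    SELECT f.id, l.description
      FROM big_fact f
      JOIN lookup l ON l.lookup_id = f.lookup_id
     WHERE f.id BETWEEN 1 AND 1000;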

However, PostgreSQL's real advantage isn't that it handles poorly designed
databases well.  It's that if you find yourself in a corner case that
hasn't been explored yet and hit a performance problem, you can talk
directly to the developers, get help and patches from them, and be an
active part of making PostgreSQL a better database, all while receiving
better support than most commercial products provide.

The biggest limiter for handling large data sets isn't going to be
PostgreSQL or Oracle, but your system hardware.  Running a 1 terabyte
database on a P100 with 32 megs of RAM on an IDE software-RAID array
with write caching turned off is gonna be a lot slower than the same
database on an 8-way Opteron with 64 gigs of RAM and 4 battery-backed
RAID controllers with hundreds of hard drives under them.

There are no internal design limitations that will prevent you from
handling large data sets, though.  Only bottlenecks that haven't been
found and fixed yet.  :-)