Re: Hi Community - Mailing list pgsql-admin

From Kevin Grittner
Subject Re: Hi Community
Msg-id 1123309468.541729.1422887565854.JavaMail.yahoo@mail.yahoo.com
In response to Hi Community  (Naresh Soni <jmnaresh@gmail.com>)
Responses Re: Hi Community  (Naresh Soni <jmnaresh@gmail.com>)
List pgsql-admin
Naresh Soni <jmnaresh@gmail.com> wrote:

> This is my first question on the list, I wanted to ask if
> postgres can handle multi millions records? for example there
> will be 1 million records per table per day, so 365 millions per
> year.

Yes, I have had hundreds of millions of rows in a table without
performance problems. If you want to see such a table in action,
go to the following web site, bring up a court case, and click the
"Court Record Events" button. Last I knew the table containing
court record events had about 450 million rows, with no
partitioning.  The total database was 3.5 TB.

http://wcca.wicourts.gov/

> If yes, then please elaborate.

You will want indexes on the columns used in searches. Depending
on details you have not provided, it might be beneficial to
partition the table. Do not consider partitioning to be some
special magic which always makes things faster, though -- it can
easily make performance much worse if it is not a good fit.
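For a workload like the one described (roughly a million rows per day),
a range-partitioned table with indexes on the search columns could be
sketched as below. Table and column names are hypothetical, and the
syntax assumes declarative partitioning (PostgreSQL 10+; indexes
created on the partitioned parent require 11+). Older releases used
inheritance-based partitioning instead.

```sql
-- Hypothetical daily-event table, partitioned by event time.
CREATE TABLE events (
    event_id    bigserial,
    case_id     bigint NOT NULL,
    event_time  timestamptz NOT NULL,
    description text
) PARTITION BY RANGE (event_time);

-- One partition per day; monthly may be a better fit depending on
-- query patterns and retention policy.
CREATE TABLE events_2015_02_03 PARTITION OF events
    FOR VALUES FROM ('2015-02-03') TO ('2015-02-04');

-- Index only the columns your searches actually filter on.
CREATE INDEX ON events (case_id);
CREATE INDEX ON events (event_time);
```

Whether this helps depends on the queries: partition pruning pays off
when most queries constrain the partition key, and simply indexing an
unpartitioned table is often sufficient at this scale.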

--
Kevin Grittner
EDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

