On Wed, Jul 28, 2010 at 3:05 PM, P Kishor <punk.kish@gmail.com> wrote:
> Keep in mind, the circa 100 million rows was for only part of the db.
> If I were to build the entire db, I would have about 4 billion rows
> for a year, if I were to partition the db by years. And, partitioning
> by days resulted in too many tables.
>
Don't partition by arbitrary slices. Find out what your queries are
and partition across the most common of those, possibly even in two
dimensions. Without knowing what kinds of queries you run, it is
hard to suggest things that will actually benefit you. Are you using
one of the advanced data types in postgres that deal with spatial
data?
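
For what it's worth, here's a rough sketch of inheritance-based
partitioning keyed to the query pattern rather than arbitrary slices.
The table and column names are made up; it assumes something like
daily observations per spatial cell, queried mostly by date range:

-- parent table; the children hold the actual rows (hypothetical schema)
CREATE TABLE observations (
    cell_id   integer NOT NULL,
    obs_date  date    NOT NULL,
    value     real
);

-- one child per year (or per region, or both), with a CHECK constraint
-- so the planner can skip irrelevant partitions via constraint exclusion
CREATE TABLE observations_2010 (
    CHECK (obs_date >= DATE '2010-01-01' AND obs_date < DATE '2011-01-01')
) INHERITS (observations);

CREATE INDEX observations_2010_cell_date_idx
    ON observations_2010 (cell_id, obs_date);

-- make sure constraint exclusion is enabled, so a query like
--   SELECT ... FROM observations WHERE obs_date BETWEEN ... AND ...
-- only touches the children whose CHECK constraints match
SET constraint_exclusion = partition;  -- or 'on' for releases before 8.4

The point is that the CHECK constraints should line up with the WHERE
clauses you actually issue; if they don't, the planner still scans
every child and the partitioning buys you nothing.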
Additionally, if you're trying to have 4 billion rows of data and only
have 12GB of RAM on your box, it will be slow no matter your choice
of DB.