Hello all,
I am in the midst of taking a development DB into production, but the
performance has not been very good so far.
The DB is a decision-based system that currently runs queries against tables
with up to 20 million records (3GB table sizes), for a total DB size of about
25GB at this point. (Later down the road, up to 60 million records and a DB of
up to 150GB are planned.)
As I understand it, Oracle has a product called "parallel query" which splits
the queried table into about 10 pieces, processes each piece across as many
CPUs as possible, and then puts the results back together again.
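To make sure I am describing the idea correctly, here is a rough client-side sketch of what I mean: split one big scan into key ranges, run each range on its own connection (and therefore its own backend process, which the OS can put on a different CPU), and merge the partial results afterwards. The DSN, table name, and column names below are just made-up examples, not my real schema.

    # Minimal sketch of emulating "parallel query" from the client side.
    # Assumptions: psycopg2 is installed, the table has an integer key
    # column and a numeric "amount" column; all names are hypothetical.
    from concurrent.futures import ProcessPoolExecutor

    import psycopg2

    DSN = "dbname=decisiondb user=mykarz"    # hypothetical connection string
    TABLE = "fact_records"                   # hypothetical 20M-row table
    KEY = "id"                               # hypothetical integer primary key


    def scan_range(bounds):
        """Aggregate one slice of the table on its own connection."""
        lo, hi = bounds
        conn = psycopg2.connect(DSN)
        try:
            cur = conn.cursor()
            cur.execute(
                "SELECT count(*), sum(amount) FROM " + TABLE +
                " WHERE " + KEY + " >= %s AND " + KEY + " < %s",
                (lo, hi),
            )
            return cur.fetchone()
        finally:
            conn.close()


    def parallel_scan(max_id, pieces=10):
        """Split [0, max_id) into `pieces` ranges and scan them concurrently."""
        step = max_id // pieces + 1
        ranges = [(i * step, (i + 1) * step) for i in range(pieces)]
        with ProcessPoolExecutor(max_workers=pieces) as pool:
            partials = list(pool.map(scan_range, ranges))
        # "Put it all back together again": combine the per-range aggregates.
        total_rows = sum(p[0] for p in partials)
        total_amount = sum(p[1] or 0 for p in partials)
        return total_rows, total_amount


    if __name__ == "__main__":
        print(parallel_scan(max_id=20000000))

My understanding is that Oracle does the equivalent of this inside the server, without the application having to split the work itself.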
Based upon the messages I have read here, it does not appear that PostgreSQL
can use multiple CPUs for a single query; it only hands each new query off to
a processor according to operating-system scheduling rules.
So my question is: what are some good ways to handle such large amounts of
information using PostgreSQL?
Michael Miyabara-McCaskey
Email: mykarz@miyabara.com
Web: http://www.miyabara.com/mykarz/
Mobile: +1 408 504 9014