I wasn't splitting it up because I doubted postgres could handle the volume.
I verified that at the beginning of the project. I split the data up
because my database gets between 5 and 15 million new records a day. We are
trying to keep about 45 days' worth of data. Queries on such a volume are
extremely painful. Keeping the data in one table simply isn't an option.
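For what it's worth, one way around the group-by-over-union issue from the
message below is to wrap the UNION ALL of the daily tables in a subselect,
assuming a PostgreSQL version that accepts subselects in the FROM clause.
The log_YYYYMMDD tables and src_ip column here are made up for illustration:

    -- hypothetical daily tables; add one SELECT per day of interest
    SELECT src_ip, count(*) AS hits
      FROM (SELECT src_ip FROM log_20010101
            UNION ALL
            SELECT src_ip FROM log_20010102) AS recent
     GROUP BY src_ip;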
BTW, the system runs on a dual 733MHz Pentium III with 1GB of RAM. I'm using
eight 18GB SCSI drives at 10K RPM in RAID 5. I'm in the process of adding an
ADDITIONAL 8x36GB RAID cabinet.
You don't have to tell me how well postgres scales. I've proven that. ;^)
Thanx for your time.
Mike Diehl.
-----Original Message-----
From: Christopher Sawtell
To: Diehl, Jeffrey; pgsql-sql@postgresql.org; pgsql-general@postgresql.org
Cc: Diehl, Jeffrey
Sent: 1/16/01 3:41 PM
Subject: Re: [SQL] Query from multiple tables...
On Wed, 17 Jan 2001 09:46, Diehl, Jeffrey wrote:
> Hi all,
>
> I have a database in which, because of the size of the data, I have
> separated one day's worth of data into its own table.
PostgreSQL will manage absolutely huge tables without problems
(2GB, and 4GB with the appropriate file-size setting in the kernel).
Why do you have to cut the data into smaller tables?
> I have a separate table for each of, say, 30 days. I would like to
> write a query that will query the last 15 days' worth of tables. Can
> this be done?
No problem, if you leave the data in one table.
> I tried to use a view; that didn't work. I tried to use the union
> clause; that didn't allow me to do group by...
>
> Is there a better way than to have my program rewrite the SQL on the
> fly?
You can issue queries directly from a shell script, which would allow you
to use shell variables to hold the date values.
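A rough sketch of that approach, with made-up table names (log_YYYYMMDD) and
database name:

    #!/bin/sh
    # Build the table name from a date held in a shell variable,
    # then run the query through psql.
    DAY=${1:-`date +%Y%m%d`}
    psql -d mydb -c "SELECT count(*) FROM log_$DAY"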
--
Sincerely etc.,
NAME Christopher Sawtell
CELL PHONE 021 257 4451
ICQ UIN 45863470
EMAIL csawtell @ xtra . co . nz
CNOTES
ftp://ftp.funet.fi/pub/languages/C/tutorials/sawtell_C.tar.gz
-->> Please refrain from using HTML or WORD attachments in e-mails to me <<--