Re: Tetra-bytes database / large indexes needs - Mailing list pgsql-hackers

From: Hannu Krosing
Subject: Re: Tetra-bytes database / large indexes needs
Date:
Msg-id: 1053633025.1788.30.camel@fuji.krosing.net
In response to: Tetra-bytes database / large indexes needs (Jean-Michel POURE <jm.poure@freesurf.fr>)
List: pgsql-hackers
Jean-Michel POURE wrote on Thursday, 22.05.2003 at 11:38:
> Dear all,
>
> A friend of mine needs to import and query a very large amount of data, coming
> from real-time acquisition systems.

What kind of querying does he need?

You could check out Telegraph for continuous queries:

http://telegraph.cs.berkeley.edu/
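
To make the term concrete, here is a rough sketch (plain Python, nothing to do
with Telegraph's actual interface) of what a continuous query does: it runs as
a standing computation over the incoming stream and emits updated results as
records arrive, instead of being re-run against stored data.

from collections import deque

def continuous_count(stream, predicate, window_seconds=60):
    """For every incoming (timestamp, record) pair, yield how many
    records in the last `window_seconds` matched `predicate`."""
    window = deque()                      # (timestamp, matched?) pairs
    for ts, record in stream:
        window.append((ts, predicate(record)))
        # Drop entries that have fallen out of the time window.
        while window and window[0][0] < ts - window_seconds:
            window.popleft()
        yield sum(1 for _, matched in window if matched)

# e.g. for count in continuous_count(sensor_stream, is_over_threshold): ...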

>  The database is growing fast, several
> Tetra-bytes a day.

If you mean terabytes (TB), then 1 TB/day ~= 12.7 MB/s, just about as fast as
you can write to an average IDE drive if you do nothing else, or what can come
in over 100BaseT Ethernet.

Even with IDE disks (the biggest drives available apiece) you would have to
add about 5 disks a day just to store the incoming 1 TB/day.
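
A quick back-of-the-envelope check of those numbers (the ~200 GB per drive
figure is my assumption for the largest IDE disks of the day):

TB = 2**40                       # one terabyte in bytes
SECONDS_PER_DAY = 24 * 60 * 60

rate = TB / SECONDS_PER_DAY      # sustained rate needed for 1 TB/day
print(round(rate / 10**6, 1), "MB/s")    # ~12.7 MB/s, nonstop, all day

DISK = 200 * 10**9               # assumed capacity of one big IDE drive
print(round(TB / DISK, 1), "disks/day")  # ~5.5 new disks every day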

> What is the advancement of the community in the field of very large databases?
> Could you point out to me some useful information, techdocs, etc...?

no ;)

> Are there working groups, private fundings in this precise field?

dunno..


You are possibly on the verge of the impossible?

If you can hardly write the data to disk due to physical constraints (memory
and bus speeds), it is very hard to also query it, unless you deploy some
terribly clever techniques that extract and compress your data before it even
gets to your DB ;)

Perhaps you (or your friend) are after filtering and not queries at all?
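
As a minimal sketch of that idea (the (channel, timestamp, value) sample
format is just an assumption, not anyone's actual wire format): collapse the
raw samples into per-second summaries before they ever reach PostgreSQL, and
only insert the summaries.

from collections import defaultdict

def summarize_per_second(samples):
    """Collapse raw (channel, timestamp, value) samples into
    (channel, second, count, min, max, sum) summary rows."""
    buckets = defaultdict(list)
    for channel, ts, value in samples:
        buckets[(channel, int(ts))].append(value)
    for (channel, second), values in sorted(buckets.items()):
        yield (channel, second, len(values), min(values), max(values), sum(values))

# The summary rows can then be COPYed into a PostgreSQL table at a small
# fraction of the raw data rate; the raw stream itself never hits the DB.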

--------------
Hannu

