It all depends on what you want to do with that data. To give you an
idea of what I mean: I currently have a database that runs on a
Pentium II 450 with 768 MB of RAM and IDE hard drives. This database
has several tables with over ten million records each, and each of
those tables receives an average of nearly 18,000 inserts a day (there
are no updates or deletes on these large tables).
Of course, this system has a fairly limited number of users (fewer
than 30), and the queries generally ask for only a small subset of the
data (sequential scans of the large tables take more than a minute to
complete, but index scans return very quickly).
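
To make that concrete, here is a minimal sketch of indexing the column
a query filters on; the table and column names are hypothetical, not
taken from my actual schema:

    -- Hypothetical table; the names are illustrative only.
    CREATE TABLE big_log (
        id          serial PRIMARY KEY,
        recorded_at timestamp NOT NULL,
        payload     text
    );

    -- An index on the filtered column lets the planner use a fast
    -- index scan instead of reading all ten million rows:
    CREATE INDEX big_log_recorded_at_idx ON big_log (recorded_at);

    SELECT count(*)
    FROM big_log
    WHERE recorded_at >= '2002-12-01' AND recorded_at < '2003-01-01';
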
My guess is that you are going to be just fine :), and if you do end
up with a query that takes a long time to return, chances are good
that someone on the lists will have a solution.
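
If that happens, the usual first step is to post the query together
with its plan. A minimal sketch, using the same hypothetical table as
above (EXPLAIN ANALYZE also executes the query and reports actual
times, on versions that support it):

    -- Show the plan the optimizer picked for a hypothetical query:
    EXPLAIN
    SELECT * FROM big_log WHERE recorded_at >= '2002-12-01';

    -- EXPLAIN ANALYZE additionally runs the query and reports the
    -- actual row counts and timings for each plan node:
    EXPLAIN ANALYZE
    SELECT * FROM big_log WHERE recorded_at >= '2002-12-01';
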
Jason
Fernando San Martín Woerner <snmartin@galilea.cl> writes:
> I need to build a PostgreSQL database with 2 tables containing
> 70,000 records each, but they will grow by 4,000 records monthly,
> and some triggers and functions will be running on these tables,
> plus other smaller tables with fewer than 40,000 records.
>
> So I'm planning to use an Intel Pentium III server with 2 CPUs,
> 1 GB of RAM, and a 10 GB SCSI HDD, running Red Hat 7.2.
>
> Will that be enough? Do you have any experience with this? Any tips?
>
>
> Thanks, and happy new year....
>
>
>
> Fernando San Martín Woerner counter.li.org Linux User #216550
> Head of IT Dept., Galilea S.A.
> Talca, VII Región Chile (56)71-224876
> ----------------------------------------
> If I had foreseen the consequences, I would have become a watchmaker.
>
> Albert Einstein
>
>