On Tue, 2005-08-16 at 17:39 +0200, Ulrich Wisser wrote:
> Hello,
>
> one of our services is click counting for online advertising. We do
> this by importing Apache log files every five minutes. This results in a
> lot of insert and delete statements. At the same time our customers
> need to be able to do online reporting.
>
> We have a box with
> Linux Fedora Core 3, Postgres 7.4.2
> Intel(R) Pentium(R) 4 CPU 2.40GHz
This is not a strong CPU for this workload; try an Opteron or Xeon. Just
as important is the amount of memory: ideally you want more RAM than the
size of your database, so the working set can stay cached.
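If you add memory, also tell PostgreSQL about it. As a rough sketch,
something along these lines in postgresql.conf (the numbers are only
placeholders to be sized against your actual RAM; in 7.4 the first two
settings are measured in 8 kB buffers/pages):

  shared_buffers = 10000          # ~80 MB of shared buffer cache
  effective_cache_size = 100000   # ~800 MB, what the OS can be expected to cache
  sort_mem = 8192                 # per-sort memory, in kB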
> 2 SCSI 76 GB disks (15,000 RPM, 2 ms)
If you decide your application is I/O bound, here's an obvious place for
improvement. More disks == faster.
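An easy way to decide is to watch the disks while one of the slow
reports runs:

  iostat -x 5    # per-device utilization and wait times
  vmstat 5       # overall memory, swap and block I/O activity

If the disks stay pegged during the slow periods, you are I/O bound and
more spindles is where the money should go.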
> I did put pg_xlog on another file system on other discs.
Did that have a beneficial effect?
> Still, when several users are online the reporting gets very slow.
> Queries can take more than 2 minutes.
Is this all the time or only during the insert?
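One way to tell is to time the same reporting query during an import run
and again in between imports, e.g. in psql (substitute one of your real
reporting queries):

  \timing
  EXPLAIN ANALYZE SELECT ... ;   -- one of the slow reporting queries

If it is only slow while the import is running, the reporting is most
likely competing with the inserts/deletes for I/O rather than running a
bad plan.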
> I need some ideas on how to improve performance by some orders of
> magnitude. I already thought of a box with the whole database on a RAM
> disk. So really any idea is welcome.
You don't need a RAM disk, just a lot of RAM. Your operating system
will cache disk contents in memory if possible. You have a very small
configuration, so more CPU, more memory, and especially more disks will
probably all yield improvements.
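You can see how much of your data the kernel is already holding in its
page cache with:

  free -m        # on a dedicated DB box, "cached" is largely your tables and indexes

If the "cached" figure is already close to your database size, more RAM
alone will not buy you much and the disks become the next bottleneck.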