how to handle a big table for data log - Mailing list pgsql-performance

From kuopo
Subject how to handle a big table for data log
Msg-id AANLkTilCP3sGTbHIvrM-ixG-P1Dz6ToqrhbgijY8m8V8@mail.gmail.com
Responses Re: how to handle a big table for data log  ("Jorge Montero" <jorge_montero@homedecorators.com>)
List pgsql-performance
Hi,

I have to handle a log table that accumulates a large number of log
records. The table only sees insert and query operations. To limit the
table size, I tried splitting it by date. However, the number of logs
per partition is still large (46 million records per day). To limit the
size further, I also tried splitting the log table by log type, but
that did not improve performance; it was much slower than the single
big table. I guess this is because of the extra auto-vacuum/analyze
cost across all the split tables.
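For reference, here is a minimal sketch of the date-split approach using inheritance-based partitioning, which is the usual technique on the PostgreSQL 8.x/9.x releases of this era (the table name, column names, and dates below are assumptions for illustration, not taken from the original setup):

```sql
-- Sketch: inheritance-based daily partitioning (pre-PostgreSQL 10 style).
-- All identifiers here are hypothetical.
CREATE TABLE logs (
    log_time    timestamptz NOT NULL,
    log_type    integer     NOT NULL,
    message     text
);

-- One child table per day; the CHECK constraint lets constraint
-- exclusion skip irrelevant partitions at query time.
CREATE TABLE logs_2010_07_19 (
    CHECK (log_time >= '2010-07-19' AND log_time < '2010-07-20')
) INHERITS (logs);

-- Route inserts on the parent to the matching child table.
CREATE OR REPLACE FUNCTION logs_insert_trigger() RETURNS trigger AS $$
BEGIN
    IF NEW.log_time >= '2010-07-19' AND NEW.log_time < '2010-07-20' THEN
        INSERT INTO logs_2010_07_19 VALUES (NEW.*);
    ELSE
        RAISE EXCEPTION 'no partition for timestamp %', NEW.log_time;
    END IF;
    RETURN NULL;  -- row was redirected; do not insert into the parent
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER insert_logs BEFORE INSERT ON logs
    FOR EACH ROW EXECUTE PROCEDURE logs_insert_trigger();

-- Partition pruning in queries requires constraint exclusion:
SET constraint_exclusion = on;
```

With this layout, a query filtered on `log_time` should touch only the matching child tables, while autovacuum still has to visit each child individually, which is consistent with the slowdown described above when the number of partitions grows.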

Can anyone comment on this situation? Thanks in advance.


kuopo.
