Hi All,
We have a table whose data we need to bucket and display. The table
grows continuously (archival is our way of trimming it to size).
We are facing two issues:
1. With the records in the table in the range of 10K, things work fine
for a while after the postgres server starts. But as time passes, the
entire machine becomes slower and slower, to the point that we have to
restart, even though taskmgr does not show any process consuming an
extraordinary amount of CPU or memory. After a restart of the postgres
server, things return to normal. What might be going wrong here?
2. Once the records cross 200K, the queries (even "select count(*) from
_TABLE_") start taking minutes, and sometimes never return at all. We
were previously using MySQL, and at least this query worked fine there.
[Our queries are of the form "select sum(col1), sum(col2),
count(col3) ... where ... group by ..."; a simplified example follows.]
Any suggestions?
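To make the shape of these queries concrete, here is a simplified
example; the table name is the placeholder from above, and the column
names and WHERE condition are only illustrative, not our real schema:

  SELECT sum(col1), sum(col2), count(col3)  -- aggregates per bucket
  FROM _TABLE_
  WHERE col4 >= '2005-01-01'                -- illustrative filter only
    AND col4 <  '2005-02-01'
  GROUP BY col5;                            -- one output row per bucket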
Below are the tuning parameter changes that we made with help from the
internet:
We are starting postgres with the options [-o "-B 4096"]; later we added
"-S 1024" as well, without any visible improvement.
The machine has 1GB of RAM.
shadkam