Hi,
We're facing some performance problems with the database for a web site with
very specific needs. First of all, we're running version 8.1 on a server with
1 GB of RAM. I know that's normally on the low side, but since our tables are
small (really small, in fact) I don't think adding more RAM is the solution.
What we basically have is a site where each user has a box with links to
other randomly selected users. Whenever a user's box is shown, a stored
procedure is executed: a credit is added to that user, and a credit is
subtracted from the accounts of the users shown as links. Accounts with no
credits must not be listed. So we have lots (LOTS) of users querying and
updating the same table, sometimes with big peaks.
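To give an idea, the procedure does roughly this (a simplified sketch; the
real table and column names are different):

    CREATE FUNCTION show_box(viewer_id integer, shown_ids integer[])
    RETURNS void AS $$
    BEGIN
        -- credit the user whose box was rendered
        UPDATE credits SET credits = credits + 1
         WHERE user_id = viewer_id;
        -- debit each account that appeared as a link in the box
        UPDATE credits SET credits = credits - 1
         WHERE user_id = ANY (shown_ids);
    END;
    $$ LANGUAGE plpgsql;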
Our first attempt was to split that table in two: one for the actual credits
and another for the users. So only the credits table gets updated on every
request, and a trigger on it keeps a flag field in the users table up to
date, saying whether the user has credits. This had a good impact, but I
guess it's not enough.
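The trigger is essentially this (again simplified, names made up):

    CREATE FUNCTION sync_has_credits() RETURNS trigger AS $$
    BEGIN
        -- mirror the balance into the flag the listing query reads
        UPDATE users SET has_credits = (NEW.credits > 0)
         WHERE id = NEW.user_id;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER credits_flag AFTER UPDATE ON credits
        FOR EACH ROW EXECUTE PROCEDURE sync_has_credits();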
For now we only have 23,000 users, but that number is going to grow. Do you
have any advice? Is this workload feasible with Postgres, or would you
recommend a volatile, in-memory approach for the credits instead?
We're using pgpool, and the output of free shows only about 350 MB of RAM in
use.
Some relevant parts of postgresql.conf:
max_connections = 160
shared_buffers = 40000
work_mem = 3096
maintenance_work_mem = 131072
max_fsm_pages = 70000
fsync = false
autovacuum = on
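If I'm reading the 8.1 units right, that works out to roughly:

    shared_buffers       = 40000 x 8 kB pages ~ 312 MB
    work_mem             = 3096 kB            ~ 3 MB per sort/hash
    maintenance_work_mem = 131072 kB          = 128 MB

which seems consistent with the ~350 MB that free reports.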
Any help would be really appreciated.
Thanks in advance,
Mauro.