On Mon, 11 Nov 2002, Shaun Thomas wrote:
> On Mon, 11 Nov 2002, Henrik Steffen wrote:
>
> > > - How many clients simultaneously connecting to it?
> > one web server with at most 50 instances, approximately 10,000 users a
> > day and about 150,000 page views daily. All pages are generated on the
> > fly by mod_perl, connecting to the DB server.
>
> Aha. What kind of web-side data caching are you doing? That alone can
> drop your load down to below 1. Even something simple like a one-hour
> cache, or one you can expire manually, can work wonders for database
> usage. So far, the only thing we've found that doesn't really fit this
> model are full text searches.
>
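
The TTL-plus-manual-expiry idea above is simple to sketch. This is Python
rather than the poster's mod_perl stack, and `fetch_front_page` is an
illustrative stand-in for a real query, but the pattern carries over directly:

```python
import time

class TTLCache:
    """Cache computed results for a fixed number of seconds,
    with a hook to expire an entry by hand."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_timestamp, value)

    def get(self, key, compute):
        now = time.time()
        hit = self._store.get(key)
        if hit is not None and hit[0] > now:
            return hit[1]                 # fresh cached copy: no DB hit
        value = compute()                 # miss or stale: run the real query
        self._store[key] = (now + self.ttl, value)
        return value

    def expire(self, key):
        # Manual expiry, e.g. right after an UPDATE touches this data.
        self._store.pop(key, None)

# Usage: wrap the expensive database call.
calls = []
def fetch_front_page():
    calls.append(1)                       # stands in for a real SELECT
    return "<html>front page</html>"

cache = TTLCache(ttl_seconds=3600)        # the one-hour cache mentioned above
page1 = cache.get("front_page", fetch_front_page)
page2 = cache.get("front_page", fetch_front_page)  # served from cache
cache.expire("front_page")                # content changed: drop the entry
page3 = cache.get("front_page", fetch_front_page)  # recomputed
```
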
> Here, the biggest improvement for our DB server came from *not* having
> all nine of our webservers opening 50+ connections per second, which we
> achieved mainly through caching. Adding another CPU will work too, but
> if you want a long-term solution rather than just throwing hardware at
> the problem, see if you can work caching in there somehow.
>
> Since you know you're using Pg.pm (switch to DBI with DBD::Pg, trust me
> on this one), you should have little problem caching either your result
> set or even the whole resulting page, with the few non-cacheable parts
> filled in per request. Not only will that reduce page-load time, but
> the strain on your database as well.
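
To make the whole-page idea concrete: cache the rendered page shell and
rebuild only the non-cacheable fragment on each request. A minimal Python
sketch (the poster's stack is mod_perl; `run_query`, the template, and the
one-hour TTL are illustrative stand-ins):

```python
import time

_page_cache = {}   # url -> (expiry_timestamp, html_template)
PAGE_TTL = 3600    # cache the page shell for an hour

def render_page(url, run_query, username):
    """Serve a cached page shell; only the per-user fragment
    (the non-cacheable part) is rebuilt on every request."""
    now = time.time()
    cached = _page_cache.get(url)
    if cached is None or cached[0] <= now:
        rows = run_query(url)             # the expensive SELECT happens here
        template = "<h1>%s</h1><p>Hello, {user}!</p>" % ", ".join(rows)
        _page_cache[url] = (now + PAGE_TTL, template)
    else:
        template = cached[1]              # no database work at all
    return template.format(user=username)

# Usage: two different users hit the same page; the query runs once.
queries = []
def run_query(url):
    queries.append(url)                   # stands in for a real SELECT
    return ["News", "Search"]

a = render_page("/front", run_query, "henrik")
b = render_page("/front", run_query, "shaun")
```
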
Agreed. I highly recommend Squid as a caching proxy: it's powerful, fast,
and open source. It's included with most Linux distributions, and if it
isn't bundled with your BSD, it's almost certainly available as a port.
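
Running Squid as an accelerator in front of the webserver takes only a few
lines of squid.conf. The directives below are from the Squid 2.x series and
vary between versions, so treat this as a sketch and check your own
squid.conf.default:

```
# Squid 2.x accelerator (reverse-proxy) sketch -- directive names
# differ across Squid versions; verify against your version's docs.
http_port 80                      # Squid answers client requests here
httpd_accel_host 127.0.0.1        # the backend Apache/mod_perl server
httpd_accel_port 8080             # ...listening on this port
httpd_accel_single_host on        # forward everything to that one backend
httpd_accel_with_proxy off        # pure accelerator, not a general proxy
```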