On Tue, 25 Jun 2002, Geraldo Lopes de Souza wrote:
>I want to know if there are real-world cases of PostgreSQL supporting
>200 simultaneous connections in a two-tier application.
Our web site and content management system (www.mcgill.ca) is a home
grown system built around Perl, PHP and PostgreSQL running on PPC Linux
(YellowDog). The database isn't taxed very heavily because I designed it
so that the system caches preferences and such (results from SQL queries)
in the file system next to the page. However, whenever I make a
system-wide change (a little under 7000 pages), I have a Perl script that runs
through the file system updating the cached files. During this time, I've
easily seen over 500 connections to the database without any noticeable
effect on the performance of the webserver (that still also does live
queries on many pages if the viewer is anonymous, or all pages if the
viewer is logged in).
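In case it helps to see the idea concretely, here's a rough sketch of the
cache-next-to-the-page scheme (in Python rather than our actual Perl, and
every path and function name below is illustrative, not our real code):

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path("cache")  # hypothetical: cached results live beside the pages


def cache_path(page: str) -> Path:
    # One cache file per page, keyed by a hash of the page's path.
    return CACHE_DIR / (hashlib.sha1(page.encode()).hexdigest() + ".json")


def get_prefs(page: str, query_db) -> dict:
    """Return cached query results for a page; hit the database only on a miss."""
    path = cache_path(page)
    if path.exists():
        return json.loads(path.read_text())
    prefs = query_db(page)  # the expensive SQL query, run only on a cache miss
    CACHE_DIR.mkdir(exist_ok=True)
    path.write_text(json.dumps(prefs))
    return prefs


def rebuild_all(pages, query_db) -> None:
    """Site-wide change: walk every page and refresh its cached file.

    This is the step that briefly opens many database connections at once.
    """
    for page in pages:
        cache_path(page).unlink(missing_ok=True)
        get_prefs(page, query_db)
```

Most requests are then pure file reads; the database only sees traffic on
cache misses and during the occasional full rebuild.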
Our web server is an 867 MHz G4 'QuickSilver' and the Postgres machine is a
dual 800 MHz QuickSilver as well (running in single-CPU mode thanks to a
hardware bug). Both have 1GB of RAM, and the Postgres machine spends
most of its day yawning. :-)
Cheers,
Chris
--
Christopher Murtagh
Webmaster / Sysadmin
Web Communications Group
McGill University
Montreal, Quebec
Canada
Tel.: (514) 398-3122
Fax: (514) 398-2017