Kilmer C. de Souza said:
> I am a new user of PostgreSQL, and I have some questions about its
> performance in a scenario with a high request rate.
>
> Let's picture an imaginary scenario:
> In my system (Debian Linux), there are 200,000,000 records in the
> database and a total of 10,000 different users.
> (The manual states that there is a main process called postmaster,
> which starts a new process for each different request and each
> different user ... I don't understand this very well ... please
> correct me if I'm wrong.)
> If all users try to access the database through the web at the
> same time, what happens:
> 1. With the OS? Will it crash?
> 2. Will the postmaster start up 10,000 different processes at the
> same time?
> 3. What about performance? Is there any performance degradation?
> 4. What is the best solution for this problem?
> 5. How many simultaneous requests can the postmaster handle
> without decreasing performance?
Depending on your web development environment (Java, PHP, .NET, etc.),
you should be able to use some mechanism that provides a pool of
connections to the database. Each request does not open a new connection
(and then release it); instead it gets a connection from the pool, uses
it, and returns it to the pool when done.
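
As a rough sketch of the idea, here is what pooled access can look like
in Java using Apache Commons DBCP (version 2). The JDBC URL, credentials,
table name, and pool sizes below are made-up placeholders, not settings
from the site described next:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import org.apache.commons.dbcp2.BasicDataSource;

public class PooledLookup {
    // One shared pool for the whole application, capped at 8 connections.
    private static final BasicDataSource pool = new BasicDataSource();
    static {
        pool.setUrl("jdbc:postgresql://localhost:5432/mydb"); // placeholder
        pool.setUsername("webuser");                          // placeholder
        pool.setPassword("secret");                           // placeholder
        pool.setMaxTotal(8);    // cap on simultaneous database connections
        pool.setInitialSize(2); // connections opened eagerly at startup
    }

    public static String lookupName(int id) throws Exception {
        // getConnection() borrows a connection from the pool; closing it
        // (implicit in try-with-resources) returns it to the pool instead
        // of tearing down the backend process the postmaster forked.
        try (Connection conn = pool.getConnection();
             PreparedStatement st =
                 conn.prepareStatement("SELECT name FROM users WHERE id = ?")) {
            st.setInt(1, id);
            try (ResultSet rs = st.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}

The important point is that the pool size, not the number of users,
determines how many backend processes PostgreSQL actually runs.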
On a recent Java website I was involved with, which served on average
3 page requests per second, we used a pool of only 8 connections, and
some page requests required multiple queries to generate all the data
before rendering. I can't remember the number of concurrent users, but
you get the idea: even a small pool can service lots of traffic (if
your SQL queries are nicely optimized).
I'm afraid I cannot answer your specific questions about how many
simultaneous active connections Postgres will support, but I suspect
that it is limited by memory and hardware. Perhaps someone else can help.
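
One concrete setting worth knowing about, though: the hard cap on
simultaneous backends is max_connections in postgresql.conf, and the
postmaster refuses connections beyond it. The value below is purely
illustrative; tune it against your available memory:

# postgresql.conf -- illustrative value, not a recommendation
max_connections = 100   # connections past this limit are rejected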
John Sidney-Woollett