Re: Parallel queries for a web-application |performance testing - Mailing list pgsql-performance

From Kevin Grittner
Subject Re: Parallel queries for a web-application |performance testing
Date
Msg-id 4C18F97A02000025000324CD@gw.wicourts.gov
In response to Parallel queries for a web-application |performance testing  (Balkrishna Sharma <b_ki@hotmail.com>)
List pgsql-performance
Balkrishna Sharma <b_ki@hotmail.com> wrote:

> I wish to do performance testing of 1000 simultaneous read/write
> to the database.

You should definitely be using a connection pool of some sort.  Both
your throughput and response time will be better that way.  You'll
want to test with different pool sizes, but I've found that a pool
size which keeps the number of active queries in PostgreSQL
somewhere around (number_of_cores * 2) + effective_spindle_count is
near optimal.
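The heuristic above can be sketched as a small helper; the function
name and parameters here are illustrative, not from the original post:

```python
def suggested_pool_size(cores: int, effective_spindle_count: int) -> int:
    """Starting-point pool size per the (cores * 2) + spindles heuristic."""
    return cores * 2 + effective_spindle_count

# Example: an 8-core server with a 4-disk array.
print(suggested_pool_size(8, 4))  # (8 * 2) + 4 = 20
```

Treat the result as a starting point for testing, not a fixed answer;
the post explicitly recommends trying different pool sizes.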

> My question is: Am I losing something by firing these queries
> directly off the server and should I look at firing the queries
> from different IP addresses (as it would happen in a web application)?

If you run the client side of your test on the database server, the
CPU time used by the client will probably distort your results.  I
would try using one separate machine to generate the requests, but
monitor to make sure that the client machine isn't hitting some
bottleneck (like CPU time).  If the client is the limiting factor,
you may need to use more than one client machine.  No need to use
1000 different client machines.  :-)
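One way to generate many concurrent requests from a single client
machine is a worker pool. This is a hedged sketch only: run_query is a
placeholder that sleeps instead of executing SQL, and in a real test it
would issue a query over a pooled database connection:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_query(i: int) -> int:
    # Placeholder for a real database round trip (e.g. via a driver
    # such as psycopg2); here we just simulate some latency.
    time.sleep(0.01)
    return i

def run_load(n_requests: int, n_workers: int) -> int:
    # Fan n_requests out across n_workers concurrent threads and
    # count the completed requests.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(run_query, range(n_requests)))
    return len(results)

print(run_load(100, 20))  # 100 simulated requests through 20 workers
```

While a test like this runs, watch CPU and network usage on the client
box; as noted above, a saturated client will distort the results and you
may need a second client machine.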

-Kevin
