Re: random_page_cost = 2.0 on Heroku Postgres - Mailing list pgsql-performance

From Joshua Berkus
Subject Re: random_page_cost = 2.0 on Heroku Postgres
Date
Msg-id 873667728.6754.1329076919645.JavaMail.root@mail-1.01.com
Whole thread Raw
In response to Re: random_page_cost = 2.0 on Heroku Postgres  (Jeff Janes <jeff.janes@gmail.com>)
Responses Re: random_page_cost = 2.0 on Heroku Postgres  (Peter van Hardenberg <pvh@pvh.ca>)
List pgsql-performance
> Is there an easy and unintrusive way to get such a metric as the
> aggregated query times?  And to normalize it for how much work
> happens to have been going on at the time?

You'd pretty much need to do large-scale log harvesting combined with samples of query concurrency taken several times
per minute.  Even that won't "normalize" things the way you want, though, since all queries are not equal in terms of
the amount of data they hit.
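To make the log-harvesting step concrete, here is a minimal sketch of pulling per-query durations out of Postgres logs. It assumes log_min_duration_statement is enabled, which makes Postgres emit lines of the form "LOG:  duration: 123.456 ms  statement: ..."; everything else (function names, the input source) is illustrative.

```python
import re

# Matches the duration reported when log_min_duration_statement is set,
# e.g. "LOG:  duration: 123.456 ms  statement: SELECT 1"
DURATION_RE = re.compile(r"duration: ([\d.]+) ms")

def harvest_durations(log_lines):
    """Yield query execution times in milliseconds from Postgres log lines.

    Lines without a duration entry (checkpoints, connection chatter, etc.)
    are silently skipped.
    """
    for line in log_lines:
        m = DURATION_RE.search(line)
        if m:
            yield float(m.group(1))
```

You would feed this the output of whatever ships your logs (syslog, a log drain, etc.) and pair each sample with a concurrency reading taken around the same time.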

Given that, I'd personally take a statistical approach.  Sample query execution times across a large population of
servers and over a moderate amount of time.  Then apply common tests of statistical significance.  This is why Heroku
has the opportunity to do this in a way that smaller sites could not; they have enough servers to (probably) cancel out
any random activity effects.
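One common significance test that fits this setting, since query-time distributions are heavily skewed, is a permutation test on the two populations (say, execution times sampled under random_page_cost = 4.0 versus 2.0). This is a hedged sketch, not the specific methodology anyone on this thread used; the function name and parameters are mine.

```python
import random
import statistics

def permutation_test(sample_a, sample_b, n_perm=10_000, seed=42):
    """Two-sided permutation test on the difference of mean query times.

    Returns an approximate p-value for the null hypothesis that the two
    samples come from the same distribution: shuffle the pooled samples
    n_perm times and count how often a random split produces a mean
    difference at least as large as the one actually observed.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return hits / n_perm
```

A low p-value across many servers would suggest the cost-parameter change really moved query times, rather than random activity on a few boxes doing so.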

--Josh Berkus
