Re: poor performance when recreating constraints on large tables - Mailing list pgsql-performance

From Samuel Gendler
Subject Re: poor performance when recreating constraints on large tables
Date
Msg-id BANLkTim5GO2k0m7E2=KevdZrMPwH-9aCDg@mail.gmail.com
In response to Re: poor performance when recreating constraints on large tables  ("Kevin Grittner" <Kevin.Grittner@wicourts.gov>)
Responses Re: poor performance when recreating constraints on large tables  (Greg Smith <greg@2ndquadrant.com>)
List pgsql-performance


On Wed, Jun 8, 2011 at 12:53 PM, Kevin Grittner <Kevin.Grittner@wicourts.gov> wrote:
Samuel Gendler <sgendler@ideasculptor.com> wrote:

> The planner knows how many rows are expected for each step of the
> query plan, so it would be theoretically possible to compute how
> far along it is in processing a query based on those estimates,
> wouldn't it?

And it is sometimes off by orders of magnitude.  How much remaining
time do you report when the number of rows actually processed so far
is five times the estimated rows that the step would process?  How
about after it chugs on from there to 20 times the estimated row
count?  Of course, on your next query it might finish after
processing only 5% of the estimated rows....

-Kevin

Sure, but if a query is slow enough for a time estimate to be useful, odds are good that stats that far out of whack would themselves be interesting to whoever is looking at the estimate, so showing some kind of 'N/A' response once things have drifted that far wouldn't be unwarranted.  Not that I'm suggesting any of this is a particularly useful exercise; I'm just playing with the original thought-experiment suggestion.
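
Just to make the thought experiment concrete, here's a rough sketch of the kind of clamping I have in mind (Python, purely illustrative; none of these names correspond to anything in the backend): report a fraction while the actual row count stays within the planner's estimate, and degrade to 'N/A' once it has blown well past it.

    # Hypothetical sketch only -- estimated_rows / actual_rows are
    # illustrative names, not a real PostgreSQL API.
    def progress_estimate(estimated_rows, actual_rows, slack=1.0):
        """Fraction complete for one plan node, or None ("N/A") once the
        actual row count has overrun the estimate by more than `slack`."""
        if estimated_rows <= 0:
            return None                  # no usable estimate at all
        ratio = actual_rows / estimated_rows
        if ratio > 1.0 + slack:
            return None                  # estimate is clearly wrong; a
                                         # percentage would only mislead
        return min(ratio, 1.0)

    # progress_estimate(1_000_000, 250_000)   -> 0.25
    # progress_estimate(1_000_000, 5_000_000) -> None, i.e. "N/A"

Whether the cutoff is 2x the estimate or 20x doesn't much matter; the point is just that the display falls back to 'N/A' instead of cheerfully reporting 500% complete.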
