Thread: Capturing EXPLAIN ANALYZE for a query without discarding the normal result set
I apologise for dropping this out of nowhere. I had an idea about EXPLAIN ANALYZE that would be very useful for the system we are developing and supporting.
It would be really cool if you could direct the EXPLAIN ANALYZE output to a temporary table so that the query being analyzed could execute normally. Something like this:
EXPLAIN ANALYZE INTO a_temp
SELECT ....
Then we could temporarily cause our application to log EXPLAIN ANALYZE information for certain queries without disrupting normal operation of the system. When we notice long-running queries we could then immediately follow up the original query with a select on the temporary table.
We deal with a lot of transient data, so the conditions that cause bad query performance are not always reproducible.
I have no idea how feasible this is, so please feel free to tell me I am an idiot.
- Dave
Re: Capturing EXPLAIN ANALYZE for a query without discarding the normal result set
From: Marti Raudsepp
On Tue, Jan 14, 2014 at 5:06 AM, Dave Cole <davejohncole@gmail.com> wrote:
> It would be really cool if you could direct the EXPLAIN ANALYZE output to a
> temporary table so that the query being analyzed could execute normally.

You can use the auto_explain contrib module to log the query plans of
slow(er) queries:
http://www.postgresql.org/docs/current/static/auto-explain.html

If you Really Need To, you can use the csvlog log format and import that
to a table, but really it's easier to use less/grep/etc.

Regards,
Marti
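[Editor's note: a minimal session-level sketch of the auto_explain setup Marti is referring to; the 500ms threshold is an arbitrary example value, and for capturing queries from a live application you would instead preload the module server-wide via shared_preload_libraries in postgresql.conf.]

```sql
-- Load auto_explain for the current session only (a sketch).
LOAD 'auto_explain';

-- Log plans for any statement that runs longer than this example threshold.
SET auto_explain.log_min_duration = '500ms';

-- Include actual row counts and timings (i.e. EXPLAIN ANALYZE, not plain EXPLAIN).
SET auto_explain.log_analyze = on;
```

The plans then show up in the server log alongside the normal query results, which is the "noninvasive" property being discussed in this thread.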
Re: Capturing EXPLAIN ANALYZE for a query without discarding the normal result set
From: Tom Lane
Dave Cole <davejohncole@gmail.com> writes:
> It would be really cool if you could direct the EXPLAIN ANALYZE output to a
> temporary table so that the query being analyzed could execute normally.

What happens if the current transaction rolls back?

If you want noninvasive explain data, contrib/auto_explain offers
a solution right now.  The info goes to the postmaster log, which is
perhaps less convenient than a temp table for interactive use, but
it doesn't have the rollback problem --- and you can capture data
about queries issued by a live application, without hacking the app.

			regards, tom lane
Re: Capturing EXPLAIN ANALYZE for a query without discarding the normal result set
From: Marti Raudsepp
On Tue, Jan 14, 2014 at 5:13 AM, Marti Raudsepp <marti@juffo.org> wrote:
> You can use the auto_explain contrib module

I just remembered that there's also the pg_stat_plans extension, which
is closer to what you asked:
https://github.com/2ndQuadrant/pg_stat_plans

This one you'll have to build yourself (it's not in contrib).

Regards,
Marti
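[Editor's note: a hypothetical usage sketch once the extension is built and installed; the column names here are assumptions from memory of the project's README, so treat the README itself as authoritative.]

```sql
-- pg_stat_plans also needs shared_preload_libraries = 'pg_stat_plans'
-- in postgresql.conf and a server restart before this will work.
CREATE EXTENSION pg_stat_plans;

-- Inspect accumulated per-plan statistics (column names assumed).
SELECT query, calls, total_time
FROM pg_stat_plans
ORDER BY total_time DESC;
```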
Re: Capturing EXPLAIN ANALYZE for a query without discarding the normal result set
From: Jim Nasby
On 1/13/14, 9:14 PM, Tom Lane wrote:
> Dave Cole <davejohncole@gmail.com> writes:
>> It would be really cool if you could direct the EXPLAIN ANALYZE output to a
>> temporary table so that the query being analyzed could execute normally.
>
> What happens if the current transaction rolls back?
>
> If you want noninvasive explain data, contrib/auto_explain offers
> a solution right now.  The info goes to the postmaster log, which is
> perhaps less convenient than a temp table for interactive use, but
> it doesn't have the rollback problem --- and you can capture data
> about queries issued by a live application, without hacking the app.

The downside is that you then have to trawl through logs, which may not
be easy (or in some hosting environments, possible).

We're actually starting to do the opposite for a lot of automated stuff:
we EXPLAIN ANALYZE a CTAS and then read the temp table back out. But
that's a lot more data copying than saving the EXPLAIN would be (well,
at least normally ;).

As for rollback, dblink or FDW fixes that. Or if you're using a
front-end that's sophisticated enough, you can pull it out yourself and
do whatever with it.
--
Jim C. Nasby, Data Architect       jim@nasby.net
512.569.9461 (cell)                http://jim.nasby.net
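[Editor's note: a sketch of the CTAS pattern Jim describes, with illustrative table and column names. EXPLAIN ANALYZE on a CREATE TABLE AS executes the statement for real, so the query's rows land in the temp table while the timed plan comes back as the EXPLAIN output.]

```sql
-- Run the query under investigation and capture its timed plan at once.
EXPLAIN ANALYZE
CREATE TEMP TABLE slow_query_result AS
SELECT o.customer_id, count(*) AS order_count
FROM orders o                       -- illustrative table
GROUP BY o.customer_id;

-- The result set is then fetched in a second, cheap step.
SELECT * FROM slow_query_result;
```

This is the extra data copying Jim mentions: the rows are written to the temp table and read back, rather than streamed straight to the client.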