Re: Analyze tool? - Mailing list pgsql-general

From Rob Sargent
Subject Re: Analyze tool?
Date
Msg-id 4CA5E575.8080703@gmail.com
In response to Re: Analyze tool?  (Thom Brown <thom@linux.com>)
Responses Re: Analyze tool?  (Thom Brown <thom@linux.com>)
Re: Analyze tool?  (Robert Gravsjö <robert@blogg.se>)
List pgsql-general
Then to get all statements, would one simply set log_min_duration_statement
to some arbitrarily small value?
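If so, a minimal postgresql.conf sketch might look like the following (the zero threshold is one reading of "arbitrarily small"; a threshold of 0 logs every statement's duration, -1 disables the feature):

```ini
# postgresql.conf -- log the duration of every completed statement
log_min_duration_statement = 0   # 0 = log all statements; -1 = off
logging_collector = on           # capture server log output to files
```

The exact log_line_prefix pgFouine expects is documented in its own manual, so that is left out here.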

On 10/01/2010 04:30 AM, Thom Brown wrote:
> 2010/10/1 Bjørn T Johansen <btj@havleik.no>:
>> We are using both DB2 and PostgreSQL at work and DB2 has a nice tool, i5 Navigator, where one can enable logging of SQL statements and then it will
>> recommend indexes that should/could be created to increase speed...
>> Does there exist a similar tool for PostgreSQL?
>
> You can set log_min_duration_statement to log statements which take
> over a certain amount of time, and then use pgFouine to read the log
> files and identify the most frequently run queries, and the longest
> queries.
>
> You can also use the auto_explain contrib module
> (http://www.postgresql.org/docs/9.0/static/auto-explain.html) to log
> the plans of queries which take too long.  However, I don't think
> pgFouine can use those outputs... at least not yet.
>
> But to find out what indexes you'll need, getting used to reading
> query plans will help, as it will show you more than just where
> sequential scans are taking place.  It will also show you what the
> planner believes a query will cost compared to how much it actually
> costs, which can provide insight into tables which require vacuuming,
> indexes which might need clustering, or table stats which require
> modifying to match your data.
>
> There might be a tool out there for PostgreSQL like you describe,
> although I'm not personally aware of it.
>
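For completeness, a sketch of the auto_explain setup Thom mentions, per the module's documentation (the one-second threshold is just an illustrative value):

```ini
# postgresql.conf -- load auto_explain and log plans of slow queries
shared_preload_libraries = 'auto_explain'
auto_explain.log_min_duration = '1s'   # log the plan of any query over 1s
auto_explain.log_analyze = off         # 'on' adds actual row counts, at a cost
```

Interactively, `EXPLAIN ANALYZE SELECT ...` shows the same estimated-versus-actual comparison for a single query, which is the reading skill Thom recommends building.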
