On Fri, 1 Oct 2010 11:30:59 +0100
Thom Brown <thom@linux.com> wrote:
> 2010/10/1 Bjørn T Johansen <btj@havleik.no>:
> > We are using both DB2 and PostgreSQL at work, and DB2 has a nice tool, i5 Navigator, where one can enable logging of
> > SQL statements and it will then recommend indexes that should/could be created to increase speed...
> > Does there exist a similar tool for PostgreSQL?
>
> You can set log_min_duration_statement to log statements which take
> over a certain amount of time, and then use pgFouine to read the log
> files and identify the most frequently run queries, and the longest
> queries.
Sounds like something that should be tried...
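
For reference, I gather the setup would look roughly like this (the 500 ms threshold, log file name, and pgFouine flags are my own guesses and may differ by version):

    # postgresql.conf: log every statement that runs longer than 500 ms
    log_min_duration_statement = 500

    # then feed the collected log to pgFouine to rank the queries
    pgfouine.php -logtype stderr -file postgresql.log -top 20 > report.html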
>
> You can also use the auto_explain contrib module
> (http://www.postgresql.org/docs/9.0/static/auto-explain.html) to log
> the plans of queries which take too long. However, I don't think
> pgFouine can use those outputs... at least not yet.
Ok, plan B...
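
Judging from that doc page, enabling it for a single session looks as simple as this (the 250 ms threshold is just an arbitrary value I picked):

    LOAD 'auto_explain';
    SET auto_explain.log_min_duration = 250;  -- log plans of statements over 250 ms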
>
> But to find out what indexes you'll need, getting used to reading
> query plans will help, as it will show you more than just where
> sequential scans are taking place. It will also show you what the
> planner believes a query will cost compared to how much it actually
> costs, which can provide insight into tables which require vacuuming,
> indexes which might need clustering, or table stats which require
> modifying to match your data.
Yes, but it would be nice to be pointed in the right direction first; it seems like log_min_duration_statement can
be used for that...
Also, running EXPLAIN involves manual work; it would have been nice to have some automatic procedure...
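
For the archives, the estimate-vs-actual check Thom describes would look something like this (the orders table and predicate are invented for illustration):

    EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42;
    -- In the output, compare the planner's estimate (rows=...) against the
    -- actual figures (actual time=... rows=...); a large mismatch suggests
    -- stale statistics, so run ANALYZE orders; and look at the plan again.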
Thx... :)
BTJ