Fabien COELHO <coelho@cri.ensmp.fr> writes:
> There is something I cannot visualise about the idea being discussed.
What I'm imagining is a separate program that you run, and it connects
to the backend and grabs schema data much like pg_dump does. (In fact
the pg_dump code might possibly be useful as a starting point, at least
for the data-acquisition part of it.) Then it prints out some kind of
report.
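To make that concrete, here's a rough sketch of the sort of thing I'm
describing: plain libpq, one catalog query, and a printed report.  The
particular check (ordinary tables lacking a primary key) is just an
arbitrary example picked for illustration; a real tool would carry a
whole battery of such rules.

/*
 * Toy schema checker: connect, pull facts from the system catalogs,
 * print complaints.  The lone check here is only illustrative.
 */
#include <stdio.h>
#include <libpq-fe.h>

int
main(int argc, char **argv)
{
	PGconn	   *conn = PQconnectdb((argc > 1) ? argv[1] : "");
	PGresult   *res;
	int			i;

	if (PQstatus(conn) != CONNECTION_OK)
	{
		fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
		PQfinish(conn);
		return 1;
	}

	/* ordinary tables with no PRIMARY KEY constraint */
	res = PQexec(conn,
				 "SELECT n.nspname, c.relname "
				 "FROM pg_class c JOIN pg_namespace n ON n.oid = c.relnamespace "
				 "WHERE c.relkind = 'r' "
				 "  AND n.nspname NOT IN ('pg_catalog', 'information_schema') "
				 "  AND NOT EXISTS (SELECT 1 FROM pg_constraint p "
				 "                  WHERE p.conrelid = c.oid AND p.contype = 'p') "
				 "ORDER BY 1, 2");

	if (PQresultStatus(res) != PGRES_TUPLES_OK)
		fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
	else
		for (i = 0; i < PQntuples(res); i++)
			printf("notice: table %s.%s has no primary key\n",
				   PQgetvalue(res, i, 0), PQgetvalue(res, i, 1));

	PQclear(res);
	PQfinish(conn);
	return 0;
}

Everything there is ordinary SQL against pg_catalog, which is exactly
why such a checker doesn't need to live inside the backend.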
This would of course only be able to address problems that are apparent
from static inspection of the schema. A further extension would be to
give it a set of sample queries (perhaps grabbed from the postmaster log
output) to see if it can spot anything wrong with the queries --- here,
lack of applicable indexes would be an example of something that can
only be checked when looking at a particular query.
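For that part, the crudest possible sketch would read the sample queries
one per line, ask the server to EXPLAIN each, and flag any plan that
ends up doing a seqscan.  Grepping the plan text is obviously nothing
like real index advice; it's only meant to show the shape of the thing,
with all the analysis details waved away.

/*
 * Toy query checker: EXPLAIN each input query and complain if the
 * plan contains a sequential scan.  Purely illustrative.
 */
#include <stdio.h>
#include <string.h>
#include <libpq-fe.h>

int
main(int argc, char **argv)
{
	PGconn	   *conn = PQconnectdb((argc > 1) ? argv[1] : "");
	char		line[8192];

	if (PQstatus(conn) != CONNECTION_OK)
	{
		fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
		return 1;
	}

	while (fgets(line, sizeof(line), stdin) != NULL)
	{
		char		sql[8300];
		PGresult   *res;
		int			i;

		line[strcspn(line, "\n")] = '\0';
		if (line[0] == '\0')
			continue;

		snprintf(sql, sizeof(sql), "EXPLAIN %s", line);
		res = PQexec(conn, sql);

		if (PQresultStatus(res) != PGRES_TUPLES_OK)
			fprintf(stderr, "EXPLAIN failed: %s", PQerrorMessage(conn));
		else
			for (i = 0; i < PQntuples(res); i++)
				if (strstr(PQgetvalue(res, i, 0), "Seq Scan") != NULL)
				{
					printf("notice: seqscan in plan for: %s\n", line);
					break;
				}

		PQclear(res);
	}

	PQfinish(conn);
	return 0;
}

Whether you'd trust a string match on the plan text is another question
entirely; the point is only that this class of check needs the live
server's plans, not just the schema.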
> If the tool is "separate" from postgresql as first suggested, would that
> mean it should have its own interface? Would it be a new command?
> What about GUIs such as pgadmin3 or pgaccess?
If you want a GUI, it could be a GUI, though I'd be worried about the
portability price paid to have one. Or are you concerned about whether
a GUI could invoke it? I don't see why not --- the GUIs don't
reimplement pg_dump, do they?
> Or does "separate" only mean that it is a "separate" function of the backend
> that can be triggered by calling existing functions such as "EXPLAIN" or
> "ANALYZE", or new ones such as "CHECK" or "ADVICE", or whatever?
That still leaves us in the situation where only people who are capable
of doing backend programming can help. I hope that a standalone program
would be more understandable and could attract developers who wouldn't
touch the backend.
Also, you'd still have to invent an interface for it --- and the
interface would be constrained by the limits of the FE/BE protocol.
It would have to look like a SQL command that returns a query result,
or possibly NOTICE messages, both of which are pretty confining.
regards, tom lane