On Wed, Jan 11, 2012 at 3:07 AM, Kirill Müller
<kirill.mueller@ivt.baug.ethz.ch> wrote:
> Hi all,
>
> we have a Postgres/PostGIS database with 100+ schemas and 200+ tables in
> each schema, generated automatically. When adding a new PostGIS layer in
> QGis, the application obviously enumerates all tables, and this takes
> minutes. Even browsing the database in pgAdmin3 is horribly slow -- it takes
> several seconds just to open a schema (click on a schema's "+" in the tree
> view).
>
> The problems occurred only after adding that many schemas to the database.
> Before, with only 10+ schemas, the performance was acceptable.
>
> Is this a known limitation of Postgres, or perhaps a misconfiguration of our
> installation? What would you suggest to improve performance here? We
> currently don't have administration rights for the database or login rights
> on the server machine (Linux), but I think we'll need to take care of that.
This is a problem I've run into before, but I can't find the previous
post on it. When you run a \d command, if you run top on the server,
do you see a single CPU spinning hard for the duration of that one
command? If so, it's a problem on the Postgres server side, which is
what I had on one server with ~40k objects in it.
Off the top of my head, I remember something like this helping (ALTER
FUNCTION wants the argument signature):

    alter function pg_table_is_visible(oid) cost 10;

But I'm not sure that's it. Maybe Tom Lane can pipe up on this.
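If you want to see the current setting before and after changing it,
something like this should do (run the ALTER as superuser; the value
10 is just a guess to experiment with):

    select proname, procost
      from pg_proc
     where proname = 'pg_table_is_visible';

    alter function pg_table_is_visible(oid) cost 10;

As I recall, the idea is that raising the cost discourages the planner
from evaluating the visibility check early against every row in the
big catalog joins that psql's \d queries generate, which is what gets
painful with tens of thousands of objects.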