Tim Dudgeon <tdudgeon.ml@gmail.com> writes:
> I'm having problems optimising a query that's very slow due to a sub-query.
I think it might get better if you could fix this misestimate:
> " -> Bitmap Index Scan on idx_sp_property_id
> (cost=0.00..33.90 rows=1146 width=0) (actual time=51.656..51.656 rows=811892 loops=366)"
> " Index Cond: (property_id = ANY ('{1,643413,1106201}'::integer[]))"
1146 estimated vs 811892 actual is pretty bad, and it doesn't seem like
this is a very hard case to estimate. Are the stats for structure_props
up to date? Maybe you need to increase the statistics target for the
property_id column.
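If stale or too-coarse stats are the cause, the usual remedy is along these lines (a sketch only — the target of 1000 is an illustrative value, not a recommendation tuned to this data):

```sql
-- Collect a larger sample for property_id so the planner can see
-- the skew in that column (the default statistics target is 100).
ALTER TABLE structure_props ALTER COLUMN property_id SET STATISTICS 1000;

-- Rebuild the table's statistics using the new target.
ANALYZE structure_props;
```

After re-analyzing, re-run EXPLAIN ANALYZE and check whether the bitmap index scan's estimated row count moves closer to the actual one.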
Another component of the bad plan choice is this misestimate:
> " -> HashAggregate (cost=1091.73..1091.75 rows=2 width=4) (actual time=2.829..3.212 rows=366 loops=1)"
> " Group Key: structure_props_1.structure_id"
but it might be harder to do anything about that one, since the result
depends on which property_id values are being probed; without cross-column
statistics it may be impossible to do much better.
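For what it's worth, PostgreSQL 10 and later do let you collect a limited form of multi-column statistics; whether the available kinds would improve this particular estimate is uncertain, but the syntax would be roughly (the statistics object name here is made up):

```sql
-- Hypothetical: gather multi-column stats over the two columns involved.
-- The ndistinct and dependencies kinds exist since PG 10; they may or
-- may not help a probe-dependent estimate like this one.
CREATE STATISTICS sp_struct_prop_stats (ndistinct, dependencies)
    ON structure_id, property_id FROM structure_props;
ANALYZE structure_props;
```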
regards, tom lane