Stephan Szabo wrote:
>> explain delete from login where userid = 'jennie';
>>                         QUERY PLAN
>> -----------------------------------------------------------
>>  Seq Scan on login  (cost=0.00..2045.30 rows=3421 width=6)
>>    Filter: (userid = 'jennie'::text)
>>
>
> Well at 3421 of 96824 it's estimating that the cost is lower, what's
> the explain look like with seqscan turned off (my guess'd be it's
> slightly higher cost). It's possible that random_page_cost should
> be lower, or that perhaps there's some level of clustering in the data
> that's not being picked up.
Yep! You're right, the cost is indeed higher (3363.71 vs. 2045.30 for the
seq scan):
set enable_seqscan to off;
explain delete from login where userid = 'jennie';
                                     QUERY PLAN
------------------------------------------------------------------------------------
 Index Scan using login_userid_idx on login  (cost=0.00..3363.71 rows=4131 width=6)
   Index Cond: (userid = 'jennie'::text)
If I lower random_page_cost to about 2, the index is used instead of the
seq scan. Is it reasonable to have such a setting on a production server?
random_page_cost = 2 is good for this particular query, but could it have
a negative effect on other queries?
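
So far I've only been changing it with a session-level SET, something like
this, so nothing outside my own session is affected:

set random_page_cost = 2;
explain delete from login where userid = 'jennie';
reset random_page_cost;  -- back to the value from postgresql.conf

If it turns out to be safe across the board I'd put random_page_cost = 2
in postgresql.conf instead.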
> You might want to try raising the
> number of statistics buckets and re-analyzing just to see if that helps.
I'm afraid I'm a bit too new at this kind of tweaking... do you mean
"default_statistics_target"? If so, I tried raising it from the default of
10 to as high as 45, but with no result other than vacuum analyze getting
slower. Did I understand your suggestion right?
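
Or did you mean setting the target per column rather than globally? If so,
I'm guessing it would look something like this for this table (the 100 is
just a number I picked):

alter table login alter column userid set statistics 100;
analyze login;

Is that closer to what you had in mind?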
Regards,
Patrik Kudo