I have a cron job that updates the statistics on the "doti_sensor_report" table on the first Saturday of every month. Do you think I should regenerate these statistics more often? This table receives streaming inserts at a volume of about 350 million tuples per month.
I'll generate new stats over the weekend, and then run the query again to get a new plan.
thanks
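As a rough sketch of analyzing the active monthly partition more often (the partition name is taken from the plan below; the nightly schedule and the autovacuum thresholds are illustrative assumptions, not recommendations made in this thread):

```sql
-- Option 1: analyze the current monthly partition nightly from cron, e.g.
--   0 2 * * *  psql -d mydb -c "ANALYZE doti_sensor_report_y2014m09;"
ANALYZE doti_sensor_report_y2014m09;

-- Option 2: let autovacuum re-analyze this partition sooner. With ~185M
-- rows, the default autovacuum_analyze_scale_factor (0.1) waits for
-- ~18.5M new rows before re-analyzing; lowering it per-table makes the
-- statistics refresh much earlier under heavy streaming inserts.
ALTER TABLE doti_sensor_report_y2014m09
  SET (autovacuum_analyze_scale_factor = 0.005,
       autovacuum_analyze_threshold   = 100000);
```

Either approach keeps the planner's row estimates closer to reality on a partition that grows by hundreds of millions of tuples a month.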
From: Victor Yegorov [vyegorov@gmail.com] Sent: Friday, September 26, 2014 3:15 PM To: Burgess, Freddie Subject: Re: [PERFORM] Very slow postgreSQL 9.3.4 query
Performance data is included in the attachment, via EXPLAIN ANALYZE output; the query takes approximately 6 minutes to return the count to the UI. The amount of data processed is also in the attachment: a 185-million-row partition.
It looks like your statistics are off:
-> Index Scan using idx_sensor_report_query_y2014m09 on doti_sensor_report_y2014m09 this__1 (cost=0.57..137498.17 rows=3883 width=0) (actual time=168.416..348873.308 rows=443542 loops=1)
The optimizer expects to find ~4k rows, while in reality there are two orders of magnitude more rows (~114x) matching the condition.
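One common way to narrow that kind of misestimate is to raise the per-column statistics target on the predicate column(s) and re-analyze. A sketch (the column name `sensor_time` is a placeholder, since the query's predicate columns are not shown here):

```sql
-- Sample more rows and build larger histograms/MCV lists for the
-- column(s) used in the index condition, so the planner's row
-- estimate is closer to the ~443k rows actually matched.
ALTER TABLE doti_sensor_report_y2014m09
  ALTER COLUMN sensor_time SET STATISTICS 1000;  -- default is 100

ANALYZE doti_sensor_report_y2014m09;
```

With a 100x underestimate, the planner may be choosing an index scan where a bitmap scan (or a different join strategy upstream) would be cheaper; fresher, finer-grained statistics give it a fair chance to pick the better plan.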