Performance issues when the number of records are around 10 Million - Mailing list pgsql-performance

From: venu madhav
Subject: Performance issues when the number of records are around 10 Million
Date:
Msg-id: AANLkTin1f-9M0iNmIEtU7YlvEsVAt6TmHE1vDk3iUmQq@mail.gmail.com
Responses:
  Re: Performance issues when the number of records are around 10 Million  ("Kevin Grittner" <Kevin.Grittner@wicourts.gov>)
  Re: Performance issues when the number of records are around 10 Million  ("Jorge Montero" <jorge_montero@homedecorators.com>)
  Re: Performance issues when the number of records are around 10 Million  (Shrirang Chitnis <Shrirang.Chitnis@hovservices.com>)
  Re: Performance issues when the number of records are around 10 Million  (Josh Berkus <josh@agliodbs.com>)
List: pgsql-performance
Hi all,
      In my database application, I have a table that can grow to around 10 million records, with insertions arriving at rates as high as 100 per second at peak times. I have configured Postgres to run autovacuum on an hourly basis. A front-end GUI application, written as a CGI program, displays the data from this table. When I try to fetch the last twenty records, the operation takes around 10-15 minutes to complete. This is the query that is used:

SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, bssid
FROM event e, signature s
WHERE s.sig_id = e.signature
  AND e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
ORDER BY e.cid DESC
LIMIT 21 OFFSET 10539780;
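The OFFSET of roughly 10.5 million forces the server to read and discard that many rows before it can return the twenty-one that are wanted, which by itself can account for minutes of runtime. A keyset-style variant that remembers the last cid already displayed avoids the large OFFSET entirely. Below is a minimal sketch, assuming cid is a monotonically increasing primary key; the boundary value 10539800 is hypothetical and would in practice be the smallest cid on the page currently shown:

-- Keyset pagination sketch: fetch the next page of events older than
-- the last cid already displayed. 10539800 is a placeholder value,
-- not taken from the original post.
SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, bssid
FROM event e, signature s
WHERE s.sig_id = e.signature
  AND e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
  AND e.cid < 10539800
ORDER BY e.cid DESC
LIMIT 21;

With an index on event (cid), which the primary key would already provide if cid is the key, this form reads only the rows it returns instead of scanning past ten million of them.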

Can anyone suggest a better approach to improve the performance of this query?

Please let me know if you have any further questions.


Thank you,
Venu
