Performance issues when the number of records is around 10 million - Mailing list pgadmin-support

From venu madhav
Subject Performance issues when the number of records is around 10 million
Date
Msg-id AANLkTinFR92eAjVJcukjHV8Nrb4CyjKRwayUTLYTcjMv@mail.gmail.com
Responses Re: Performance issues when the number of records is around 10 million  (Guillaume Lelarge <guillaume@lelarge.info>)
List pgadmin-support
Hi all,

In my database application I have a table whose record count can reach 10 million, and insertions can arrive at rates as high as 100 per second at peak times. I have configured Postgres to run autovacuum on an hourly basis. A frontend GUI application (CGI) displays the data from the database. When I try to fetch the last twenty records, the operation takes around 10-15 minutes to complete. This is the query being used:

    select e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name, e.sniff_ip,
           e.sniff_channel, s.sig_config, e.wifi_addr_1, e.wifi_addr_2,
           e.view_status, bssid
      FROM event e, signature s
     WHERE s.sig_id = e.signature
       AND e.timestamp >= '1270449180'
       AND e.timestamp < '1273473180'
     ORDER BY e.cid DESC
     LIMIT 21 OFFSET 10539780;

Can anyone suggest a better way to improve the performance?

Please let me know if you have any further questions.

Thank you,
Venu
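
For context on where the time goes: with LIMIT 21 OFFSET 10539780, Postgres must generate and discard the first 10,539,780 rows of the sorted result before returning 21 rows, so the cost of each page grows with the offset. A common alternative is keyset pagination: remember the cid of the last row displayed and continue from there, so the planner can walk an index and stop after about 21 matches. The sketch below shows one such approach, not the fix proposed in the thread's reply (which is not included here); it assumes event.cid is a monotonically increasing indexed key such as the primary key, :last_cid is a placeholder for the smallest cid shown on the previous page, and event_timestamp_idx is a hypothetical index name.

    -- Hypothetical index on the range-filter column; the literals in the
    -- original query suggest timestamp is stored as an epoch integer.
    CREATE INDEX event_timestamp_idx ON event (timestamp);

    -- Keyset pagination: page by the last cid seen instead of a huge OFFSET.
    -- If cid is the primary key, its existing index can serve the ORDER BY
    -- via a backward index scan that stops once 21 rows qualify.
    SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
           e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
           e.wifi_addr_2, e.view_status, bssid
      FROM event e
      JOIN signature s ON s.sig_id = e.signature
     WHERE e.timestamp >= '1270449180'
       AND e.timestamp <  '1273473180'
       AND e.cid < :last_cid  -- placeholder: smallest cid from the previous page
     ORDER BY e.cid DESC
     LIMIT 21;

Running EXPLAIN ANALYZE on the original statement would show whether the time is spent discarding offset rows or elsewhere in the plan.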
