PostgreSQL performance question - Mailing list pgsql-general

From Mark Jones
Subject PostgreSQL performance question
Date
Msg-id sthaj-ov7.ln1@news.hackerjones.org
Responses Re: PostgreSQL performance question  (Rod Taylor <rbt@rbt.ca>)
List pgsql-general
Hello

I am working on a project that acquires real-time data from an external
device, which I need to store and then be able to search through and
retrieve quickly. My application receives packets of data ranging in size
from 300 to 5000 bytes every 50 milliseconds, for a minimum duration of
24 hours, before the data is purged or archived off disk. There are
several fields in the data that I would like to be able to search on to
retrieve the data at a later time. It seems that using a SQL database
such as PostgreSQL or MySQL would make this task much easier. My
questions are: is a SQL database such as PostgreSQL able to handle this
kind of activity, saving a 5000-byte record 20 times a second? And how
well will it perform when searching through a database containing nearly
two million records, about 8 - 9 gigabytes of data, assuming that I have
adequate computing hardware? I am trying to determine whether a SQL
database would work well for this, or whether I need to write my own
custom database for this project. If anyone has experience doing anything
similar with PostgreSQL, I would love to hear about your findings.
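
To make the scenario concrete, here is a minimal sketch of the kind of
table I have in mind. The table and column names below are only
placeholders for illustration; the real data has several searchable
fields:

  -- Rough sketch of the table (names are placeholders):
  CREATE TABLE packet_log (
      id          bigserial   PRIMARY KEY,
      captured_at timestamptz NOT NULL,  -- arrival time of the packet
      channel     integer     NOT NULL,  -- example of a searchable field
      payload     bytea       NOT NULL   -- raw packet, 300 to 5000 bytes
  );

  -- Indexes on the fields I expect to search on later:
  CREATE INDEX packet_log_captured_at_idx ON packet_log (captured_at);
  CREATE INDEX packet_log_channel_idx ON packet_log (channel);

  -- Each incoming packet would be a single insert, about 20 times a second:
  INSERT INTO packet_log (captured_at, channel, payload)
  VALUES (now(), 7, decode('0011223344', 'hex'));

At 20 inserts per second that works out to 20 * 86400 = 1,728,000 rows
per day, and at up to 5000 bytes per payload that is roughly 8.6 GB,
which is where the figures above come from.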

Thanks
Mark


