I have a production database with 500,000 time series data points in one
table. There have been no performance problems so far. You just have to
make sure your queries are using the indexes you have created.
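EXPLAIN is the easy way to check that a query actually uses an index. A
minimal sketch (the table, column, and index names here are hypothetical):

```sql
-- Ask the planner how it will run the query.
EXPLAIN SELECT *
FROM readings
WHERE recorded_at BETWEEN '2002-07-01' AND '2002-07-31';

-- In the plan output you want to see something like
--   Index Scan using readings_recorded_at_idx on readings
-- rather than
--   Seq Scan on readings
```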
I talked to Tom Lane and Bruce Momjian at the O'Reilly Open Source
Conference about storing time series data in PostgreSQL. Tom said
relational databases are not really designed for storing time series data
because there is no assumed ordering among the rows in a table. Bruce
suggested I divide the data points into two tables: the historical,
unchanging data in one table and the most recent, changing data in another.
Next, Bruce suggested clustering the historical table on its index (with
the CLUSTER command) to take advantage of the row ordering. Last, Bruce
said to create a view to make the two tables appear as one unified table.
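Bruce's three suggestions could be sketched like this; all table, column,
and index names are made up for illustration, and this is only one way to
lay it out:

```sql
-- Historical, unchanging data.
CREATE TABLE readings_hist (
    recorded_at timestamp NOT NULL,
    value       float8    NOT NULL
);

-- Most recent, still-changing data.
CREATE TABLE readings_cur (
    recorded_at timestamp NOT NULL,
    value       float8    NOT NULL
);

CREATE INDEX readings_hist_ts_idx ON readings_hist (recorded_at);

-- Physically reorder the historical table to match the index,
-- so range scans over time read mostly sequential pages.
CLUSTER readings_hist_ts_idx ON readings_hist;

-- Present the two tables as one unified table.
CREATE VIEW readings AS
    SELECT recorded_at, value FROM readings_hist
    UNION ALL
    SELECT recorded_at, value FROM readings_cur;
```

Note that CLUSTER is a one-time reordering, so the historical table would
need re-clustering after new rows are moved into it.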
In summary, I've had no problems storing time series data in PostgreSQL
using a very simple database structure. Over time, I plan to add data and
if performance slows I'll investigate the recommendations of Bruce.
George
> Date: Fri, 02 Aug 2002 04:28:49 GMT
> From: "Ron Harter" <rharter@pobox.com>
> To: pgsql-general@postgresql.org
> Subject: Temporal Database
> Message-ID: <5Un29.725$my2.36645232@newssvr17.news.prodigy.com>
>
>
> Has anyone used Postgres to set up a temporal database? What was the
> performance like? How was retrieval of the data?
> Any comments or problems would be appreciated. We are looking at the
> possibility of using Postgres to store long term historical data.
>
> The number of data points could be as high as 75,000 points. I would
> expect between 250 and 400 updates per second under normal load, but
> during a disturbance the update rate could be as high as several
> thousand per second.
>
> Any comments are appreciated
>
> --
> Ron Harter | Tongue, Tied and Twisted
> rharter@pobox.com | Just an earth bound misfit ... I
> | Pink Floyd