High Frequency Inserts to Postgres Database vs Writing to a File - Mailing list pgsql-performance

From Jay Manni
Subject High Frequency Inserts to Postgres Database vs Writing to a File
Date
Msg-id 60B0F2124D07B942988329B5B7CA393D01E5B93FF4@mail2.FireEye.com
Whole thread Raw
List pgsql-performance

Hi:


I have an application wherein a process needs to read data from a stream and store the records for further analysis and reporting. The data in the stream is in the form of variable-length records with clearly defined fields – so it can be stored in a database or in a file. The only caveat is that the rate of records coming in on the stream could be several thousand records a second.


The design choice I am currently faced with is whether to use a postgres database or a flat file for this purpose. My application already maintains a postgres (8.3.4) database for other reasons – so it seemed like the straightforward thing to do. However, I am concerned about the performance overhead of writing several thousand records a second to the database. The same database is being used simultaneously for other activities as well, and I do not want those to be adversely affected by this operation (especially the query times). The advantage of running complex queries to mine the data in various different ways is very appealing, but the performance concerns are making me wonder if just using a flat file to store the data would be a better approach.
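(A common mitigation for this kind of write load, not something from the original post, is to buffer incoming records in memory and flush them to Postgres in batches via COPY rather than issuing one INSERT per record. Below is a minimal Python sketch of that batching layer; the table name, column names, and the psycopg2-based flush function in the comment are illustrative assumptions. The flush callback is injectable, so the batching logic itself needs no live database.)

```python
import io

class BatchWriter:
    """Buffers records and hands them to a flush callback in batches.

    In production the callback might feed the buffered rows to Postgres
    with psycopg2's cursor.copy_from(), which is far cheaper than
    per-record INSERTs at several thousand records a second.
    """

    def __init__(self, flush_fn, batch_size=5000):
        self.flush_fn = flush_fn      # called with a file-like object of rows
        self.batch_size = batch_size
        self.buffer = []

    def add(self, fields):
        # fields: a sequence of column values for one record;
        # rows are accumulated in COPY's default tab-separated text format
        self.buffer.append("\t".join(str(f) for f in fields))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        data = io.StringIO("\n".join(self.buffer) + "\n")
        self.flush_fn(data)
        self.buffer = []

# Hypothetical production flush (table/column names are made up):
# def pg_flush(data):
#     with conn.cursor() as cur:
#         cur.copy_from(data, "stream_records", columns=("field_a", "field_b"))
#     conn.commit()
```

With a batch size in the low thousands, the stream reader only pays the database round-trip cost once per batch, which also bounds the impact on concurrent queries against the same database.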


Anybody have any experience in high frequency writes to a postgres database?


- Jay
