best way to write large data-streams quickly? - Mailing list pgsql-general

From Mark Moellering
Subject best way to write large data-streams quickly?
Date
Msg-id CAA0uU3XCiReRsK9-4Zsk0Mdhan1dG2Q4dj6iPQiEc-kOJumLyw@mail.gmail.com
Whole thread Raw
Responses Re: best way to write large data-streams quickly?  (Steve Atkins <steve@blighty.com>)
List pgsql-general
Everyone,

We are trying to architect a new system that will have to take several large data streams (a total of ~200,000 parsed files per second) and place them in a database. I am trying to figure out the best way to import that sort of data into Postgres.

I keep thinking I can't be the first to have this problem, and that there are common solutions, but I can't find any. Does anyone know of some sort of method, third-party program, etc., that can accept data from a number of different sources and push it into Postgres as fast as possible?
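[Editor's note: the thread itself does not include code. For context, the standard fast path for bulk ingest in PostgreSQL is the COPY protocol rather than row-by-row INSERTs. Below is a minimal sketch, assuming Python with psycopg2 and a hypothetical table `events(ts, payload)`; the helper names and connection string are illustrative, not from the thread.]

```python
import io

def _escape(value):
    """Escape one value for PostgreSQL COPY text format (tab-delimited)."""
    if value is None:
        return "\\N"  # COPY's representation of NULL
    # Backslash must be escaped first, then the delimiter and newline.
    return (str(value)
            .replace("\\", "\\\\")
            .replace("\t", "\\t")
            .replace("\n", "\\n"))

def rows_to_copy_buffer(rows):
    """Serialize an iterable of row tuples into a file-like object
    suitable for cursor.copy_from()."""
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(_escape(v) for v in row) + "\n")
    buf.seek(0)
    return buf

# Hypothetical usage (connection details are assumptions):
# import psycopg2
# conn = psycopg2.connect("dbname=ingest")
# with conn, conn.cursor() as cur:
#     cur.copy_from(rows_to_copy_buffer(batch), "events",
#                   columns=("ts", "payload"))
```

Batching many parsed files into one COPY call amortizes per-statement and per-commit overhead, which is why responders on lists like this one usually point at COPY (or multiple writer processes each running COPY) for six-figure rows-per-second workloads.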

Thanks in advance,

Mark Moellering
