Large datasets - Mailing list pgsql-novice

From: Nathaniel
Subject: Large datasets
Msg-id: 535849.62903.qm@web25003.mail.ukl.yahoo.com
List: pgsql-novice
Hello all,

I want to use a database to store lots of records, quite a few million. I'm looking for advice on the best way to add
and retrieve many thousands of records at one go from within a C program.

Each record is a scientific observation, comprising a few double-precision floating point numbers along with a few
other fields. I'll have two C programs: one populates the DB in real time with data as it becomes available, and the
other analyses/processes the data, attempting to complete analysis before more data comes in.
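
For concreteness, each observation looks something like this on the C side, with (I imagine) a matching table on the
Postgres side. The names here are just placeholders, not my real schema:

/* Roughly what one observation looks like in my C code (field names
 * are placeholders, not my real schema). */
typedef struct {
    double t;          /* observation timestamp            */
    double value1;     /* a few double-precision readings  */
    double value2;
    int    sensor_id;  /* plus a few other fields          */
} Observation;

/* ...and the table I'd create to match:
 *
 *   CREATE TABLE observation (
 *       t          double precision,
 *       value1     double precision,
 *       value2     double precision,
 *       sensor_id  integer
 *   );
 */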

In detail, I want to:

1. Write an array of C structures as new records in the DB.
2. Send a query to retrieve some subset of DB records, and translate them into C structures.

Sounds pretty standard stuff, but given that both steps may be dealing with on the order of 100,000 records at a time,
and there is some pressure to do this quickly (so analysis doesn't lag behind data acquisition), what's the best way to
code up interaction with the database?
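
To show what I'm contemplating for step 1, here is an untested sketch using COPY ... FROM STDIN through libpq (I
believe PQputCopyData/PQputCopyEnd appeared in 7.4; the table, struct and function names follow the placeholder schema
above). Is this the right direction, or is there something better?

#include <stdio.h>
#include <stdlib.h>
#include <libpq-fe.h>

typedef struct {            /* as sketched above */
    double t, value1, value2;
    int    sensor_id;
} Observation;

/* Bulk-load an array of observations via COPY ... FROM STDIN.
 * Returns 0 on success, -1 on error. */
int store_observations(PGconn *conn, const Observation *obs, size_t n)
{
    PGresult *res;
    size_t    i;

    res = PQexec(conn, "COPY observation FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN) {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return -1;
    }
    PQclear(res);

    for (i = 0; i < n; i++) {
        char line[256];
        int  len;

        /* One text-format COPY row: tab-separated, newline-terminated. */
        len = snprintf(line, sizeof line, "%.17g\t%.17g\t%.17g\t%d\n",
                       obs[i].t, obs[i].value1, obs[i].value2,
                       obs[i].sensor_id);
        if (PQputCopyData(conn, line, len) != 1) {
            fprintf(stderr, "PQputCopyData: %s", PQerrorMessage(conn));
            return -1;
        }
    }

    if (PQputCopyEnd(conn, NULL) != 1) {
        fprintf(stderr, "PQputCopyEnd: %s", PQerrorMessage(conn));
        return -1;
    }

    /* The COPY's final status arrives as an ordinary result. */
    res = PQgetResult(conn);
    if (PQresultStatus(res) != PGRES_COMMAND_OK) {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return -1;
    }
    PQclear(res);
    return 0;
}

Would one COPY per batch like this beat issuing lots of INSERTs by enough to keep up with the incoming data?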

I know a little about ECPG (but that seems very inefficient) and have a little experience from a couple of years ago of
writing a backend function that made lots of calls like 'DatumGetFloat8' (but presumably that approach has the benefit
of directly accessing the database files, being a backend call). In the manual I see mention of 'prepared statements'
and the like, which I know nothing about, so please treat me like the ignorant newbie that I am.
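
For step 2, the best I've been able to piece together from the manual is a server-side PREPARE plus PQexecPrepared,
pulling the rows back in text form and converting them myself. Again this is untested, and the statement and function
names are just mine:

#include <stdio.h>
#include <stdlib.h>
#include <libpq-fe.h>

typedef struct {            /* as sketched above */
    double t, value1, value2;
    int    sensor_id;
} Observation;

/* Once per connection, prepare the query (7.4 has no client-side
 * PQprepare() as far as I can tell, so PREPARE goes in as plain SQL):
 *
 *   PQexec(conn,
 *       "PREPARE get_obs (double precision, double precision) AS "
 *       "SELECT t, value1, value2, sensor_id FROM observation "
 *       "WHERE t >= $1 AND t < $2");
 */

/* Fetch all observations with t in [t0, t1) into a malloc'd array.
 * Returns the row count, or -1 on error. */
int fetch_observations(PGconn *conn, double t0, double t1, Observation **out)
{
    char        p0[32], p1[32];
    const char *params[2];
    PGresult   *res;
    int         n, i;

    snprintf(p0, sizeof p0, "%.17g", t0);
    snprintf(p1, sizeof p1, "%.17g", t1);
    params[0] = p0;
    params[1] = p1;

    /* Text-format parameters and results: simple, if not the fastest. */
    res = PQexecPrepared(conn, "get_obs", 2, params, NULL, NULL, 0);
    if (PQresultStatus(res) != PGRES_TUPLES_OK) {
        fprintf(stderr, "SELECT failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return -1;
    }

    n = PQntuples(res);
    *out = malloc((n > 0 ? n : 1) * sizeof(Observation));
    if (*out == NULL) {
        PQclear(res);
        return -1;
    }
    for (i = 0; i < n; i++) {
        (*out)[i].t         = strtod(PQgetvalue(res, i, 0), NULL);
        (*out)[i].value1    = strtod(PQgetvalue(res, i, 1), NULL);
        (*out)[i].value2    = strtod(PQgetvalue(res, i, 2), NULL);
        (*out)[i].sensor_id = atoi(PQgetvalue(res, i, 3));
    }
    PQclear(res);
    return n;
}

Would binary result format (or something backend-side) be noticeably faster than this for ~100,000 rows, or is the
text conversion overhead not worth worrying about?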

Nathaniel

PS. If the mechanism works in Postgres 7.4 then even better, as I'm working with a legacy DB that there is much
resistance to upgrading. Don't ask!





