You can do in-memory transfers using COPY by streaming over STDIN/STDOUT instead of going through a file. Performance is very good if you're on the same machine as the SQL server. As far as I could see, the transfer never touched the disk (once I had loaded the data into memory from a file), and I was getting blistering insert speeds. This was back in 2009, but if I remember right, COPY used this way was about 50x the speed of traditional INSERT commands, even when a bunch of inserts were wrapped in a single transaction.
I dug up the code I used back then - it's in Ruby, using ActiveRecord, and dates from 2009 (it hasn't been used since). It may still be useful as a starting point for porting to your environment: https://gist.github.com/science/15e97e414d5666c2f486
Obviously the network becomes the likely bottleneck if you're not on the same box.
> Anyway, taking the function name 'bulk' into account - I think you want
> to find the best way to insert a large number of rows/records.
>
> Review the 'copy' command to populate a database:
> http://www.postgresql.org/docs/9.1/static/populate.html - read
> carefully as it will increase performance in a dramatic way.
I am aware of the COPY command, but we are talking about an application inserting rows from a C++ task, with the values coming from variables. Does COPY handle that? COPY looks closer to Oracle's SQL*Loader.
The DB2 bulk-copy API is very good; we have applications inserting tens of rows in a single call.