blackwater dev <blackwaterdev@gmail.com> wrote:
> I have some PHP code that will be pulling in a file via FTP. This file will
> contain 20,000+ records that I then need to pump into the Postgres db. These
> records represent a subset of the records in a certain table. I basically
> need an efficient way to pump these rows into the table, replacing matching
> rows (based on id) already there and inserting ones that aren't. Sort of
> looping through the result and inserting or updating based on the presence
> of the row. What is the best way to handle this? This is something that
> will run nightly.
Insert your data into an extra table and use regular SQL to
insert/update the destination table. You can use COPY to load the data
into the extra table; this is very fast, but you need a suitable
file format for it.
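
A minimal sketch of the nightly job, assuming a staging table
"import_tmp" with the same columns as the destination table "items",
keyed on "id". All names and the file path are placeholders, and the
COPY options depend on your file format and PostgreSQL version:

    BEGIN;
    TRUNCATE import_tmp;

    -- bulk-load the fetched file (use \copy from psql if the file
    -- lives on the client rather than on the server)
    COPY import_tmp FROM '/tmp/nightly.dat';

    -- overwrite rows whose id already exists in the destination
    UPDATE items
       SET name  = t.name,
           price = t.price
      FROM import_tmp t
     WHERE items.id = t.id;

    -- insert the rows that are not there yet
    INSERT INTO items (id, name, price)
    SELECT t.id, t.name, t.price
      FROM import_tmp t
     WHERE NOT EXISTS (SELECT 1 FROM items WHERE items.id = t.id);

    COMMIT;

Doing the UPDATE and INSERT in one transaction keeps the destination
table consistent even if the load fails halfway through.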
Andreas
--
Really, I'm not out to destroy Microsoft. That will just be a completely
unintentional side effect. (Linus Torvalds)
"If I was god, I would recompile penguin with --enable-fly." (unknow)
Kaufbach, Saxony, Germany, Europe. N 51.05082°, E 13.56889°