First off, I would like to take this opportunity to thank everyone who
worked on the new postgres v6.4. I have it up and running here, and
everything appears to be working well.
I do have two questions. First, I have one table that is updated from a
text file. This table also has a serial field that is incremented with
each new record. Is there a way to use the copy command and have that field
populated automatically - e.g. copy table_one from '/data/files/input.txt'
using delimiters '|' ?
So far, I can get this text file to import just fine into a table
without a serial field, but as of yet I have had no luck importing it into
a table that has a serial field.
I suppose that I could parse the file and do an "insert into etc.... "
for each row, but I fear that would be a little processor intensive for a
12,000 line file.
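In case it helps frame the question, the workaround I am considering is to
copy into a staging table and let the serial column's default fill itself
in afterward - roughly like this (table and column names are invented for
illustration, and I am assuming the serial column has its usual nextval()
default):

```sql
-- Hypothetical schema: table_one(id serial, col_a text, col_b text).
-- Staging table mirrors the text file's columns, with no serial field.
CREATE TABLE table_one_load (col_a text, col_b text);

-- Bulk-load the delimited file into the staging table.
COPY table_one_load FROM '/data/files/input.txt' USING DELIMITERS '|';

-- Omitting the serial column from the insert list lets its nextval()
-- default assign ids in a single pass, avoiding per-row inserts.
INSERT INTO table_one (col_a, col_b)
SELECT col_a, col_b FROM table_one_load;

DROP TABLE table_one_load;
```

If copy can be made to do this directly without the staging step, so much
the better.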
My second question is that I noticed the ODBC bug (feature?) when linking
Postgres to MS Access still exists. The bug occurs when linking an MS
Access table to a Postgres table and identifying more than one field as
the unique record identifier; this makes Postgres run until it exhausts
all available memory. Does anyone know a way around this? Read-only
ODBC access is a feature I would like to make available, but I do not want
to risk postgres crashing because of an error on the part of an
MS Access user.
BTW - having the capability to be linked to an Access database is not
optional. The current project I am working on calls for it, so it is a
necessary evil that I have to live with.
Thanks in advance for any assistance.
Regards - Bob Kruger