I have a situation where I am pulling CSV data from various sources
and loading it into a database after it is cleaned up. Currently I
am doing the bulk of the work outside the database in code, but I
think the work would go much faster if I imported the data into temp
tables using the COPY command and did the cleanup there.
The main reason I am not using COPY right now is that postgres will
not allow unprivileged users to issue COPY FROM filename. The only
way I could get around this would be to shell out to psql or
something, but I don't really want to do that.
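For reference, the psql workaround I am trying to avoid looks
roughly like this (untested; the staging table and file name are
made up, and \copy runs client-side so it does not need superuser
rights):

    -- load a CSV file into an existing staging table via psql
    \copy staging_feed FROM 'data.csv' WITH CSV HEADER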
The other reason I am not using COPY is because it is cumbersome to
create and maintain tables just for the import.
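To make that concrete, every new feed means hand-writing something
like the following (names made up) and keeping it in sync with the
source as the layout changes:

    -- throwaway all-text staging table, one per CSV layout
    CREATE TEMP TABLE staging_feed (col1 text, col2 text, col3 text);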
So I am looking for a solution along these lines:
1. COPY can read from a text field in a table, something like
   COPY FROM (SELECT text_field FROM table WHERE id = 2) AS text_data ...
   (see the first sketch after this list).
2. The COPY command creates a table after a cursory examination of
   the data: if the data has headers it uses those as field names;
   if it does not, it uses col1, col2, etc. Optionally, field names
   and types could be specified (see the second sketch after this
   list).
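For (1), the closest thing I can manage today is to split the stored
text myself in SQL. An untested sketch (raw_imports, text_field, and
the three-column layout are made up; it assumes PostgreSQL 8.3+ for
regexp_split_to_table, and it does not handle quoted commas or
embedded newlines):

    -- split the stored blob into lines, then each line into columns
    INSERT INTO staging_feed (col1, col2, col3)
    SELECT split_part(line, ',', 1),
           split_part(line, ',', 2),
           split_part(line, ',', 3)
    FROM (SELECT regexp_split_to_table(text_field, E'\n') AS line
          FROM raw_imports
          WHERE id = 2) AS t;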
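For (2), as far as I know no such COPY option exists today; the kind
of invocation I am imagining would look something like this (purely
hypothetical syntax, just to show the wish):

    -- wished-for: COPY infers the table definition from the file
    COPY FROM 'data.csv' CREATE TABLE staging_feed WITH CSV HEADER;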
Any suggestions?
Cheers.