On Thu, 2003-11-27 at 09:28, Stephen Frost wrote:
> * Bruno Wolff III (bruno@wolff.to) wrote:
> > On Thu, Nov 27, 2003 at 09:15:20 -0500,
> > Stephen Frost <sfrost@snowman.net> wrote:
> > > I don't believe it's possible, currently, to correctly import this
> > > data with copy. I'm not sure the date fields would even be accepted
> > > as date fields. It'd be nice if this could be made to work. From a
> > > user standpoint consider:
> >
> > You can write a filter program that reads the data and passes it off
> > to copy. Perl works pretty well for this.
>
> I already did, but it's basically a poor duplication of what the
> Postgres functions listed already do. Not what I'd consider the best
> scenario. Additionally, overall I'd expect it to be less work to have
> the conversion from text->data type done once and correctly, rather than
> having the data run through a filter program to 'clean it up' for Postgres
> and then also through functions in Postgres (casts at least) to convert it.
How about doing a COPY into a TEMP TABLE, 10k lines at a time, and then an
insert into real_table .... select .... from temp_table;
which converts the data?
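
Something along these lines, say (a rough, untested sketch; the table,
column, and file names and the MM/DD/YYYY date format are just placeholders
for whatever your data actually looks like):

CREATE TEMP TABLE staging (id text, event_date text, amount text);

-- load one chunk of the raw text data
\copy staging from 'chunk_aa'

-- do the text -> proper type conversion while moving the rows over
INSERT INTO real_table (id, event_date, amount)
    SELECT id::integer,
           to_date(event_date, 'MM/DD/YYYY'),
           amount::numeric
      FROM staging;

-- empty the staging table before the next 10k-line chunk
TRUNCATE staging;
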
You could of course parallelize the load so that 2 or 3 processes execute
the data import at the same time.
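
Roughly like this, perhaps (again untested; shell plus the same staging
trick as above, names are placeholders, and a real version would cap the
number of concurrent sessions at 2 or 3 instead of starting one per chunk):

# split the dump into 10k-line pieces: chunk_aa, chunk_ab, ...
split -l 10000 bigdump.txt chunk_

# one psql session per piece; each session gets its own TEMP table,
# so the parallel loads don't step on each other
for f in chunk_*
do
psql mydb <<EOF &
CREATE TEMP TABLE staging (id text, event_date text, amount text);
\copy staging from '$f'
INSERT INTO real_table (id, event_date, amount)
    SELECT id::integer,
           to_date(event_date, 'MM/DD/YYYY'),
           amount::numeric
      FROM staging;
EOF
done

# wait for all of the loads to finish
wait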