Typically what you do is pre-process your files a bit
so that the COPY command will be happier with them.
For example, in your case you clearly have some sort
of end-of-record string. Why not simply replace that
string with a newline and *then* feed the result to
the COPY command? Your filter program could be as
simple as the following one line of Perl:
cat foo | perl -pe 's/Jason/\n/g;'
This script takes the file foo and replaces all
occurrences of 'Jason' with a newline. Perl is an
excellent text-munging language. Heck, I am a Python
bigot, but Perl is so handy for this sort of stuff
that it really can't be overlooked.
The trick is to cook up a text filter that can then be
used with pipes from the command line, like so:
cat orig_file | perl_script | psql -d your_database
Feel free to email me if you have any specific
questions,
Jason Earl
--- Peter Tan <kaihua@cs.stanford.edu> wrote:
> When using the copy command to bulk load data, could I specify the
> string of "end of record", instead of using newline? If the answer
> is no, is there any alternative way? I have lots of text files which
> I want to use the copy command to load, but each row may contain
> several lines of text.
>
> Thank you
> Peter
>