COPY failure - Mailing list pgsql-general

From:    x
Subject: COPY failure
Date:
Msg-id:  5.1.0.14.0.20010721035801.00a84100@mail.fmaudio.net
Responses: Re: COPY failure  (Tom Lane <tgl@sss.pgh.pa.us>)
List:    pgsql-general
Hi.

I'm trying to run a script that executes multiple \copy commands to
import some 5 GB of data. All the input files are computer generated
and simple (5 numeric columns, with "\N" for NULL in some cases), use
the default delimiters, and _should_ be error-free. But I keep getting
error messages for _some_ of the \copy commands.

e.g.:
pqReadData() -- backend closed the channel unexpectedly.
         This probably means the backend terminated abnormally
         before or while processing the request.
PQendcopy: resetting connection
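
For reference, the script itself is just a long run of \copy commands,
one per input file, roughly like this (the table and file names here
are made up, not the real ones):

         \copy mytable from 'chunk_001.dat'
         \copy mytable from 'chunk_002.dat'
         \copy mytable from 'chunk_003.dat'

Each input file is plain tab-delimited text, i.e. the default \copy
format, as produced by the generator.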

Questions:

1. Are there any size restrictions on the input files?

2. How do I tell which file, or better yet which line, is tripping up
the system? I could cut the list in half repeatedly until I find the
problem, but that would be a huge waste of time given how long it takes
to import any of the data. I could set up more verbose logging on the
backend, but will that make a mess if my error is 2 GB into the import?
(A client-side alternative is sketched below, after question 3.)

3. I'm running the Windows/cygwin version of the psql client and a Linux
backend, if that makes a difference.
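
On question 2, the client-side alternative I mentioned would be to tag
each \copy with an \echo, so the psql output at least shows which file
was being loaded when the failure hits (same made-up names as in the
sketch above, and assuming \echo prints before the \copy runs):

         \echo loading chunk_001.dat
         \copy mytable from 'chunk_001.dat'
         \echo loading chunk_002.dat
         \copy mytable from 'chunk_002.dat'

That would narrow it down to a file, though not to a line within it.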

Any help would be much appreciated.

-Xavier




