Bill Brandt <brandtwr-pgsql@draaw.net> writes:
> I'm trying to upgrade from 6.3. In doing this, I performed a pg_dump of
> each database (pg_dump -o where the OIDs matter). I have one database
> (which doesn't use OID info in the tables, btw) that is around 40MB of
> data. The output file it created looks fine, but when I go to load the
> data, it gets a portion of the way into loading and then indicates that
> you cannot exceed 20000 lines of input on a COPY command. Is there a way
> to dump the database so that it won't try more than 20000 lines at a time?
You can just split the COPY command into sections by editing the dump file
(assuming you have an editor that will cope with a file that size ;-)):
COPY table FROM stdin;
blah blah
blah blah
\.
COPY table FROM stdin;
more more
more more
\.
COPY is really APPEND; it doesn't delete any tuples that were already
there, so this should work.
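
If the dump is too big to edit comfortably by hand, a small script can do
the same splitting mechanically.  What follows is only a rough sketch (in
Python; the name split_copy.py and the MAX_LINES setting are made up for
illustration, not anything pg_dump provides).  It assumes each data block
in the dump looks exactly like the example above: a "COPY ... FROM stdin;"
header, some data rows, then a line containing only "\.".

#!/usr/bin/env python
# split_copy.py -- split oversized COPY blocks in a pg_dump output file
# into chunks of at most MAX_LINES data rows each.
import sys

MAX_LINES = 10000   # stay comfortably under the reported 20000-line limit

def split_dump(infile, outfile):
    header = None   # the current "COPY ... FROM stdin;" line, or None
    count = 0       # data rows written in the current chunk
    for line in infile:
        if header is None:
            outfile.write(line)
            # crude detection of the start of a COPY data block
            if line.startswith("COPY ") and line.rstrip().endswith("FROM stdin;"):
                header = line
                count = 0
        elif line.strip() == "\\.":
            outfile.write(line)          # end of this table's data
            header = None
        else:
            if count >= MAX_LINES:
                outfile.write("\\.\n")   # close the current chunk ...
                outfile.write(header)    # ... and reopen the same table
                count = 0
            outfile.write(line)
            count += 1

if __name__ == "__main__":
    split_dump(sys.stdin, sys.stdout)

You'd run it as something like "python split_copy.py < olddump > newdump"
and then load the rewritten file as usual.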
Still, a 20K-line limit sounds awfully brain-dead to me. And a quick
look at the source code (both 6.4.2 and current CVS) doesn't show any
such check unless I missed something. What version are you using?
What's the exact text of the error message?
regards, tom lane