vacuumdb failed - Mailing list pgsql-general

From: George Robinson II
Subject: vacuumdb failed
Msg-id: 39A6BF17.9D60C9E@eurekabroadband.com
Responses: Re: vacuumdb failed  (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-general
    Last night, while my Perl script was doing a huge insert operation, I
got this error...

DBD::Pg::st execute failed: ERROR:  copy: line 4857, pg_atoi: error
reading "2244904358": Result too large

    Now, I'm not sure if this is related, but while trying to run vacuumdb
<dbname>, I got...

NOTICE:  FlushRelationBuffers(all_flows, 500237): block 171439 is
referenced (private 0, global 1)
FATAL 1:  VACUUM (vc_repair_frag): FlushRelationBuffers returned -2
pqReadData() -- backend closed the channel unexpectedly.
        This probably means the backend terminated abnormally
        before or while processing the request.
connection to server was lost
vacuumdb: vacuum failed

    Any ideas?  I'm trying a couple of other things right now.  By the
way, this database has one table that is HUGE.  What is the limit on
table size in PostgreSQL 7?  The FAQ says unlimited.  If that's true, how
do you get around the 2 GB file size limit that (at least) I have on
Solaris 2.6?

Thank you.

-g2
