Re: URGENT: pg_dump error - Mailing list pgsql-general

From: Dmitry Tkach
Subject: Re: URGENT: pg_dump error
Msg-id: 3E49482C.6050002@openratings.com
In response to: URGENT: pg_dump error (jerome <jerome@gmanmi.tv>)
List: pgsql-general
I suspect your problem is that the output file is too large (if you are on ext2, you cannot have files larger than 2 GB in the filesystem).
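A quick way to check (assuming GNU coreutils; 'sample' here stands for the output file from your failed run): if the 2 GB limit is what you hit, the partial dump will typically be stuck right at 2^31-1 bytes:

ls -l sample        # stuck at 2147483647 bytes => you hit the 2 GB limit
df -T .             # prints the filesystem type (look for ext2)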
Try this:
pg_dump mydatabase -t mytable | gzip -f > sample.gz
or
pg_dump mydatabase -t mytable | split -C 2000m - sample.
or even
pg_dump mydatabase -t mytable | gzip -f | split -b 2000m - sample.gz.
...

The first case should work unless even the compressed file is larger than 2 GB; either of the other two will work regardless of the output size (as long as it fits on your disk, of course). Note that split -C breaks the stream at line boundaries, which suits the plain-text dump, while split -b cuts at exact byte counts, which the binary gzip stream needs.
In the two last cases, it will create several files, named sample.aa, sample.ab, ... or sample.gz.aa, sample.gz.ab, etc.
To 'reassemble' them later, you'll need something like:

cat sample.* | psql mydatabase                   # for the split-without-gzip case
cat sample.gz.* | gunzip -f | psql mydatabase    # for the split-with-gzip case

(cat sample.* works because the shell expands the glob in sorted order, so the .aa, .ab, ... pieces come back in the right sequence.)
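And for the first case (gzip only, single file), there is nothing to reassemble; just decompress straight into psql (gunzip -c writes the decompressed stream to stdout):

gunzip -c sample.gz | psql mydatabase            # for the plain gzip case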

I hope it helps...

Dima

jerome wrote:
> i tried to do pg_dump
>
> pg_dump mydatabase -t mytable > sample
>
> it always results in
>
> KILLED
>
> can anyone tell me what should i do...
>
> TIA

