Re: inserting huge file into bytea cause out of memory - Mailing list pgsql-general

From Albe Laurenz
Subject Re: inserting huge file into bytea cause out of memory
Msg-id A737B7A37273E048B164557ADEF4A58B17BF47B3@ntex2010a.host.magwien.gv.at
In response to Re: inserting huge file into bytea cause out of memory  (liuyuanyuan <liuyuanyuan@highgo.com.cn>)
Responses Re: inserting huge file into bytea cause out of memory  (Michael Paquier <michael.paquier@gmail.com>)
List pgsql-general
liuyuanyuan wrote:
> By the way, my project is about migrating Oracle data of BLOB type to
> a PostgreSQL database. The out of memory error occurred while migrating
> Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate BLOB to bytea,
> how about the oid type?

Large objects (I guess that's what you mean by "oid" here)
might be the better choice for you, particularly since you
are running into out of memory problems.

While a bytea value is always written in one piece (so the whole
value has to fit in memory at once), you can stream large objects
by reading and writing them in smaller chunks.
Moreover, large objects have a larger size limit than
bytea's 1GB.
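
Just as a rough sketch (assuming you load the data through the
PostgreSQL JDBC driver; the connection details, file name and buffer
size below are made up), writing a file as a large object in 8 kB
chunks could look like this:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;

    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObject;
    import org.postgresql.largeobject.LargeObjectManager;

    public class LoUpload {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "password");
            // the large object API only works inside a transaction
            conn.setAutoCommit(false);

            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();

            // create a new, empty large object and remember its OID
            long oid = lom.createLO(LargeObjectManager.READ | LargeObjectManager.WRITE);

            LargeObject lo = lom.open(oid, LargeObjectManager.WRITE);
            InputStream in = new FileInputStream("huge_file.bin");
            byte[] buf = new byte[8192];   // only 8 kB in memory at a time
            int n;
            while ((n = in.read(buf)) > 0) {
                lo.write(buf, 0, n);
            }
            in.close();
            lo.close();

            // store "oid" in a column of type oid in your table,
            // then commit the transaction
            conn.commit();
            conn.close();
        }
    }

The point is that only one buffer's worth of data is in memory at any
time, instead of the whole file.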

The downside is that the API is slightly more complicated,
and you'll have to take care that the large object gets
deleted when you remove the last reference to it from your
database.
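
Just a sketch of that cleanup (the table and column names "images" and
"blob_oid" are invented): fetch the OID from the row before you delete
it and unlink the large object in the same transaction:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObjectManager;

    public class LoCleanup {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "password");
            conn.setAutoCommit(false);
            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();

            // fetch the OID from the row before deleting it
            PreparedStatement sel =
                    conn.prepareStatement("SELECT blob_oid FROM images WHERE id = ?");
            sel.setInt(1, 42);
            ResultSet rs = sel.executeQuery();
            if (rs.next()) {
                lom.delete(rs.getLong(1));   // same effect as SELECT lo_unlink(oid)
            }
            rs.close();
            sel.close();

            PreparedStatement del =
                    conn.prepareStatement("DELETE FROM images WHERE id = ?");
            del.setInt(1, 42);
            del.executeUpdate();
            del.close();

            conn.commit();
            conn.close();
        }
    }

There is also the contrib module "lo", whose lo_manage trigger can
unlink the large object automatically when the referencing row is
deleted or updated.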

Yours,
Laurenz Albe
