Re: inserting huge file into bytea cause out of memory - Mailing list pgsql-general

From liuyuanyuan
Subject Re: inserting huge file into bytea cause out of memory
Date
Msg-id 201308080941247907258@highgo.com.cn
In response to inserting huge file into bytea cause out of memory  (liuyuanyuan <liuyuanyuan@highgo.com.cn>)
Responses Re: inserting huge file into bytea cause out of memory  (Chris Travers <chris.travers@gmail.com>)
List pgsql-general


liuyuanyuan
 
Date: 2013-08-07 15:26
Subject: Re: [GENERAL] inserting huge file into bytea cause out of memory
On Wed, Aug 7, 2013 at 3:56 PM, Albe Laurenz <laurenz.albe@wien.gv.at> wrote:
> liuyuanyuan wrote:
>>> By the way, my project is about migrating Oracle data of BLOB type to
>>> PostgreSQL database. The out of memory  error occurred  between migrating
>>> Oracle BLOB to PostgreSQL bytea. Another question, if I can't migrate BLOB to bytea,
>>> how about oid type ?
>
      Laurenz Albe wrote:
>> Large Objects (I guess that's what you mean with "oid" here)
>> might be the better choice for you, particularly since you
>> have out of memory problems.
   
      Michael wrote:
> Take care that the limit of large objects is 2GB in Postgres 9.2 or
> lower (with the default block size). By the way, you will be fine in
> the case of your application. It is also worth noting that the limit
> is increased to 4TB in 9.3.
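
      For illustration, here is a rough sketch of the Large Object route suggested
above, using the JDBC driver's org.postgresql.largeobject API. This is only a sketch:
a recent pgjdbc is assumed, and the table images(name text, data oid), the connection
settings, and the file path are hypothetical.

import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

public class LargeObjectInsert {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]);  // file to load; path is hypothetical
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/test", "user", "password")) {
            conn.setAutoCommit(false);  // the Large Object API must run inside a transaction

            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();
            long oid = lom.createLO(LargeObjectManager.READ | LargeObjectManager.WRITE);

            // Copy the file into the large object in small chunks, so only one
            // buffer's worth of data is held in client memory at a time.
            LargeObject lo = lom.open(oid, LargeObjectManager.WRITE);
            try (FileInputStream fis = new FileInputStream(file)) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = fis.read(buf)) > 0) {
                    lo.write(buf, 0, n);
                }
            } finally {
                lo.close();
            }

            // The table row stores only the oid; the data itself lives in pg_largeobject.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO images (name, data) VALUES (?, ?)")) {
                ps.setString(1, file.getName());
                ps.setLong(2, oid);
                ps.executeUpdate();
            }
            conn.commit();
        }
    }
}

      Because the write loop never holds more than one small buffer of the file,
client memory use stays flat regardless of the file size, which matches the better
behaviour reported below for the oid type.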
     
      Thanks for your last reply!
      I've tested Large Objects (the oid type), and they do seem to behave better with
regard to out of memory. But is there really no way to solve the out-of-memory
problem with bytea? Why can't it be solved? Is this a problem with JDBC, or with the
type itself?
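
      For comparison, here is a rough sketch of the bytea insert being discussed,
streaming the file with setBinaryStream() instead of building one big byte[]. The
table images(name text, data bytea) and the connection settings are hypothetical,
and even with streaming the value still travels as a single bytea parameter, so
this can at best reduce client-side buffering.

import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ByteaInsert {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]);  // file to load; path is hypothetical
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/test", "user", "password");
             FileInputStream fis = new FileInputStream(file);
             PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO images (name, data) VALUES (?, ?)")) {
            ps.setString(1, file.getName());
            // Stream the file rather than calling setBytes() on a byte[] holding
            // the whole file; the driver still sends one bytea parameter.
            ps.setBinaryStream(2, fis, (int) file.length());
            ps.executeUpdate();
        }
    }
}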
   
Yours,
Liu Yuanyuan
