Thread: inserting huge file into bytea causes out of memory

inserting huge file into bytea causes out of memory

From
liuyuanyuan
Date:
Hello everyone!
I got an out-of-memory error when I tried to insert a binary file (256 MB)
into a bytea column.
I am looking for a way to insert files (varying from 1 byte to 2 GB), byte
arrays, or binary streams into a PostgreSQL bytea field without ever
running out of memory. The details follow.
If anybody knows about this, please write to me.
Thanks in advance!
 
Table definition:
    create table image_bytea(t_id int, t_image bytea);
Major code:
    String sql = "insert into image_bytea(t_id, t_image) values (?, ?)";
    ps = conn.prepareStatement(sql);
    ps.setInt(1, 88);
    File file = new File("d:/1.jpg");
    InputStream in = new BufferedInputStream(new FileInputStream(file));
    ps.setBinaryStream(2, in, (int) file.length());
    System.out.println("set");
    ps.executeUpdate();
Error detail:
org.postgresql.util.PSQLException: Error: out of memory
  Details:Failed on request of size 268443660.
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2157)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1886)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:555)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:417)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:410)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.executeInsert(MigrationController.java:1400)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.insertDate2HG(MigrationController.java:1143)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.migrateTable(MigrationController.java:898)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.migrate(MigrationController.java:301)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.access$000(MigrationController.java:66)
at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController$MigrateRunnable.run(MigrationController.java:241)
at java.lang.Thread.run(Thread.java:662)


Liu Yuanyuan
August 6, 2013

Re: inserting huge file into bytea causes out of memory

From
sachin kotwal
Date:
> I got an out-of-memory error when I tried to insert a binary file (256 MB)
> into a bytea column.
> I am looking for a way to insert files (varying from 1 byte to 2 GB), byte
> arrays, or binary streams into a PostgreSQL bytea field without ever
> running out of memory. The details follow.
> If anybody knows about this, please write to me.
> Thanks in advance!

The maximum data size that can be stored in the BYTEA data type is 1 GB,
so you can only store values smaller than 1 GB.

When you insert, the system is requesting more memory than you expect,
as shown by this message:
"Details:Failed on request of size 268443660"
so please check whether anything is wrong elsewhere in your code.

Also check the free memory of your system at the time of the insertion.


http://www.microolap.com/products/connectivity/postgresdac/help/TipsAndTricks/ByteaVsOid.htm
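
As a rough illustration of that limit (a hypothetical helper, not from the
original post):

    import java.io.File;

    class ByteaSizeGuard {
        // bytea values are capped at 1 GB.
        static final long ONE_GB = 1024L * 1024L * 1024L;

        // Hypothetical guard: even below the cap, an insert can transiently
        // need several times the file size in memory (see the replies below),
        // so very large files are better routed to Large Objects instead.
        static boolean fitsInBytea(File file) {
            return file.length() < ONE_GB;
        }
    }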

-----
Thanks and Regards,

Sachin Kotwal
NTT-DATA-OSS Center (Pune)


Re: inserting huge file into bytea causes out of memory

From
"Tomas Vondra"
Date:
Hi,

On 6 August 2013, 9:12, liuyuanyuan wrote:
> Error detail:
> org.postgresql.util.PSQLException: Error: out of memory
>   Details:Failed on request of size 268443660.

Seems like an issue with the OS, not with PostgreSQL, to me.

What OS and HW are you using? How much memory do you have, and do you have
user limits in place? For example, Linux uses ulimit and some kernel
parameters to limit how much memory a process can allocate.

Try this as postgres (or the user you run the database as):

   ulimit -v

and as root

   sysctl vm.overcommit_memory

These are the options that bite me most often. What limits do you have in
place?

Tomas



Re: inserting huge file into bytea causes out of memory

From
Chris Travers
Date:


On Tue, Aug 6, 2013 at 7:04 AM, Tomas Vondra <tv@fuzzy.cz> wrote:
Hi,

On 6 August 2013, 9:12, liuyuanyuan wrote:
> Error detail:
> org.postgresql.util.PSQLException: Error: out of memory
>   Details:Failed on request of size 268443660.

Seems like an issue with the OS, not with PostgreSQL, to me.

What OS and HW are you using? How much memory do you have, and do you have
user limits in place? For example, Linux uses ulimit and some kernel
parameters to limit how much memory a process can allocate.

I have noticed a number of bytea/memory issues.  This looks like Java, which I am less familiar with, but a few things occur to me.  There are some things that make me relatively suspicious of using byteas where the file size is big (lobs are more graceful in those areas, IMO, because you can do seeking and chunking).

On the client side, a lot of the difficulties tend to have to do with escaping and unescaping.  While I have not done a lot with Java in this area, I have found that Perl drivers sometimes use up to 10x the memory the file would take up in binary format just to process it.  I suspect this has to do with copying the data, escaping it, and passing it on through.  For small files this is not an issue, but if you are passing 2 GB of data in, you had better have a LOT of memory.  I wouldn't be surprised if it were similar in Java.

Now, if the front end and back end are on the same server, front-end memory usage is going to count against you too.  Consequently, you are going to have at least the following copies in memory:

1.  The file in binary form
2.  The file in escaped form
3.  The file in escaped form on the back-end
4.  The file in binary form on the back-end.

If hex escaping effectively doubles the size, that gives you 6x the memory just for that data (for the 256 MB file here, on the order of 1.5 GB).  If it is getting copied elsewhere for intermediary usage, it could be significantly more.

So I would actually start by looking at memory utilization on your machine (front-end and back-end processes, if on the same machine!) and see what is going on.
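
For illustration, here is a minimal sketch of that chunked large-object
approach using PgJDBC's LargeObjectManager.  The image_lo table, helper
name, and chunk size are invented for the example, and API details vary a
little across driver versions (newer drivers deprecate create() in favor
of createLO()):

    import java.io.BufferedInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObject;
    import org.postgresql.largeobject.LargeObjectManager;

    class LargeObjectInsert {
        // Stream a file into a Large Object in fixed-size chunks, so only a
        // small buffer is in client memory at a time.  Assumes a table:
        //     create table image_lo(t_id int, t_image oid);
        static void insertAsLargeObject(Connection conn, int id, File file)
                throws Exception {
            conn.setAutoCommit(false);  // the LO API must run in a transaction
            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();

            // create() returns the oid of a new, empty large object.
            int oid = lom.create(LargeObjectManager.READ | LargeObjectManager.WRITE);
            LargeObject obj = lom.open(oid, LargeObjectManager.WRITE);

            InputStream in = new BufferedInputStream(new FileInputStream(file));
            byte[] buf = new byte[64 * 1024];  // 64 kB chunks
            int n;
            while ((n = in.read(buf)) > 0) {
                obj.write(buf, 0, n);
            }
            in.close();
            obj.close();

            // Store only the oid in the row; the data lives in pg_largeobject.
            PreparedStatement ps = conn.prepareStatement(
                    "insert into image_lo(t_id, t_image) values (?, ?)");
            ps.setInt(1, id);
            ps.setInt(2, oid);
            ps.executeUpdate();
            conn.commit();
        }
    }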

--
Best Wishes,
Chris Travers

Efficito:  Hosted Accounting and ERP.  Robust and Flexible.  No vendor lock-in.

Re: inserting huge file into bytea causes out of memory

From
liuyuanyuan
Date:

Hi, Tomas Vondra!
Thanks for all of your help!

My test PC runs Windows 7 (64-bit) and is equipped with 8 GB of memory.
In this Java project, I configured the VM options as:
     -Djava.security.policy=applet.policy -Xms1280m -Xmx1536m

If there is anything else you want to know, just write to me!
Best regards!
 
Liu Yuanyuan
Aug 7, 2013
 

Re: inserting huge file into bytea causes out of memory

From
liuyuanyuan
Date:

Hi!
Thanks for all of your interest!
My test PC runs Windows 7 (64-bit) and is equipped with 8 GB of memory.
In this Java project, I configured the VM options as:
     -Djava.security.policy=applet.policy -Xms1280m -Xmx1536m
Everything the project needs is on the machine described above.

By the way, my project is about migrating Oracle data of the BLOB type to a
PostgreSQL database. The out-of-memory error occurred while migrating
Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate BLOB
to bytea, how about the oid type?

If anybody knows about this problem, please write to me.
Thanks in advance!

Liu Yuanyuan
Aug 7, 2013


Re: inserting huge file into bytea causes out of memory

From
Albe Laurenz
Date:
liuyuanyuan wrote:
> By the way, my project is about migrating Oracle data of the BLOB type to a
> PostgreSQL database. The out-of-memory error occurred while migrating
> Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate BLOB
> to bytea, how about the oid type?

Large Objects (I guess that's what you mean by "oid" here)
might be the better choice for you, particularly since you
are having out-of-memory problems.

While a bytea value is always written in one piece, you can stream
large objects by reading and writing them in smaller chunks.
Moreover, large objects have a bigger size limit than
the 1GB of bytea.

The downside is that the API is slightly more complicated,
and you'll have to take care that the large object gets
deleted when you remove the last reference to it from your
database.
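
For illustration, a minimal sketch of that cleanup on the JDBC side,
reusing the invented image_lo table from the earlier sketch (alternatively,
the contrib "lo" module's lo_manage trigger can handle this on the server):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObjectManager;

    class LargeObjectCleanup {
        // Remove the row and unlink its large object in one transaction,
        // so the object cannot be orphaned halfway.
        // Assumes: create table image_lo(t_id int, t_image oid);
        static void deleteImage(Connection conn, int id) throws Exception {
            conn.setAutoCommit(false);
            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();

            PreparedStatement sel = conn.prepareStatement(
                    "select t_image from image_lo where t_id = ?");
            sel.setInt(1, id);
            ResultSet rs = sel.executeQuery();
            if (rs.next()) {
                int oid = rs.getInt(1);

                PreparedStatement del = conn.prepareStatement(
                        "delete from image_lo where t_id = ?");
                del.setInt(1, id);
                del.executeUpdate();

                lom.delete(oid);  // removes the object's data itself
            }
            conn.commit();
        }
    }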

Yours,
Laurenz Albe

Re: inserting huge file into bytea causes out of memory

From
Michael Paquier
Date:
On Wed, Aug 7, 2013 at 3:56 PM, Albe Laurenz <laurenz.albe@wien.gv.at> wrote:
> liuyuanyuan wrote:
>> By the way, my project is about migrating Oracle data of the BLOB type to a
>> PostgreSQL database. The out-of-memory error occurred while migrating
>> Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate BLOB
>> to bytea, how about the oid type?
>
> Large Objects (I guess that's what you mean by "oid" here)
> might be the better choice for you, particularly since you
> are having out-of-memory problems.
Take care that the limit on large objects is 2GB in Postgres 9.2 or
lower (with the default block size). By the way, you will be fine in the
case of your application. It is also worth noting that the limit is
increased to 4TB in 9.3.
--
Michael


Re: inserting huge file into bytea causes out of memory

From
liuyuanyuan
Date:

 
 

liuyuanyuan wrote:
> By the way, my project is about migrating Oracle data of the BLOB type to a
> PostgreSQL database. The out-of-memory error occurred while migrating
> Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate BLOB
> to bytea, how about the oid type?

Laurenz Albe wrote:
> Large Objects (I guess that's what you mean by "oid" here)
> might be the better choice for you, particularly since you
> are having out-of-memory problems.

Michael Paquier wrote:
> Take care that the limit on large objects is 2GB in Postgres 9.2 or
> lower (with the default block size). By the way, you will be fine in the
> case of your application. It is also worth noting that the limit is
> increased to 4TB in 9.3.
     
Thanks for your last reply!
I've tested Large Objects (the oid type), and they seem better with respect
to out of memory. But for the out-of-memory problem with bytea, is there
really no way to solve it? Why is there no way? Is it a problem of JDBC, or
of the type itself?
   
Yours,
Liu Yuanyuan

Re: inserting huge file into bytea causes out of memory

From
Chris Travers
Date:


On Wed, Aug 7, 2013 at 6:41 PM, liuyuanyuan <liuyuanyuan@highgo.com.cn> wrote:

> Thanks for your last reply!
> I've tested Large Objects (the oid type), and they seem better with respect
> to out of memory. But for the out-of-memory problem with bytea, is there
> really no way to solve it? Why is there no way? Is it a problem of JDBC, or
> of the type itself?

I think the big difficulty efficiency-wise is the fact that everything is exchanged in a textual representation.  This means you likely have at least two representations in memory on the client and on the server, and maybe more depending on the client framework, and the textual representation is around twice as large as the binary one.  Add to this the fact that it must all be handled at once, and you have difficulties that are inherent to the implementation.  In general, I do not recommend byteas for large amounts of binary data for that reason.  If your files are big, use lobs.
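
For contrast, a minimal sketch of reading such a large object back in
chunks, again using the invented image_lo table from the earlier sketches
(a bytea SELECT would instead materialize, and unescape, the whole value
at once):

    import java.io.OutputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObject;
    import org.postgresql.largeobject.LargeObjectManager;

    class LargeObjectRead {
        // Stream a stored large object out in fixed-size chunks.
        static void readImage(Connection conn, int id, OutputStream out)
                throws Exception {
            conn.setAutoCommit(false);  // LO reads also need a transaction
            LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();

            PreparedStatement sel = conn.prepareStatement(
                    "select t_image from image_lo where t_id = ?");
            sel.setInt(1, id);
            ResultSet rs = sel.executeQuery();
            if (rs.next()) {
                LargeObject obj = lom.open(rs.getInt(1), LargeObjectManager.READ);
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = obj.read(buf, 0, buf.length)) > 0) {
                    out.write(buf, 0, n);
                }
                obj.close();
            }
            conn.commit();
        }
    }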

--
Best Wishes,
Chris Travers

Efficito:  Hosted Accounting and ERP.  Robust and Flexible.  No vendor lock-in.