memory issues with BYTEAs in JSON and hstore - Mailing list pgsql-general

From: Felix Kunde
Subject: memory issues with BYTEAs in JSON and hstore
Date:
Msg-id: trinity-21eb4411-22bb-48d5-ac9f-76b4d3ae751c-1401189572913@3capp-gmx-bs55
List: pgsql-general
Hey,
I've developed a database auditing/versioning tool using the JSON data type (http://github.com/fxku/audit) and am doing some tests now.
Unfortunately I'm facing some problems when dealing with tables that store images as BYTEA. Some of them are around 15 MB big.

My tool logs changes to JSON and can populate them back into views with json_populate_recordset and json_agg. When
performing this procedure on tables containing BYTEAs I receive a 54000 error ("Cannot enlarge string buffer"). Is this
because of json_agg or because of single rows?
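
For reference, the reconstruction looks roughly like this (audit_log, changes and my_images are simplified placeholder
names here, not the exact schema of my tool):

-- rebuild the rows of a logged table from its JSON log entries
SELECT p.*
FROM (
  SELECT json_agg(a.changes) AS log    -- aggregate the logged JSON rows into one array
  FROM audit_log a
  WHERE a.table_name = 'my_images'
) s,
json_populate_recordset(NULL::my_images, s.log) AS p;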

Executing to_json on the whole column that contains the binary data (size above 500 MB) led to out-of-memory errors.
The same goes for hstore. Executing these functions only on the biggest image was successful but froze my pgAdmin.
When I encoded the BYTEA to TEXT before transforming it to JSON or hstore it worked, but json_populate_recordset
still runs into memory problems (EXPLAIN worked, though).
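
By "encoded BYTEA to TEXT" I mean something along these lines (my_images and image_data are again just example names):

-- build the JSON from a text-encoded copy of the binary column
SELECT to_json(t)
FROM (
  SELECT id, encode(image_data, 'base64') AS image_data
  FROM my_images
) t;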

Do you think JSONB will solve my problems in the future?

Here is also a comparison in size between the BYTEA column (and its versions encoded to TEXT) and the JSON / hstore
output, which I found kinda interesting:

operation | bytea | bytea->'escape' | bytea->'hex' | bytea->'base64'
----------|-------|-----------------|--------------|----------------
(raw)     | 15 MB | 40 MB           | 31 MB        | 21 MB
to_json() | 57 MB | 57 MB           | 31 MB        | 21 MB
hstore()  | 46 MB | 40 MB           | 31 MB        | 21 MB
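
(Numbers like these can be checked with octet_length, roughly like this - again with placeholder names:)

-- compare the size of the raw bytea with its base64-encoded and JSON forms
SELECT pg_size_pretty(sum(octet_length(image_data)))                                  AS bytea_raw,
       pg_size_pretty(sum(octet_length(encode(image_data, 'base64'))))                AS bytea_base64,
       pg_size_pretty(sum(octet_length(to_json(encode(image_data, 'base64'))::text))) AS json_base64
FROM my_images;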

Thanks in advance for any hints.
Cheers,
Felix

