Hey,
I've developed a database auditing/versioning tool using the JSON data type (http://github.com/fxku/audit) and am now running some tests.
Unfortunately I'm facing some problems when dealing with tables that store images as BYTEA. Some of them are around 15 MB big.
My tool logs changes to JSON and can populate them back into views with json_populate_recordset and json_agg. When performing this procedure on tables containing BYTEAs I receive a 54000 error ("Cannot enlarge string buffer"). Is this caused by json_agg or by single rows?
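A simplified sketch of the round trip in question (table and column names are made up for illustration, not taken from my tool):

    CREATE TABLE images (id int, img bytea);

    -- serialize all rows to one JSON array, then expand it back into rows;
    -- with ~15 MB bytea values this is where the 54000 error shows up
    SELECT * FROM json_populate_recordset(
      null::images,
      (SELECT json_agg(row_to_json(i)) FROM images i)
    );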
Executing to_json on the whole column that contains the binary data (more than 500 MB in total) leads to out-of-memory errors. The same goes for hstore. Executing these functions on only the biggest image was successful, but it froze my pgAdmin.
When I encoded the BYTEA to TEXT before transforming it to JSON or hstore, it worked. But json_populate_recordset still runs into memory problems (though EXPLAIN worked).
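The workaround looks roughly like this (again using the hypothetical images table from above):

    -- encode the bytea column to text first, so to_json serializes
    -- plain text instead of binary data
    SELECT to_json(t)
    FROM (SELECT id, encode(img, 'base64') AS img FROM images) t;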
Do you think JSONB will solve my problems in the future?
Here is also a size comparison between the BYTEA (and its versions encoded to TEXT) and the JSON / hstore output, which I found kind of interesting:
operation | bytea | bytea->'escape' | bytea->'hex' | bytea->'base64'
----------|-------|-----------------|--------------|----------------
raw size  | 15 MB | 40 MB           | 31 MB        | 21 MB
to_json() | 57 MB | 57 MB           | 31 MB        | 21 MB
hstore()  | 46 MB | 40 MB           | 31 MB        | 21 MB
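(Sizes like these can be measured with octet_length; a sketch for the base64 case, not necessarily the exact queries behind the table:)

    SELECT pg_size_pretty(sum(octet_length(encode(img, 'base64')))) AS encoded_size,
           pg_size_pretty(sum(octet_length(to_json(encode(img, 'base64'))::text))) AS json_size
    FROM images;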
Thanks in advance for any hints.
Cheers,
Felix