Thread: Storing large files in multiple schemas: BLOB or BYTEA

Storing large files in multiple schemas: BLOB or BYTEA

From
tigran2-postgres@riatest.com
Date:
Oct 10, 2012

Hi,

 

I need to store large files (from several MB up to 1 GB) in a Postgres database. The database has multiple schemas. Postgres appears to offer two options for storing large binary data: large objects (LOB) and BYTEA. However, we run into problems with each of these options.

 

1. LOB. This works almost ideally: it can store up to 2GB per object and supports streaming, so we do not hit memory limits in our PHP backend when reading the LOB (see the sketch below). However, all large objects are stored in pg_catalog and are not part of any schema. This leads to a big problem when you try to use pg_dump with the -n and -b options to dump a single schema with its blobs: it dumps the schema data correctly, but then includes ALL the blobs in the database, not just the blobs that belong to that particular schema.

Is there a way to dump a single schema together with its blobs using pg_dump or some other utility?
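
For reference, the streaming access described in point 1 looks roughly like this in PHP. This is only a sketch: the "files" table and its "data_oid" column are hypothetical, and the pg_lo_* calls must run inside a transaction.

<?php
// Sketch: stream a large object in fixed-size chunks so PHP never
// holds the whole file in memory. The "files" table and its
// "data_oid" column are hypothetical.
$conn = pg_connect('dbname=mydb');

$res = pg_query_params($conn, 'SELECT data_oid FROM files WHERE id = $1', [42]);
$oid = pg_fetch_result($res, 0, 0);

pg_query($conn, 'BEGIN'); // the large object API requires a transaction
$lo = pg_lo_open($conn, $oid, 'r');
while (($chunk = pg_lo_read($lo, 8192)) !== false && $chunk !== '') {
    echo $chunk; // send 8 KB at a time to the client
}
pg_lo_close($lo);
pg_query($conn, 'COMMIT');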

 

2. BYTEA. These are correctly stored per schema, so pg_dump -n works correctly; however, I cannot seem to find a way to stream the data. This means there is no way to access the data from PHP if it is larger than the memory limit.
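
To illustrate the problem: a plain SELECT buffers the entire column value in PHP (again using a hypothetical "files" table).

<?php
// Naive BYTEA read: pg_fetch_result() returns the entire column
// value, so a file bigger than memory_limit cannot be read this way.
// The "files" table is hypothetical.
$conn = pg_connect('dbname=mydb');
$res  = pg_query_params($conn, 'SELECT data FROM files WHERE id = $1', [42]);
$data = pg_unescape_bytea(pg_fetch_result($res, 0, 0)); // whole file in RAM
echo $data;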

 

Is there any other way to store large data in Postgres that allows streaming and works correctly with multiple schemas per database?

 

Thanks.

Re: Storing large files in multiple schemas: BLOB or BYTEA

From
John DeSoi
Date:
On Oct 10, 2012, at 6:12 AM, tigran2-postgres@riatest.com wrote:

> 2. BYTEA. These are correctly stored per schema, so pg_dump -n works correctly; however, I cannot seem to find a way to
> stream the data. This means there is no way to access the data from PHP if it is larger than the memory limit.

You can get the octet length and then use the substring function to grab large columns in chunks. See

http://www.postgresql.org/docs/current/interactive/functions-binarystring.html
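
A chunked read along those lines might look like this in PHP. This is a sketch only: the "files" table is hypothetical, substring on bytea counts from 1, and the decoding assumes pg_unescape_bytea understands the server's bytea output format.

<?php
// Sketch: read a BYTEA column in 1 MB slices using octet_length()
// and substring(), so only one slice is in PHP memory at a time.
// The "files" table is hypothetical.
$conn  = pg_connect('dbname=mydb');
$chunk = 1024 * 1024; // 1 MB per round trip

$res = pg_query_params($conn, 'SELECT octet_length(data) FROM files WHERE id = $1', [42]);
$len = (int) pg_fetch_result($res, 0, 0);

for ($off = 0; $off < $len; $off += $chunk) {
    $res = pg_query_params($conn,
        'SELECT substring(data FROM $1 FOR $2) FROM files WHERE id = $3',
        [$off + 1, $chunk, 42]); // bytea substring offsets are 1-based
    echo pg_unescape_bytea(pg_fetch_result($res, 0, 0)); // one slice at a time
}

Each slice still costs a full round trip to the server, but memory use stays bounded by the chunk size.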


John DeSoi, Ph.D.


