Storing large files in multiple schemas: BLOB or BYTEA - Mailing list pgsql-general

Subject Storing large files in multiple schemas: BLOB or BYTEA
Msg-id 007201cda6d0$61813b00$2483b100$@riatest.com
Responses Re: Storing large files in multiple schemas: BLOB or BYTEA  (Shaun Thomas <sthomas@optionshouse.com>)
Re: Storing large files in multiple schemas: BLOB or BYTEA  ("Albe Laurenz" <laurenz.albe@wien.gv.at>)
List pgsql-general

Hi,

I need to store large files (from several MB up to 1 GB) in a Postgres database. The database has multiple schemas. Postgres appears to offer two options for storing large objects: LOB and BYTEA. However, we have hit problems with each of them.

1. LOB. This works almost ideally: it can store up to 2 GB per object and supports streaming, so we do not hit memory limits in our PHP backend when reading the LOB. However, all large objects are stored in pg_catalog and are not part of any schema. This becomes a big problem when you try to use pg_dump with the -n and -b options to dump just one schema with its blobs: the schema data is dumped correctly, but then ALL blobs in the database are included, not just the blobs that belong to that particular schema.

Is there a way to dump a single schema together with its blobs, using pg_dump or some other utility?
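For reference, this is roughly how we stream a LOB from PHP today (a minimal sketch; the table myschema.files(id, data_oid) and the connection settings are made up for illustration):

<?php
// Sketch only: assumes a hypothetical table myschema.files(id int, data_oid oid)
// holding the OID of each large object.
$conn = pg_connect('dbname=mydb');
pg_query($conn, 'BEGIN');  // large-object calls must run inside a transaction
$res = pg_query_params($conn,
    'SELECT data_oid FROM myschema.files WHERE id = $1', array(1));
$oid = (int) pg_fetch_result($res, 0, 0);

$lo = pg_lo_open($conn, $oid, 'r');
while (($chunk = pg_lo_read($lo, 1048576)) !== false && $chunk !== '') {
    echo $chunk;  // pass each 1 MB chunk straight to the client
}
pg_lo_close($lo);
pg_query($conn, 'COMMIT');
?>

Memory use stays at roughly one chunk no matter how large the file is.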

2. BYTEA. These are correctly stored per schema, so pg_dump -n works as expected; however, I cannot find a way to stream the data. That means there is no way to access the data from PHP once it exceeds the PHP memory limit.
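The only approximation of streaming we could come up with for BYTEA is pulling the value out in slices with substring(), one round trip per chunk (again just a sketch; the table myschema.files(id int, data bytea) is made up):

<?php
// Sketch only: reads a bytea column in 1 MB slices so PHP never holds
// more than one slice in memory.
$conn = pg_connect('dbname=mydb');
// older PHP versions' pg_unescape_bytea() may not decode hex output
pg_query($conn, "SET bytea_output = 'escape'");
$chunkSize = 1048576;
$res = pg_query_params($conn,
    'SELECT octet_length(data) FROM myschema.files WHERE id = $1', array(1));
$total = (int) pg_fetch_result($res, 0, 0);

for ($offset = 1; $offset <= $total; $offset += $chunkSize) {
    // substring() on bytea is 1-based; the last slice may be shorter
    $res = pg_query_params($conn,
        'SELECT substring(data FROM $1::int FOR $2::int) FROM myschema.files WHERE id = $3',
        array($offset, $chunkSize, 1));
    echo pg_unescape_bytea(pg_fetch_result($res, 0, 0));
}
?>

This bounds memory, but the server may have to detoast the value for every slice, so it is much slower than pg_lo_read() and still not real streaming.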

Is there any other way to store large data in Postgres that allows streaming and works correctly with multiple schemas per database?

Thanks.

(Sorry if this double-posts on pgsql-php; I did not know which list is best for this question.)
