Using BLOBs with PostgreSQL - Mailing list pgsql-general

From Tim Kientzle
Subject Using BLOBs with PostgreSQL
Date
Msg-id 39DFA932.31834C8D@acm.org
Responses Re: Using BLOBs with PostgreSQL  ("Martin A. Marques" <martin@math.unl.edu.ar>)
List pgsql-general
I'm evaluating a couple of different databases for use as
the back-end to a web-based publishing system that's currently
being developed in Java and Perl.

I want to keep _all_ of the data in the database, to
simplify future replication and data management.  That
includes such data as GIF images, large HTML files,
even multi-megabyte downloadable software archives.

I've been using MySQL for initial development; it has pretty
clean and easy-to-use BLOB support.  You just declare a BLOB
column type, then read and write arbitrarily large chunks of data.
In Perl, BLOB columns work just like varchar columns; in JDBC,
the getBinaryStream()/setBinaryStream() methods provide streaming
access to large data objects.
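
For reference, this is roughly the JDBC pattern I'm using with MySQL
(the connection details and the "documents" table are just placeholders
for illustration, not my real schema):

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BlobStreams {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
            "jdbc:mysql://localhost/publishing", "user", "password");

        // Write: stream a multi-megabyte file into a BLOB column.
        File file = new File("archive.tar.gz");
        InputStream in = new FileInputStream(file);
        PreparedStatement put = conn.prepareStatement(
            "INSERT INTO documents (name, content) VALUES (?, ?)");
        put.setString(1, file.getName());
        put.setBinaryStream(2, in, (int) file.length());
        put.executeUpdate();
        put.close();
        in.close();

        // Read: stream the same data back out without holding it
        // all in memory at once.
        PreparedStatement get = conn.prepareStatement(
            "SELECT content FROM documents WHERE name = ?");
        get.setString(1, file.getName());
        ResultSet rs = get.executeQuery();
        if (rs.next()) {
            InputStream blob = rs.getBinaryStream(1);
            byte[] buf = new byte[8192];
            int n;
            while ((n = blob.read(buf)) != -1) {
                // ... copy buf[0..n) to a file or servlet output stream
            }
            blob.close();
        }
        rs.close();
        get.close();
        conn.close();
    }
}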

How well-supported is this functionality in PostgreSQL?
I did some early experimenting with PG, but couldn't
find any column type that would accept binary data
(apparently PG's parser chokes on null characters?).
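
To illustrate what I mean, here's a rough sketch of the sort of thing
that didn't work for me (table and column names made up, not my exact
code); any NUL byte in the data seems to break the statement:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class NaiveInsert {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
            "jdbc:postgresql://localhost/publishing", "user", "password");

        // A few bytes of a GIF header; note the 0x00 in the middle.
        byte[] gif = { 'G', 'I', 'F', '8', '9', 'a', 0x00, 0x01 };

        // Pasting the raw bytes into the SQL text: the embedded NUL
        // (and any unescaped quote) corrupts the statement itself.
        String sql = "INSERT INTO images (name, data) VALUES ('logo.gif', '"
            + new String(gif, "ISO-8859-1") + "')";

        Statement st = conn.createStatement();
        st.executeUpdate(sql);   // this is where things go wrong
        st.close();
        conn.close();
    }
}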

I've heard about TOAST, but have no idea what it really
is, how to use it, or how well it performs.  I'm leery
of database-specific APIs.

            - Tim Kientzle
