Thread: Using BLOBs with PostgreSQL

Using BLOBs with PostgreSQL

From: Tim Kientzle
I'm evaluating a couple of different databases for use as
the back-end to a web-based publishing system that's currently
being developed in Java and Perl.

I want to keep _all_ of the data in the database, to
simplify future replication and data management.  That
includes such data as GIF images, large HTML files,
even multi-megabyte downloadable software archives.

I've been using MySQL for initial development; it has pretty
clean and easy-to-use BLOB support.  You just declare a BLOB
column type, then read and write arbitrarily large chunks of data.
In Perl, BLOB columns work just like varchar columns; in JDBC,
the getBinaryStream()/setBinaryStream() functions provide support
for streaming large data objects.
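For concreteness, the JDBC pattern I mean looks roughly like this (table and column names are placeholders, and the Connection setup is omitted):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

// Sketch of streaming a binary value through JDBC. The "images" table
// and its columns are hypothetical; any driver with working BLOB support
// should accept this shape.
public class BlobStreaming {
    public static void storeImage(Connection conn, String name, byte[] data)
            throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO images (name, body) VALUES (?, ?)");
        ps.setString(1, name);
        // Stream the bytes instead of splicing them into a SQL string.
        ps.setBinaryStream(2, new ByteArrayInputStream(data), data.length);
        ps.executeUpdate();
        ps.close();
    }

    // Drain an InputStream into a byte array, the way the stream returned
    // by getBinaryStream() would be read back on the query side.
    public static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

The point is that neither direction ever materializes the value inside a SQL text statement, which is what makes multi-megabyte objects practical.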

How well-supported is this functionality in PostgreSQL?
I did some early experimenting with PG, but couldn't
find any column type that would accept binary data
(apparently PG's parser chokes on null characters?).

I've heard about TOAST, but have no idea what it really
is, how to use it, or how well it performs.  I'm leery
of database-specific APIs.

            - Tim Kientzle

Re: Using BLOBs with PostgreSQL

From: "Martin A. Marques"
On Sat, 07 Oct 2000, Tim Kientzle wrote:
>
> I've been using MySQL for initial development; it has pretty
> clean and easy-to-use BLOB support.  You just declare a BLOB
> column type, then read and write arbitrarily large chunks of data.
> In Perl, BLOB columns work just like varchar columns; in JDBC,
> the getBinaryStream()/setBinaryStream() functions provide support
> for streaming large data objects.

If you're talking about large text, just declare the column as text and that's
all. In the case of binary data, I have no idea; I only work with text
data.

> How well-supported is this functionality in PostgreSQL?
> I did some early experimenting with PG, but couldn't
> find any column type that would accept binary data
> (apparently PG's parser chokes on null characters?).
>
> I've heard about TOAST, but have no idea what it really
> is, how to use it, or how well it performs.  I'm leery
> of database-specific APIs.

As far as I have heard, it looks like a nice way to optimize searches in
blobs. I don't know anything else about it.

Regards... :-)

--
"And I'm happy, because you make me feel good, about me." - Melvin Udall
-----------------------------------------------------------------
Martín Marqués            email:     martin@math.unl.edu.ar
Santa Fe - Argentina        http://math.unl.edu.ar/~martin/
Systems administrator at math.unl.edu.ar
-----------------------------------------------------------------

RE: Using BLOBs with PostgreSQL

From: Franck Martin
PG definitely lacks BLOB support, and it is a big drawback...

Even with PG 7.1, where TOAST will remove the 8 kB limit on records, there
will still be no BLOB support.

As you have indicated, it seems that PG chokes on null characters. What is
needed is a varbinary type, which stores binary data unformatted and spits
the binary data out inside a PGresult. Some additional functions are needed
to read and write chunks of data inside that varbinary type.
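In the meantime, a crude workaround is to escape the troublesome bytes into octal form before splicing the value into a single-quoted SQL literal, so null characters never reach the parser raw. A hedged sketch in Java (the helper name is made up, and whether the server decodes the escapes back into raw bytes depends on the column type and version):

```java
// Sketch: escape raw bytes for inclusion in a single-quoted SQL literal.
// Null bytes and other non-printable bytes become \ooo octal escapes;
// quotes and backslashes are escaped so the literal stays well-formed.
public class ByteaEscape {
    public static String escape(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (byte b : data) {
            int v = b & 0xFF;
            if (v == '\'') {
                sb.append("''");                       // double the quote
            } else if (v == '\\') {
                sb.append("\\\\");                     // literal backslash
            } else if (v < 32 || v > 126) {
                sb.append(String.format("\\%03o", v)); // e.g. 0 -> \000
            } else {
                sb.append((char) v);
            }
        }
        return sb.toString();
    }
}
```

For example, a null byte followed by 'A' comes out as the five characters `\000A`, which a SQL parser can carry safely inside quotes.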

Any volunteers amongst the hackers?

I may try to do it as a user-defined type inside a shared library, but I
think PG as is may not handle it correctly...

Cheers
Franck@sopac.org


-----Original Message-----
From: Martin A. Marques
To: Tim Kientzle; PostgreSQL general mailing list
Sent: 8/10/00 11:11
Subject: Re: [GENERAL] Using BLOBs with PostgreSQL


Re: Using BLOBs with PostgreSQL

From: Denis Perchine
> PG definitely lacks BLOB support, and it is a big drawback...

This statement is completely wrong.
Just look at the lo_creat/lo_open/lo_read/lo_write/lo_close functions
in libpq.


--
Sincerely Yours,
Denis Perchine

----------------------------------
E-Mail: dyp@perchine.com
HomePage: http://www.perchine.com/dyp/
FidoNet: 2:5000/120.5
----------------------------------

Re: Using BLOBs with PostgreSQL

From: "Martin A. Marques"
On Tue, 10 Oct 2000, Denis Perchine wrote:

> > PG definitely lacks BLOB support, and it is a big drawback...
>
> This statement is completely wrong.
> Just look at the lo_creat/lo_open/lo_read/lo_write/lo_close functions
> in libpq.

Inside the source tree, where can I find the source of libpq?


Thanks!!!


"And I'm happy, because you make me feel good, about me." - Melvin Udall
-----------------------------------------------------------------
Martín Marqués            email:     martin@math.unl.edu.ar
Santa Fe - Argentina        http://math.unl.edu.ar/~martin/
Systems administrator at math.unl.edu.ar
-----------------------------------------------------------------


RE: Using BLOBs with PostgreSQL

From: "Martin A. Marques"
Would it be difficult to store all the binary data in a separate
database (a special database) that would contain one binary object per
row, along with the obvious accompanying information?

My 2 cents. :-)

On Sun, 8 Oct 2000, Franck Martin wrote:

> PG definitely lacks BLOB support, and it is a big drawback...
>
> Even with PG 7.1, where TOAST will remove the 8 kB limit on records, there
> will still be no BLOB support.

Regards... :-)


"And I'm happy, because you make me feel good, about me." - Melvin Udall
-----------------------------------------------------------------
Martín Marqués            email:     martin@math.unl.edu.ar
Santa Fe - Argentina        http://math.unl.edu.ar/~martin/
Systems administrator at math.unl.edu.ar
-----------------------------------------------------------------


Re: Using BLOBs with PostgreSQL

From: Denis Perchine
> > > PG definitely lacks BLOB support, and it is a big drawback...
> >
> > This statement is completely wrong.
> > Just look at the lo_creat/lo_open/lo_read/lo_write/lo_close functions
> > in libpq.
>
> Inside the source tree, where can I find the source of libpq?

src/interfaces/libpq

--
Sincerely Yours,
Denis Perchine

----------------------------------
E-Mail: dyp@perchine.com
HomePage: http://www.perchine.com/dyp/
FidoNet: 2:5000/120.5
----------------------------------