Thread: Re: Streaming binary data into db, difference between Blob

Re: Streaming binary data into db, difference between Blob

From
Andreas Prohaska
Date:
> > Looking at the AbstractJdbc2Blob class I think that JDBC Blobs internally
> > use LargeObjects. As I know, this was not the case in earlier versions
> > of the driver. Am I right?
> >
> >
>
> This is for the PostgreSQL Large Objects, not the standard JDBC and SQL
> BLOBs.
>

OK. If I got you right, the Postgres JDBC driver "simulates" the
java.sql.Blob getBinaryStream() etc. methods by using LargeObject
internally.

That seems to be a working solution for me. I just want to keep my
application's database layer compatible by using ordinary JDBC objects.
I don't mind if they are mapped to LargeObject internally. Actually,
that's even better regarding the streaming issues :-)

I assume that I can use this to read and write Blobs, not to delete them
since the LargeObject wouldn't be unlinked.

I'm not familiar with the JDBC driver source, and I would be glad if
you could confirm this just one more time.

>
> > So far, I'm using LargeObjects and everything works fine, but I intend to
> > use c-jdbc for db replication and would have to use JDBC blobs then.
> >
>
> We don't have them yet because PostgreSQL does not have them.  But I
> believe c-jdbc works with PostgreSQL so there must be a way around it.

It certainly works with Postgres; the only question is whether it works
with blobs. I'll try it out.

Thanks for your help,

    Andreas

Re: Streaming binary data into db, difference between Blob

From
Nicolas Modrzyk
Date:
I was following the conversation on blobs with interest.

c-jdbc supports Blobs and Binaries with Postgres.
At the moment, though, given the number of different platforms we are
trying to support, there is no streaming as such available.

But you can definitely store and retrieve Blobs from your Postgres
database environment with redundancy.

Nicolas,


On Wed, 2003-09-10 at 15:32, Andreas Prohaska wrote:
> > > Looking at the AbstractJdbc2Blob class I think that JDBC Blobs internally
> > > use LargeObjects. As I know, this was not the case in earlier versions
> > > of the driver. Am I right?
> > >
> >
> > This is for the PostgreSQL Large Objects, not the standard JDBC and SQL
> > BLOBs.
> >
>
> OK. If I got you right, the Postgres JDBC driver "simulates" the
> java.sql.Blob
> getBinaryStream() etc. methods by using LargeObject internally.
>
> That seems to be a working solution for me. I just want to keep my
> application's
> database layer compatible using ordinary JDBC objects. I don't mind if
> they are mapped to LargeObject internally. Actually that's even better
> regarding
> the streaming issues :-)
>
> I assume that I can use this to read and write Blobs, not to delete them
> since the LargeObject wouldn't be unlinked.
>
> I'm not familiar with the JDBC driver source, and I would be glad if you
> could
> confirm this just one more time.
>
> >
> > > So far, I'm using LargeObjects and everything works fine, but I intend to
> > > use c-jdbc for db replication and would have to use JDBC blobs then.
> > >
> >
> > We don't have them yet because PostgreSQL does not have them.  But I
> > believe c-jdbc works with PostgreSQL so there must be a way around it.
>
> It certainly works with Postgres, the only question is if it works with
> blobs?
> I'll try it out.
>
> Thanks for your help,
>
>     Andreas
>
> ---------------------------(end of broadcast)---------------------------
> TIP 3: if posting/reading through Usenet, please send an appropriate
>       subscribe-nomail command to majordomo@postgresql.org so that your
>       message can get through to the mailing list cleanly
>


Re: Streaming binary data into db, difference between Blob

From
Fernando Nasser
Date:
Andreas Prohaska wrote:
>
> OK. If I got you right, the Postgres JDBC driver "simulates" the
> java.sql.Blob
> getBinaryStream() etc. methods by using LargeObject internally.
>

Correct.


> I assume that I can use this to read and write Blobs, not to delete them
> since the LargeObject wouldn't be unlinked.
>

Neither to create.  But you can use the org.postgresql.largeobject
extension both to create and to delete (unlink).





--
Fernando Nasser
Red Hat Canada Ltd.                     E-Mail:  fnasser@redhat.com
2323 Yonge Street, Suite #300
Toronto, Ontario   M4P 2C9


Re: Streaming binary data into db, difference between Blob

From
Dave Tenny
Date:
You could always implement your own logical blob manager that implements blob IDs
and breaks blobs into BYTEA records of a particular (manageable) maximum size and associates
multiple BYTEA chunks with the blob id. 

More work, but a least common denominator approach that should be portable to other systems as well.
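The chunking logic itself is plain Java. A minimal sketch of the idea (class, method, and table names here are hypothetical, not part of any existing API), assuming a table like `blob_chunks(blob_id INT, chunk_no INT, data BYTEA, PRIMARY KEY (blob_id, chunk_no))` holds one row per chunk:

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;

// Sketch of a logical blob manager's core: split a blob into
// fixed-size pieces (each destined for one BYTEA row keyed by
// blob id and chunk index) and reassemble them on read.
public class BlobChunker {

    // Split data into chunks of at most maxChunkSize bytes each.
    public static List<byte[]> split(byte[] data, int maxChunkSize) {
        List<byte[]> chunks = new ArrayList<byte[]>();
        for (int off = 0; off < data.length; off += maxChunkSize) {
            int len = Math.min(maxChunkSize, data.length - off);
            byte[] chunk = new byte[len];
            System.arraycopy(data, off, chunk, 0, len);
            chunks.add(chunk);
        }
        return chunks;
    }

    // Reassemble chunks, read back in chunk_no order, into the blob.
    public static byte[] join(List<byte[]> chunks) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            out.write(chunk, 0, chunk.length);
        }
        return out.toByteArray();
    }
}
```

The maximum chunk size would be chosen per target DBMS, keeping each row under that system's VARBINARY/BYTEA limit.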

Nicolas Modrzyk wrote:
> I was following the conversation on blobs with interest.
>
> c-jdbc supports Blobs and Binaries with Postgres.
> At the moment though, given the number of different platforms we are
> trying to support there is no streaming as such available.
>
> But you can definitely store and retrieve Blobs from your Postgres
> database environment with redundancy.
>
> Nicolas,


Re: Streaming binary data into db, difference between Blob

From
Fernando Nasser
Date:
Dave Tenny wrote:
> You could always implement your own logical blob manager that implements
> blob IDs
> and breaks blobs into BYTEA records of a particular (manageable) maximum
> size and associates
> multiple BYTEA chunks with the blob id.
>
> More work, but a least common denominator approach that should be
> portable to other systems as well.
>

However, bytea is _not_ streamed on 7.3 backends (unless the patch is
used, which actually uses PostgreSQL Large Objects as a staging area).

That would be fine for 7.4, though, where bytea values will be streamed.


--
Fernando Nasser
Red Hat Canada Ltd.                     E-Mail:  fnasser@redhat.com
2323 Yonge Street, Suite #300
Toronto, Ontario   M4P 2C9


Re: Streaming binary data into db, difference between Blob

From
Fernando Nasser
Date:
Nicolas Modrzyk wrote:
> I was following the conversation on blobs with interest.
>
> c-jdbc supports Blobs and Binaries with Postgres.
> At the moment though, given the number of different platforms we are
> trying to support there is no streaming as such available.
>
> But you can definitely store and retrieve Blobs from your Postgres
> database environment with redundancy.
>

Great!  Thanks for the info.

Regards.

--
Fernando Nasser
Red Hat Canada Ltd.                     E-Mail:  fnasser@redhat.com
2323 Yonge Street, Suite #300
Toronto, Ontario   M4P 2C9


Re: Streaming binary data into db, difference between Blob

From
Nicolas Modrzyk
Date:
That sounds like a really good idea.

We just can't map to bytea because it is specific to Postgres, but I
would definitely put this on the feature list for c-jdbc.

Thanks a lot for that.

Nicolas,

On Wed, 2003-09-10 at 16:14, Dave Tenny wrote:
> You could always implement your own logical blob manager that
> implements blob IDs
> and breaks blobs into BYTEA records of a particular (manageable)
> maximum size and associates
> multiple BYTEA chunks with the blob id.
>
> More work, but a least common denominator approach that should be
> portable to other systems as well.
>
> Nicolas Modrzyk wrote:
> > I was following the conversation on blobs with interest.
> >
> > c-jdbc supports Blobs and Binaries with Postgres.
> > At the moment though, given the number of different platforms we are
> > trying to support there is no streaming as such available.
> >
> > But you can definitely store and retrieve Blobs from your Postgres
> > database environment with redundancy.
> >
> > Nicolas,


Re: Streaming binary data into db, difference between Blob

From
Fernando Nasser
Date:
Nicolas Modrzyk wrote:
> That sounds like really good idea.
>
> Just we can't map to bytea cause it is specific to Postgres, but I would
> definitely put this on the feature list for c-jdbc.
>

It is just a VARBINARY.  You just have to mind that PostgreSQL has a
very large maximum size for it, while other databases are more
restrictive.  I guess his suggestion was to limit the chunk sizes to
the maximum size for the specific DBMS implementation.



--
Fernando Nasser
Red Hat Canada Ltd.                     E-Mail:  fnasser@redhat.com
2323 Yonge Street, Suite #300
Toronto, Ontario   M4P 2C9


Re: Streaming binary data into db, difference between Blob

From
Dave Tenny
Date:
Fernando Nasser wrote:

> Dave Tenny wrote:
>
>> You could always implement your own logical blob manager that
>> implements blob IDs
>> and breaks blobs into BYTEA records of a particular (manageable)
>> maximum size and associates
>> multiple BYTEA chunks with the blob id.
>> More work, but a least common denominator approach that should be
>> portable to other systems as well.
>>
>
> However, bytea is _not_ streamed on 7.3 backends (unless the patch is
> used, which actually uses postgreSQL Large Objects as a staging area).
>
> That would be fine for 7.4 where bytea values will be streamed though.

I know nothing of how the backend works, but assuming it doesn't keep
ALL new BYTEA records in memory, you get some effect of streaming a
chunk at a time with this approach, so you can control your upper-bound
buffer size.
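That bounded-buffer effect can be made explicit on the read side by wrapping the fetched chunks in an ordinary InputStream. A minimal sketch (the class name is hypothetical; in practice the iterator would pull BYTEA rows from a ResultSet one at a time), holding only the current chunk in memory:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Iterator;

// Present a sequence of BYTEA chunks as one InputStream. Only the
// chunk currently being read is held in memory, so the peak buffer
// usage is bounded by the chunk size chosen when the blob was stored.
public class ChunkedInputStream extends InputStream {
    private final Iterator<byte[]> chunks;
    private ByteArrayInputStream current;

    public ChunkedInputStream(Iterator<byte[]> chunks) {
        this.chunks = chunks;
    }

    @Override
    public int read() throws IOException {
        // Advance past exhausted (or empty) chunks to the next one.
        while (current == null || current.available() == 0) {
            if (!chunks.hasNext()) {
                return -1; // end of blob
            }
            current = new ByteArrayInputStream(chunks.next());
        }
        return current.read();
    }
}
```

A caller can then hand this stream to any code that expects `Blob.getBinaryStream()`-style access, without ever materializing the whole blob.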