Thread: lo_import for bytea columns

lo_import for bytea columns

From: Jonathan Bartlett
Is there an equivalent function for bytea columns that works like
lo_import?

Alternatively, is there a way to copy from a large object to a bytea
column from SQL?

Or maybe someone has another way of attacking this problem:

I've got some Perl code that does this:

undef $/;                       # slurp mode: read the whole file in one go
$data = <FHFOR89MBFILE>;        # the entire file lands in memory here
$sth = $dbh->prepare("insert into data (bigbyteacolumn) values (?)");
$sth->bind_param(1, $data, DBI::SQL_BINARY);
$sth->execute;

This has worked fine for a while with file sizes around 10MB.

However, now I have someone who wants to use this for a file that's 89MB,
and it takes up about 500MB of memory before crashing.  I'm trying to
find a less memory-consuming way of handling this, even if only as a
temporary hack for this one file.  I think what's happening is that Perl
reads in the 89MB, and then either Perl or the driver converts it into a
fully-escaped string for transfer, and this is where the problem occurs.

Any ideas?

Thanks,

Jonathan Bartlett


Re: lo_import for bytea columns

From: Joe Conway
Jonathan Bartlett wrote:
> However, now I have someone who wants to use this for a file that's 89MB,
> and it takes up about 500MB of memory before crashing.  I'm trying to
> find a less memory-consuming way of handling this, even if only as a
> temporary hack for this one file.  I think what's happening is that Perl
> reads in the 89MB, and then either Perl or the driver converts it into a
> fully-escaped string for transfer, and this is where the problem occurs.
>
> Any ideas?

If you can use 7.4, then see:
http://www.postgresql.org/docs/current/static/libpq-exec.html

Specifically, look at the PQexecParams() function. You could write a
simple command-line program that uses PQexecParams() to insert your
large files, or perhaps persuade the Perl Postgres DBI driver
maintainers to support it.
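A rough sketch of what such a command-line loader might look like (untested, assumes a 7.4 libpq; the table and column names are taken from your example, and connection info comes from the usual PG* environment variables):

```c
/* load_bytea.c -- hedged sketch: inserts a file into a bytea column
 * using PQexecParams() with a binary-format parameter, so the bytes go
 * over the wire as-is with no escaping pass (and no escaped copy in RAM).
 * Build (paths may vary): cc load_bytea.c -lpq -o load_bytea
 */
#include <stdio.h>
#include <stdlib.h>
#include <libpq-fe.h>

int main(int argc, char **argv) {
    if (argc != 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }

    /* Read the file into one buffer. This still needs file-size RAM,
       but avoids the multiply-larger escaped-string copy. */
    FILE *fp = fopen(argv[1], "rb");
    if (!fp) { perror("fopen"); return 1; }
    fseek(fp, 0, SEEK_END);
    long len = ftell(fp);
    rewind(fp);
    char *buf = malloc(len);
    if (!buf || fread(buf, 1, len, fp) != (size_t) len) {
        fprintf(stderr, "read failed\n");
        return 1;
    }
    fclose(fp);

    PGconn *conn = PQconnectdb("");  /* connect using PG* environment */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "%s", PQerrorMessage(conn));
        return 1;
    }

    const char *values[1] = { buf };
    int lengths[1] = { (int) len };
    int formats[1] = { 1 };          /* 1 = binary-format parameter */

    PGresult *res = PQexecParams(conn,
        "insert into data (bigbyteacolumn) values ($1::bytea)",
        1, NULL, values, lengths, formats, 0);
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "%s", PQerrorMessage(conn));

    PQclear(res);
    free(buf);
    PQfinish(conn);
    return 0;
}
```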

Joe