Hello,
I hope this isn't too obvious a question.
I'm currently using Postgres 7.0.3, having installed the postgresql.org
i386 RPMs for RedHat 6.x on an RH 6.1 machine. I'm attempting to create a
database where one of the fields contains a large object read from the
filesystem.
But on piping the following batch file into psql:
CREATE TABLE FileSystem (
    NAME varchar(80),
    URL varchar(200),
    LASTUPDATE date,
    LENGTH numeric(10),
    DATA oid
);
INSERT INTO FileSystem VALUES ('/work/ams/Manfac/Panels/ActionRepOpenJobs_0.pnl', NULL, '03-09-2000 2:24:00', 573,
lo_import('/work/ams/Manfac/Panels/ActionRepOpenJobs_0.pnl'));
[then other similar INSERT statements: batch file's about 1850 lines long]
the backend crashes, without fail, on the ninth INSERT with a 'my
bits moved right off the end of the world' error. I've tried rearranging
the batch file and using different filenames, but no joy.
All of the files passed to lo_import up to and including the ninth INSERT
are < 2048 bytes in size, so (admittedly without knowledge of the
PostgreSQL code) a buffer overrun seems unlikely. Any hints or ideas?
Many thanks,
Rhys Jones, Swansea
--
http://www.sucs.org/~rhys/