Thread: large object regression tests

large object regression tests

From: Jeremy Drake
I noticed when I was working on a patch quite a while back that there are
no regression tests for large object support.  I know, large objects
are not the most sexy part of the code-base, and I think they tend to be
ignored/forgotten most of the time.  Which IMHO is all the more reason
they should have some regression tests.  Otherwise, if someone managed to
break them somehow, the breakage could easily go unnoticed for quite some
time.

So in this vein, I have recently found myself with some free time, and a
desire to contribute something, and decided this would be the perfect
place to get my feet wet without stepping on any toes.

I guess what I should ask is, would a patch to add a test for large
objects to the regression suite be well received?  And, is there any
advice for how to go about making these tests?

I have been considering this, and I think that in order to really test
large objects, I would need to load enough data into a large object to
span more than one block (large object blocks were 1 or 2K, IIRC) so that
the block-boundary case could be tested.  Is there any precedent for where
to grab such a large chunk of data?  I was thinking about using an excerpt
from a public-domain text such as Moby Dick, but on second thought binary
data may be better for testing.

My current effort, which will probably form the preliminary portion of the
final test, involves loading a small amount (less than one block) of text
into a large object inline from an SQL script and calling the various
functions against it to verify that they do what they should.  In the
course of doing so, I find that it is necessary to stash certain values
across statements (large object IDs, large object 'handles'), and so far I
am using a temporary table to store these.  Is this reasonable, or is
there a cleaner way to do that?
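
For concreteness, here is a minimal sketch of what I have so far (the
functions are the standard server-side large object interface; 393216 is
INV_READ | INV_WRITE, 0 is SEEK_SET, and convert_to() is assumed to be
available for getting a bytea out of text):

    -- stash the large object id and, later, the descriptor in a temp table
    CREATE TEMP TABLE lotest (loid oid, fd integer);
    INSERT INTO lotest (loid) SELECT lo_creat(-1);

    -- descriptors returned by lo_open are only valid within a transaction
    BEGIN;
    UPDATE lotest SET fd = lo_open(loid, 393216);
    SELECT lowrite(fd, convert_to('0123456789abcdef', 'UTF8')) FROM lotest;
    SELECT lo_lseek(fd, 0, 0) FROM lotest;   -- rewind to the start
    SELECT loread(fd, 16) FROM lotest;       -- expect the 16 bytes back
    SELECT lo_close(fd) FROM lotest;
    COMMIT;

    SELECT lo_unlink(loid) FROM lotest;      -- clean up

The temp table is what lets the OID (and, within a transaction, the
descriptor) survive across statements.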

-- 
Even if you're on the right track, you'll get run over if you just sit there.    -- Will Rogers


Re: large object regression tests

From: Tom Lane
Jeremy Drake <jeremyd@jdrake.com> writes:
> I noticed when I was working on a patch quite a while back that there are
> no regression tests for large object support.

Yeah, this is bad :-(

> I have been considering this, and I think that in order to really test
> large objects, I would need to load enough data into a large object to
> span more than one block (large object blocks were 1 or 2K, IIRC) so that
> the block-boundary case could be tested.  Is there any precedent for where
> to grab such a large chunk of data?

There's always plain old junk data, e.g., repeat('xyzzy', 100000).
I doubt that Moby Dick would expose any unexpected bugs ...
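
For instance (just a sketch; 500000 bytes spans plenty of 2K LO pages,
and 131072/262144 are INV_WRITE/INV_READ):

    BEGIN;
    CREATE TEMP TABLE big_lo AS SELECT lo_creat(-1) AS loid;
    -- write half a megabyte of junk in one statement
    SELECT lowrite(lo_open(loid, 131072),
                   convert_to(repeat('xyzzy', 100000), 'UTF8'))
      FROM big_lo;
    COMMIT;

    -- reopen and seek to the end (whence 2 = SEEK_END); expect 500000
    BEGIN;
    SELECT lo_lseek(lo_open(loid, 262144), 0, 2) FROM big_lo;
    COMMIT;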

> ... I find that it is necessary to stash certain values across
> statements (large object IDs, large object 'handles'), and so far I am
> using a temporary table to store these.  Is this reasonable, or is there a
> cleaner way to do that?

I think it's supposed to be possible to use psql variables for that;
if you can manage to test psql variables as well as large objects,
that'd be a double bonus.
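
Roughly like this, perhaps (a sketch only: capturing query results with
\gset is an assumption about the psql version at hand; :var is psql's
variable interpolation):

    SELECT lo_creat(-1) AS loid \gset
    BEGIN;
    -- 393216 = INV_READ | INV_WRITE
    SELECT lo_open(:loid, 393216) AS fd \gset
    SELECT lowrite(:fd, convert_to('some test data', 'UTF8'));
    SELECT lo_lseek(:fd, 0, 0);   -- rewind
    SELECT loread(:fd, 14);       -- the 14 bytes just written
    SELECT lo_close(:fd);
    COMMIT;
    SELECT lo_unlink(:loid);
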
        regards, tom lane


Re: large object regression tests

From: Markus Schaber
Hi, Jeremy,
Jeremy Drake wrote:

> I have been considering this, and I think that in order to really test
> large objects, I would need to load enough data into a large object to
> span more than one block (large object blocks were 1 or 2K, IIRC) so that
> the block-boundary case could be tested.  Is there any precedent for where
> to grab such a large chunk of data?

You could generate such data on the fly, as part of the test scripts.

E.g. a blob of zero bytes, a blob of 0xff bytes, a blob of pseudo-random
data...
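
Something along these lines, for instance (a sketch; lo_from_bytea() is an
assumption, it exists only on newer servers, and md5 of a counter is just
one cheap way to get reproducible pseudo-random bytes):

    -- 4 KB of zero bytes, and 4 KB of 0xff bytes
    SELECT lo_from_bytea(0, decode(repeat('00', 4096), 'hex'));
    SELECT lo_from_bytea(0, decode(repeat('ff', 4096), 'hex'));

    -- 4 KB of reproducible pseudo-random bytes: 256 md5 digests of a counter
    SELECT lo_from_bytea(0,
           string_agg(decode(md5(i::text), 'hex'), ''::bytea ORDER BY i))
    FROM generate_series(1, 256) AS i;

Determinism matters here: the regression framework diffs actual output
against expected output, so truly random data would never match.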

Markus
--
Markus Schaber | Logical Tracking&Tracing International AG
Dipl. Inf.     | Software Development GIS

Fight against software patents in EU! www.ffii.org www.nosoftwarepatents.org


Re: large object regression tests

From: Lamar Owen
On Tuesday 05 September 2006 02:59, Jeremy Drake wrote:
> I have been considering this, and I think that in order to really test
> large objects, I would need to load enough data into a large object to
> span more than one block (large object blocks were 1 or 2K, IIRC) so that
> the block-boundary case could be tested.  Is there any precedent for where
> to grab such a large chunk of data?  I was thinking about using an excerpt
> from a public-domain text such as Moby Dick, but on second thought binary
> data may be better for testing.

A 5 or 6 megapixel JPEG image.  Maybe a photograph of an elephant.
-- 
Lamar Owen
Director of Information Technology
Pisgah Astronomical Research Institute
1 PARI Drive
Rosman, NC  28772
(828)862-5554
www.pari.edu