Re: Why is NULL = unbounded for rangetypes? - Mailing list pgsql-general

From Andreas Joseph Krogh
Subject Re: Why is NULL = unbounded for rangetypes?
Date
Msg-id OrigoEmail.4d.7156e1fda9bbc0ec.140ce84e50d@prod2
Whole thread Raw
In response to Re: Why is NULL = unbounded for rangetypes?  (Jeff Davis <pgsql@j-davis.com>)
Responses Re: Why is NULL = unbounded for rangetypes?
List pgsql-general
On Friday, August 30, 2013 at 03:23:09, Jeff Davis <pgsql@j-davis.com> wrote:

> On Tue, 2013-07-09 at 10:45 +0200, Andreas Joseph Krogh wrote:
> > I would expect the queries above to return FALSE and have to use
> > INFINITY to have them return TRUE. I don't understand what you mean by
> > ranges not allowing either bound to be NULL as it seems to be the case
> > (as in "it works").
>
> Although passing NULL to the constructor works, it does *not* create a
> range where one bound is NULL. It actually creates an unbounded range;
> that is, a range where one bound is infinite.
>
> NULL semantics are far too confusing to be useful with ranges. For
> instance, if ranges did support NULLs, the queries you mention would
> have to return NULL, not FALSE.



But I agree that returning NULL would be OK; then it would be easy to catch when starting to play with range types in queries. Having NULL implicitly mean infinity comes as a surprise, to me at least.


But now that I know this, it's certainly not a blocker...
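To make the behavior concrete, here is a small sketch of how this looks in psql (a hypothetical session, using PostgreSQL's int4range and the range inspection functions):

```sql
-- Passing NULL to a range constructor does not create a NULL bound;
-- it creates an unbounded range:
SELECT int4range(NULL, 10);                 -- displays as (,10)

-- lower_inf() confirms the lower bound is infinite:
SELECT lower_inf(int4range(NULL, 10));      -- true

-- lower() reports NULL for the missing bound, even though the range
-- itself is unbounded rather than "NULL-bounded":
SELECT lower(int4range(NULL, 10));          -- NULL

-- Consequently, containment checks succeed against the open side:
SELECT int4range(NULL, 10) @> 5;            -- true
```

This is why the queries in the original message return TRUE rather than FALSE or NULL: the NULL argument was silently turned into an infinite bound at construction time.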


--
Andreas Joseph Krogh <andreak@officenet.no>      mob: +47 909 56 963
Senior Software Developer / CTO - OfficeNet AS - http://www.officenet.no
Public key: http://home.officenet.no/~andreak/public_key.asc

