Re: numeric/decimal docs bug? - Mailing list pgsql-hackers

From: Bruce Momjian
Subject: Re: numeric/decimal docs bug?
Date: 2002-04-11 21:39
Msg-id: 200204112139.g3BLdRQ08110@candle.pha.pa.us
In response to: Re: numeric/decimal docs bug? (Jan Wieck <janwieck@yahoo.com>)
Responses: Re: numeric/decimal docs bug?
List: pgsql-hackers
Jan Wieck wrote:
> Bruce Momjian wrote:
> > Jan Wieck wrote:
> > > > The hard limit is certainly no more than 64K, since we store these
> > > > numbers in half of an atttypmod.  In practice I suspect the limit may
> > > > be less; Jan would be more likely to remember...
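
For concreteness, that packing is visible from SQL; a rough sketch (the
regclass cast and the exact typmod arithmetic here are my reading of the
current code, not something stated in this thread):
CREATE TABLE typmod_demo (n numeric(10,4));
SELECT atttypmod FROM pg_attribute
 WHERE attrelid = 'typmod_demo'::regclass AND attname = 'n';
-- expected: ((10 << 16) | 4) + 4 = 655368, i.e. precision in the high
-- 16 bits, scale in the low 16 bits, plus the varlena header offset
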
> > >
> > >     It is arbitrary of course. I don't recall completely, have to
> > >     dig into the code, but there might be some side effect when
> > >     mucking with it.
> > >
> > >     The NUMERIC code increases the actual internal precision when
> > >     doing multiply and divide, which happens a gazillion times
> > >     when doing higher functions like trigonometry. I think there
> > >     was some connection between the max precision and how high
> > >     this internal precision can grow, so increasing the precision
> > >     might affect the computational performance of such higher
> > >     functions significantly.
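
You can watch that internal precision selection from SQL (a sketch, not
from the original mail; the exact digit counts vary by version):
SELECT 1::numeric / 3;   -- result scale is chosen internally, e.g.
                         -- 0.33333333333333333333, beyond either input
SELECT ln(2::numeric);   -- higher functions run many such multiplies
                         -- and divides at increased working precision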
> >
> > Oh, interesting, maybe we should just leave it alone.
> 
>     As I said, I have to look at the code. I'm pretty sure that it
>     currently will not use hundreds of digits internally if you
>     use only a few digits in your schema. So changing it isn't
>     that dangerous.
> 
>     But who's going to write and run a regression test ensuring
>     that the new high limit can really be supported? I didn't
>     even run the numeric_big test lately, which tests with at
>     least 500 digits of precision ... and therefore takes some
>     time (yawn). To increase the number of digits used, you first
>     have to have some other tool to generate the test data (I
>     originally used bc(1) with some scripts). Based on that we
>     still claim that our system deals correctly with up to 1,000
>     digits of precision.
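
As a sketch of what such a check could look like (my example, not the
numeric_big data; a real test would compare against reference digits
generated externally, e.g. with bc -l):
SELECT round(exp(ln(12345.6789::numeric)), 4) = 12345.6789 AS ok;
-- cheap exp/ln round-trip consistency check at a modest scale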
> 
>     I don't like the idea of bumping that number up to some
>     higher nonsense, claiming we support 32K digits of precision
>     on exact numeric, when no one has ever tested whether natural
>     log really returns its result at that precision instead of as
>     a 30,000-digit approximation.
> 
>     I missed some of the discussion, because I considered the
>     1,000 digits to be complete nonsense already and dropped the
>     thread. So could someone please enlighten me as to the real
>     reason for increasing our precision? AFAIR it had something
>     to do with the docs. If it's just because the docs and the
>     code aren't in sync, I'd vote for changing the docs.

I have done a little more research on this.  If you create a numeric
column with no precision:
CREATE TABLE test (x numeric);

You can insert numeric values that are more than 1000 digits long:
INSERT INTO test VALUES ('1111(continues 1010 times)');
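
To generate that value without typing 1010 digits, something like this
should work (assuming repeat() is available; the original test just
pasted the literal):
INSERT INTO test VALUES (repeat('1', 1010)::numeric);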

You can even do computations on it:
SELECT x+1 FROM test;

The 1000-digit limit is pretty arbitrary.  If we can handle 1000
digits, I can't see how larger values could somehow fail.

Also, the numeric regression test takes much longer than the other
tests.  I don't see why a test of that length is required compared to
the others.  Probably time to pare it back a little.

--
  Bruce Momjian                        |  http://candle.pha.pa.us
  pgman@candle.pha.pa.us               |  (610) 853-3000
  +  If your life is a hard drive,     |  830 Blythe Avenue
  +  Christ can be your backup.        |  Drexel Hill, Pennsylvania 19026

