Hey,

(I'm on v16 here, to get the nice underscore separators for readability.)
This came up on Reddit [1] the other day and boils down to the question: "why do the two divisions below end up with radically different result scales?"
postgres=# select .999_999_999_999_999_999_999 / 2;
        ?column?
-------------------------
 0.500000000000000000000
(1 row)

postgres=# select .000_000_000_000_000_000_001 / 2;
                  ?column?
--------------------------------------------
 0.0000000000000000000005000000000000000000
(1 row)
Is this an expected difference?
The first example produces a rounded answer, since the exact quotient will not fit within the 21-digit scale of the left input.
The second example has exactly the same problem, but because the system calculated a result scale of 40, the exact quotient does fit without rounding.
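For concreteness, Python's decimal module (my illustration, not anything PostgreSQL does) shows the exact quotient in the first case, which needs 22 fractional digits and so has to be rounded to fit a scale of 21:

```python
from decimal import Decimal

# The exact quotient needs 22 digits after the decimal point,
# one more than the 21-digit scale of the left-hand input.
exact = Decimal("0.999999999999999999999") / 2
print(exact)  # 0.4999999999999999999995
```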
I'm finally getting my head around "weight", and I realize the difference must somehow come down to the leading zeros that are discarded in the packed form of NumericVar. But I figured I'd at least post here to see if there is some foundational knowledge to be shared before I try to work out exactly what the algorithm is doing.

I did find "The Art of Computer Programming, Volume 2" by Donald E. Knuth (1997) on my Safari Bookshelf subscription and skimmed the addition algorithm, though not yet the division one. The code comments mention Knuth by name, though the comment block at the top of numeric.c doesn't.
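For what it's worth, here is my rough Python transcription of what I think select_div_scale() in numeric.c is doing; the base-10000 digits/weight modeling and the constant values are just my reading of the source, so treat this as a sketch, not gospel:

```python
# Sketch of select_div_scale() from numeric.c, per my reading.
# A value is modeled like NumericVar: a list of base-10000 digits with
# leading/trailing zero groups stripped, a weight (digits[0] has value
# digits[0] * 10000**weight), and a display scale (dscale).

NUMERIC_MIN_SIG_DIGITS = 16      # assumed constant values
NUMERIC_MIN_DISPLAY_SCALE = 0
NUMERIC_MAX_DISPLAY_SCALE = 1000
DEC_DIGITS = 4                   # decimal digits per base-10000 digit

def first_nonzero(digits, weight):
    """Return (first non-zero digit, its weight)."""
    for i, d in enumerate(digits):
        if d != 0:
            return d, weight - i
    return 0, 0

def select_div_scale(digits1, weight1, dscale1, digits2, weight2, dscale2):
    fd1, w1 = first_nonzero(digits1, weight1)
    fd2, w2 = first_nonzero(digits2, weight2)
    # Estimate the weight of the quotient's first digit.
    qweight = w1 - w2
    if fd1 <= fd2:
        qweight -= 1
    # Ask for at least 16 significant digits past that first digit...
    rscale = NUMERIC_MIN_SIG_DIGITS - qweight * DEC_DIGITS
    # ...but never less scale than either input, and clamp to the limits.
    rscale = max(rscale, dscale1, dscale2, NUMERIC_MIN_DISPLAY_SCALE)
    rscale = min(rscale, NUMERIC_MAX_DISPLAY_SCALE)
    return rscale

# .999_999_999_999_999_999_999: 21 nines -> groups [9999]*5 + [9000], weight -1
print(select_div_scale([9999]*5 + [9000], -1, 21, [2], 0, 0))  # 21
# .000_000_000_000_000_000_001: leading zero groups dropped, so the stored
# form is a single digit 1000 at weight -6 (1000 * 10000**-6 = 1e-21)
print(select_div_scale([1000], -6, 21, [2], 0, 0))             # 40
```

If this transcription is right, it would explain the difference: the discarded leading zero groups leave the second input with a very negative weight (-6 in base 10000), and the 16 - qweight*4 term then inflates the result scale to 40, while the first input's weight of -1 only yields 20, which the input dscale of 21 then overrides.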
Thanks!
David J.