Hello Tom,
> moonjelly just reported an interesting failure [1].
I noticed. I was planning to have a look at it, thanks for digging!
> It seems that with the latest bleeding-edge gcc, this code is
> misoptimized:
> else if (imax - imin < 0 || (imax - imin) + 1 < 0)
> {
> /* prevent int overflows in random functions */
> pg_log_error("random range is too large");
> return false;
> }
>
> such that the second if-test doesn't fire. Now, according to the C99
> spec this code is broken, because the compiler is allowed to assume
> that signed integer overflow doesn't happen,
Hmmm, so such overflows cannot be detected portably with standard signed
arithmetic the way this code attempts to. Note that the code was written in
2016, ISTM before Postgres required C99. I'm unsure what a C compiler was
allowed to assume about signed integer arithmetic under the previous
standard.
> whereupon the second if-block is provably unreachable.
Indeed.
> The failure still represents a gcc bug, because we're using -fwrapv
> which should disable that assumption.
Ok, I'll report it.
I also take it as a good point that the pgbench tests exercise such edge
cases.
> However, not all compilers have that switch, so it'd be better to code
> this in a spec-compliant way.
Ok.
> I suggest applying the attached in branches that have the required
> functions.
The latest gcc, recompiled yesterday, is still wrong, as shown by
moonjelly's current status.
The proposed patch does fix the issue shown by the pgbench TAP tests. I'd
suggest adding unlikely() to all these conditions, as is already done
elsewhere. See the attached version.
I confirm that check-world passes with gcc head and its broken -fwrapv.
I also recompiled after manually removing -fwrapv: make check, the pgbench
TAP tests and check-world all passed. I'm not sure edge cases are well
enough tested in Postgres to conclude that everything is ok just from
these runs, though.
--
Fabien.