On Thu, Nov 21, 2024 at 1:22 PM Andrey M. Borodin <x4mmm@yandex-team.ru> wrote:
>
> > On 21 Nov 2024, at 02:24, Masahiko Sawada <sawada.mshk@gmail.com> wrote:
> >
> > But does replacing the 2 least significant bits
> > with 2 random bits really not affect monotonicity?
>
> You are right. We have to take this into account when calculating monotonicity. PFA another version.
>
While it works fine, I think this change needs a comment:
-#define SUB_MILLISECOND_STEP ((NS_PER_MS / (1 << 12)) + 1)
+#if defined(__darwin__) || _MSC_VER
+#define SUB_MILLISECOND_BITS 10
+#else
+#define SUB_MILLISECOND_BITS 12
+#endif
+#define SUB_MILLISECOND_STEP ((NS_PER_MS / (1 << SUB_MILLISECOND_BITS)) + 1)
because, at a glance, the reader might think SUB_MILLISECOND_BITS
should be used here too:
+ /* sub-millisecond timestamp fraction (12 bits) */
+ increased_clock_precision = ((ns % NS_PER_MS) * (1 << 12)) / NS_PER_MS;
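
FWIW, here is a small standalone check I used to convince myself of
the 10-bit case. It is only a sketch, not patch code: NS_PER_MS and
the fraction formula are copied from the snippets above, and I assume
the __darwin__ / _MSC_VER value of SUB_MILLISECOND_BITS. If I read the
patch right, the stored fraction is always 12 bits wide; on those two
platforms the low 2 bits are filled with random data, so only the step
calculation should look at SUB_MILLISECOND_BITS:

#include <stdint.h>
#include <stdio.h>

#define NS_PER_MS 1000000

/* assume the __darwin__ / _MSC_VER case from the patch */
#define SUB_MILLISECOND_BITS 10
#define SUB_MILLISECOND_STEP ((NS_PER_MS / (1 << SUB_MILLISECOND_BITS)) + 1)

int
main(void)
{
	/*
	 * For every nanosecond offset within one millisecond, verify that
	 * advancing by SUB_MILLISECOND_STEP strictly increases the top
	 * SUB_MILLISECOND_BITS of the 12-bit fraction, i.e. the bits the
	 * platform clock actually provides and monotonicity relies on.
	 */
	for (uint64_t ns = 0; ns + SUB_MILLISECOND_STEP < NS_PER_MS; ns++)
	{
		uint64_t	frac1 = ((ns % NS_PER_MS) * (1 << 12)) / NS_PER_MS;
		uint64_t	frac2 = (((ns + SUB_MILLISECOND_STEP) % NS_PER_MS) * (1 << 12)) / NS_PER_MS;

		/* ignore the low bits that are filled with random data */
		if ((frac2 >> (12 - SUB_MILLISECOND_BITS)) <=
			(frac1 >> (12 - SUB_MILLISECOND_BITS)))
		{
			printf("monotonicity broken at ns=%llu\n",
				   (unsigned long long) ns);
			return 1;
		}
	}
	printf("step of %d ns always advances the top %d fraction bits\n",
		   SUB_MILLISECOND_STEP, SUB_MILLISECOND_BITS);
	return 0;
}

With 10 bits the step comes out to 977 ns, just above the
1000000 / 1024 = 976.5625 ns bucket width, so the top 10 bits of the
fraction always advance even though the stored field stays 12 bits.
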
Regards,
--
Masahiko Sawada
Amazon Web Services: https://aws.amazon.com