On Wed, Sep 22, 2010 at 10:54:53PM +0100, Thom Brown wrote:
> On 22 September 2010 22:01, Josh Berkus <josh@agliodbs.com> wrote:
> > All,
> >
> > I was just checking on our year-2027 compliance, and happened to notice
> > that time with time zone takes up 12 bytes. This seems peculiar, given
> > that timestamp with time zone is only 8 bytes, and at my count we only
> > need 5 for the time with microsecond precision. What's up with that?
> >
> > Also, what is the real range of our 8-byte *integer* timestamp?
>
> The time is 8 bytes, (1,000,000 microseconds * 60 minutes, * 24 hours
> = 1,440,000,000 microseconds = 31 bits = 8 bytes).
>
31 bits rounds up to 4 bytes at 8 bits/byte, not 8 bytes.
> The timezone displacement takes up to 12 bits, meaning 3 bytes.
> (1460+1459 = 2919 = 12 bits = 3 bytes). So that's 11 bytes. Not sure
> where the extra 1 byte comes from.
>
So that would yield 4 + 3 = 7 bytes, not the 12 observed.
Ken
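
For anyone checking the arithmetic, here is a minimal standalone C sketch
of the bit/byte counts in question. The 86,400,000,000 microseconds-per-day
figure is an addition here, not from the mails above (the quoted product
appears to omit the *60 seconds factor); the 1,440,000,000 and 2,919
figures are taken from the thread.

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

/* Bits needed to encode this many distinct values: ceil(log2(n)). */
static unsigned
bits_needed(uint64_t distinct_values)
{
    uint64_t max = distinct_values - 1; /* largest value to store */
    unsigned bits = 0;

    while (max > 0)
    {
        max >>= 1;
        bits++;
    }
    return bits;
}

static void
report(const char *what, uint64_t n)
{
    unsigned bits = bits_needed(n);

    printf("%-32s %2u bits -> %u bytes\n", what, bits, (bits + 7) / 8);
}

int
main(void)
{
    /* 1e6 us * 60 sec * 60 min * 24 h -- my figure, not Thom's */
    report("86,400,000,000 us/day", UINT64_C(86400000000));
    /* the product quoted above, which skips the *60 seconds step */
    report("1,440,000,000 (from the mail)", UINT64_C(1440000000));
    /* timezone displacements, 1460 + 1459 from the mail */
    report("2,919 displacements", UINT64_C(2919));
    return 0;
}

This prints 37 bits -> 5 bytes for the full per-day count (matching Josh's
count of 5), 31 bits -> 4 bytes for the figure quoted above, and 12 bits ->
2 bytes for the displacement, so none of these combinations reaches the
12 bytes observed.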