I wrote:
> The current sorting code will fail if the data volume exceeds whatever
> the maximum file size is on your OS. (Actually, if long is 32 bits,
> it might fail at 2gig even if your OS can handle 4gig; not sure, but
> it is doing signed-long arithmetic with byte offsets...)
> I am just about to commit code that fixes this by allowing temp files
> to have multiple segments like tables can.
OK, committed. I have tested this code using a small RELSEG_SIZE,
and it seems to work, but I don't have the spare disk space to try
a full-scale test with > 4Gb of data. Anyone care to try it?
I have not yet done anything about the excessive space consumption
(4x data volume), so plan on using 16+Gb of disk space to sort a 4+Gb
table --- and that's not counting where you put the output ;-)
regards, tom lane