Thread: Overflow of attmissingval is not handled gracefully

From: Tom Lane
Consider this admittedly-rather-contrived example:

regression=# create table foo(f1 int);
CREATE TABLE
regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
ERROR:  row is too big: size 57416, maximum size 8160

Since the table contains no rows at all, this is a surprising
failure.  The reason for it of course is that pg_attribute
has no TOAST table, so it can't store indefinitely large
attmissingval fields.

I think the simplest answer, and likely the only feasible one for
the back branches, is to disable the attmissingval optimization
if the proposed value is "too large".  Not sure exactly where the
threshold for that ought to be, but maybe BLCKSZ/8 could be a
starting offer.
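The proposed guard could be sketched roughly like this (this is only an illustration of the idea from the thread, not PostgreSQL's actual code; the helper name and the BLCKSZ/8 cutoff are assumptions taken from the suggestion above):

```c
#include <stdbool.h>
#include <stddef.h>

#define BLCKSZ 8192                   /* default PostgreSQL block size */
#define MAX_MISSING_VAL (BLCKSZ / 8)  /* the "starting offer" threshold */

/*
 * Hypothetical check: return true if a default value of the given
 * on-disk size is small enough to store in pg_attribute.attmissingval.
 * If not, the caller would skip the fast-default optimization and fall
 * back to rewriting the table, rather than failing with "row is too big".
 */
bool
missing_val_fits(size_t val_size)
{
    return val_size <= MAX_MISSING_VAL;
}
```

With BLCKSZ at its default of 8192, the cutoff works out to 1024 bytes, so the 57416-byte compressed value from the example above would simply bypass the optimization instead of erroring out.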

            regards, tom lane



Re: Overflow of attmissingval is not handled gracefully

From: Andrew Dunstan
On 2/28/22 18:21, Tom Lane wrote:
> Consider this admittedly-rather-contrived example:
>
> regression=# create table foo(f1 int);
> CREATE TABLE
> regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
> ERROR:  row is too big: size 57416, maximum size 8160
>
> Since the table contains no rows at all, this is a surprising
> failure.  The reason for it of course is that pg_attribute
> has no TOAST table, so it can't store indefinitely large
> attmissingval fields.
>
> I think the simplest answer, and likely the only feasible one for
> the back branches, is to disable the attmissingval optimization
> if the proposed value is "too large".  Not sure exactly where the
> threshold for that ought to be, but maybe BLCKSZ/8 could be a
> starting offer.
>


WFM. After all, it's taken several years for this to surface. Is it
based on actual field experience?


cheers


andrew


--
Andrew Dunstan
EDB: https://www.enterprisedb.com




Re: Overflow of attmissingval is not handled gracefully

From: Tom Lane
Andrew Dunstan <andrew@dunslane.net> writes:
> On 2/28/22 18:21, Tom Lane wrote:
>> regression=# create table foo(f1 int);
>> CREATE TABLE
>> regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
>> ERROR:  row is too big: size 57416, maximum size 8160
>>
>> I think the simplest answer, and likely the only feasible one for
>> the back branches, is to disable the attmissingval optimization
>> if the proposed value is "too large".  Not sure exactly where the
>> threshold for that ought to be, but maybe BLCKSZ/8 could be a
>> starting offer.

> WFM. After all, it's taken several years for this to surface. Is it
> based on actual field experience?

No, it was an experiment that occurred to me while thinking about
the nearby proposal to add a TOAST table to pg_attribute [1].
If we do that, this restriction could be dropped.  But I agree that
there's hardly any practical use-case for such default values,
so I wouldn't mind living with the de-optimization either.

            regards, tom lane

[1] https://www.postgresql.org/message-id/flat/1643112264.186902312@f325.i.mail.ru