Re: row is too big: size 8168, maximum size 8160 - Mailing list pgsql-admin

From ramsiddu007
Subject Re: row is too big: size 8168, maximum size 8160
Msg-id CA+zEy7-M2crn=35CbSU2_kNvg1AXwqrP42+c7H2iUmXErSYO4A@mail.gmail.com
In response to row is too big: size 8168, maximum size 8160  (Mario De Frutos Dieguez <mariodefrutos@gmail.com>)
List pgsql-admin
Hi,
     Recently we faced the same problem. We were creating a table with dynamic columns, and while updating the data we hit the same "row is too big" error. On our DBA lead's suggestion we made one change: a date column was being stored in timestamp format, which made it wide, so we cast that column to the date data type. Its length dropped to 10, enough space was freed, and the problem was solved. But this is not the cause of the error in every case, so first check your data and remove anything unnecessary. This was my first experience with this error.
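A rough sketch of that kind of fix, assuming a hypothetical table `events` with a timestamp column `created_at` (the names are illustrative; the thread does not give the real schema):

```sql
-- Casting a timestamp column down to date keeps only the day part.
-- In native storage a timestamp is 8 bytes and a date is 4, so the
-- saving per column is small but can matter in a very wide row.
ALTER TABLE events
    ALTER COLUMN created_at TYPE date
    USING created_at::date;

-- pg_column_size() reports the on-disk size of a value:
SELECT pg_column_size(created_at) FROM events LIMIT 1;
```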
Thanking you

On Wed, 11 Jul 2018, 19:46 Mario De Frutos Dieguez, <mariodefrutos@gmail.com> wrote:
Hello everyone!

I've found this error message and it's driving me crazy.

I have a table with 790 numeric columns. I'm trying to do an INSERT INTO x SELECT ..., and on the same column(s) I keep getting this message.

I've tried everything: VACUUM FULL, batching the INSERT with UPDATEs... but nothing works, it's always the same error.

The BLCKSZ is set to 8 kB here: https://github.com/postgres/postgres/blob/master/src/include/pg_config.h.in#L36 but I suspect this is either corrupted data or something I'm missing.
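For context: the 8160-byte ceiling in the error message comes from that 8 kB page size minus fixed per-page overhead, so it is a hard limit unless the server was compiled with a larger BLCKSZ. The compiled-in value can be checked at runtime:

```sql
-- block_size is a read-only setting fixed at compile time:
SELECT current_setting('block_size');  -- '8192' on a stock build
SHOW block_size;                       -- equivalent
```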

I've also read about TOAST-able storage and so on, and I haven't changed any storage properties. The data in those columns aren't big numbers (though they may have many decimal digits).
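One thing worth noting here: numeric is a variable-length (varlena) type, so its on-disk width depends on the number of digits, and TOAST can compress or move large values out of line, but a row of 790 short numerics still pays a per-value header inside the tuple. A quick way to measure actual widths (the table name `my_wide_table` is a placeholder, not from the thread):

```sql
-- Size of a single short numeric value, varlena header included:
SELECT pg_column_size(1.23::numeric);

-- Total on-disk width of one existing row:
SELECT pg_column_size(t.*) FROM my_wide_table t LIMIT 1;
```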

Any clues? help?

Thank you
