Thread: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
"dinesh"
Date:
The following bug has been logged online:

Bug reference:      6052
Logged by:          dinesh
Email address:      dinesh@milkorwater.com
PostgreSQL version: 8.4.7
Operating system:   Windows 7 Pro 64-bit
Description:        ADD COLUMN - ERROR: tables can have at most 1600 columns
Details:

I have a table which is used during data uploads, a so-called staging table.
This table has a fixed number of columns that must match the input CSV
file. The CSV file is uploaded using the COPY command. After the COPY, a
new column (meant for indexing) is added to this table by some
application logic, and dropped once that data upload cycle is over.

After some 1500+ cycles, I get the following error:

ERROR: tables can have at most 1600 columns
SQL state: 54011
Context: SQL statement "ALTER TABLE stage_fo ADD COLUMN exch_ticker char
varying"

So it appears that the command
ALTER TABLE stage_fo DROP COLUMN exch_ticker
only has a soft effect and does not fully remove the column as far as the
database engine is concerned.

A similar problem was reported by another user, Ron St-Pierre, in June 2004
(http://bytes.com/topic/postgresql/answers/422107-error-tables-can-have-most
-1600-columns). The suggestion made there, to drop and rebuild the table, is
not a trivial choice for us.

I would appreciate any helpful pointers that get us past this
showstopper; a rewrite is going to be expensive.

Thanks & regards
Dinesh

Re: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
Peter Eisentraut
Date:
On Sat, 2011-06-04 at 07:09 +0000, dinesh wrote:
> I have a table which is used during data uploads, a so-called staging table.
> This table has a fixed number of columns that [must] match the input CSV
> file. This CSV file is uploaded using COPY command. Following the COPY, a
> new column (meant for indexing) is constructed on this table using some
> application logic; and dropped after that data upload cycle is over.
>
> After some 1500+ cycles, I get the following error:
>
> ERROR: tables can have at most 1600 columns
> SQL state: 54011
> Context: SQL statement "ALTER TABLE stage_fo ADD COLUMN exch_ticker char
> varying"
>
> So it appears that the command
> ALTER TABLE stage_fo DROP COLUMN exch_ticker
> is only producing some soft effects, not sufficient for the db engine.

Yeah, a DROP COLUMN only hides the column and gradually phases it out
of the storage; it doesn't remove it from the metadata storage, which
is where the 1600 limit applies.  So that application design is not
going to work.
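The leftover entries are visible in the system catalog. As a quick way to
observe this (table name taken from the report above):

```sql
-- Dropped columns remain in pg_attribute with attisdropped = true;
-- the 1600-column limit counts these entries too.
SELECT attnum, attname, attisdropped
FROM pg_attribute
WHERE attrelid = 'stage_fo'::regclass
  AND attnum > 0
ORDER BY attnum;
-- After many ADD/DROP cycles, placeholder rows named like
-- "........pg.dropped.5........" accumulate until ADD COLUMN
-- fails with SQLSTATE 54011.
```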

I find it a bit suspicious that you add a column "for indexing", when
you could perhaps be using an expression index.
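As a sketch of that idea (the columns and the derivation below are
hypothetical, since the report doesn't say how exch_ticker is computed):

```sql
-- Hypothetical: if exch_ticker is derived from existing columns,
-- an expression index avoids adding a physical column at all.
CREATE INDEX stage_fo_exch_ticker_idx
    ON stage_fo ((exchange || ':' || ticker));

-- Queries must use the same expression to be able to use the index:
-- SELECT * FROM stage_fo WHERE (exchange || ':' || ticker) = 'NSE:INFY';
```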

Re: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
Noah Misch
Date:
On Sat, Jun 04, 2011 at 07:09:27AM +0000, dinesh wrote:
> I have a table which is used during data uploads, a so-called staging table.
> This table has a fixed number of columns that [must] match the input CSV
> file. This CSV file is uploaded using COPY command. Following the COPY, a
> new column (meant for indexing) is constructed on this table using some
> application logic; and dropped after that data upload cycle is over.

When you provide a column list to the COPY command, the table's columns need not
exactly match the input CSV structure.  Could you leave that extra column there
all the time, omit it from your column list in the COPY command, and have a
trigger or default value expression populate it during the COPY?
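A sketch of that approach, with the CSV column names and the derivation of
exch_ticker assumed for illustration:

```sql
-- Keep the extra column permanently and omit it from the COPY column list.
ALTER TABLE stage_fo ADD COLUMN exch_ticker varchar;

-- Populate it with a BEFORE INSERT trigger during the COPY.
CREATE FUNCTION stage_fo_fill() RETURNS trigger AS $$
BEGIN
    -- hypothetical derivation from two existing columns
    NEW.exch_ticker := NEW.exchange || ':' || NEW.ticker;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER stage_fo_fill_trg
    BEFORE INSERT ON stage_fo
    FOR EACH ROW EXECUTE PROCEDURE stage_fo_fill();

-- COPY names only the CSV's columns; exch_ticker is filled by the trigger,
-- and no ADD/DROP COLUMN cycle is needed per upload.
COPY stage_fo (exchange, ticker, price, qty) FROM '/path/file.csv' CSV;
```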

Re: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
Juris Kaminskis
Date:
Hello,

I would like to raise this problem again. The use case: local automated testing of database migrations. If those migrations ALTER TABLEs by adding and dropping columns, at some point the only valid way out of the 1600-column error is to reload the database. If this is not a big problem to fix, it would be nice to solve.

Juris


Re: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
"David G. Johnston"
Date:
On Mon, Sep 18, 2023 at 8:03 AM Juris Kaminskis <juris.kaminskis@gmail.com> wrote:
I would like to refresh this problem again.

This isn't a bug so this isn't the place to discuss it.  If all you want to do is try and get a bunch of +1s for the idea, post your use case to -general and see who else agrees with you.  But I don't see it being very likely that this particular limit goes away.  Especially if the last reported request is from the 8.4 days of PostgreSQL and you aren't offering a patch.

A staging table that isn't built from scratch each time isn't a use case I'd be too interested in spending significant effort to handle.  And a test harness that doesn't have an initializer to build out clean first-run tables on demand falls into basically the same category.

David J.

Re: BUG #6052: ADD COLUMN - ERROR: tables can have at most 1600 columns

From
Andrew Dunstan
Date:


On 2023-09-18 Mo 10:16, Juris Kaminskis wrote:
> Hello,
>
> I would like to refresh this problem again. The use case: local automated testing of database migrations. If those migrations ALTER TABLES by adding and dropping columns, at some point the only valid way out of 1600 columns error is to reload database. If this is not a big problem would be nice to solve.
>
> Juris

> On Mon, 18 Sept 2023 at 17:11, dinesh (<dinesh@milkorwater.com>) wrote:

>> I have a table which is used during data uploads, a so-called staging table.
>> This table has a fixed number of columns that [must] match the input CSV
>> file. This CSV file is uploaded using COPY command. Following the COPY, a
>> new column (meant for indexing) is constructed on this table using some
>> application logic; and dropped after that data upload cycle is over.


This is simply untrue. The number of table columns does not have to match the CSV. The table can have more columns; all you have to do is name the columns you are inserting into explicitly:

COPY mytable (col1, col2, ...) FROM srcloc CSV;

That's always been the case.

If you need to be selective about which fields you insert from the CSV, there is also the file_textarray_fdw extension.


cheers


andrew


--
Andrew Dunstan
EDB: https://www.enterprisedb.com