Data loss when '"json_populate_recorset" with long column name - Mailing list pgsql-hackers

From Денис Романенко
Subject Data loss when '"json_populate_recorset" with long column name
Date
Msg-id CALSd-cppwDQ5+AmvrZ7a+XKQBCE9amS1uRK3X60=q1iL7x0SaQ@mail.gmail.com
Responses Re: Data loss when using "json_populate_recordset" with long column name
List pgsql-hackers
If we create a column with a name longer than 63 bytes, PostgreSQL truncates it to the maximum identifier length (NAMEDATALEN - 1, 63 bytes by default).

For example: "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName" will be truncated in database to "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVer"

But in application code we can still work with the full column name: SQL statements such as INSERT and UPDATE accept the long name without a problem, since the identifier is truncated to the same 63 bytes and automatically matches the stored column (thank you for that).
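
For illustration, a minimal sketch (the table name "demo" is just an example, and the NOTICE wording is paraphrased): the server warns that the identifier will be truncated, and the statement still targets the right column:

create table demo("VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName" text);
-- NOTICE: identifier will be truncated
insert into demo ("VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName") values ('haha');
-- NOTICE again, but the row goes into the truncated column
select "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVer" from demo;  -- returns 'haha'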

But if we call "json_populate_recordset" with the full (untruncated) name as the JSON key, it does not simply ignore that key: the corresponding column in the record is nulled.

How to reproduce:
1. create table wow("VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName" text);
2. select * from json_populate_recordset(null::wow,'[{"VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongName": "haha"}]');
3. "VeryLongNameVeryLongNameVeryLongNameVeryLongNameVeryLongNameVer" becomes null.


P.S. Why do I need column names longer than 63 bytes: I use non-Latin characters in column and table names, and since each character takes more than one byte in UTF-8, in practice I get only about 32 characters. (See the earlier thread "PostgreSQL: NAMEDATALEN increase because of non-latin languages".)


