The following bug has been logged on the website:
Bug reference: 15730
Logged by: ShuLin Du
Email address: dsl0530@hotmail.com
PostgreSQL version: 9.6.6
Operating system: Red Hat Enterprise Linux 7.3
Description:
I am using pg_bulkload (3.1.14) to load file data into a table.
The input data needs to be processed by a filter function (defined in the
CTL file) before it is inserted into the table.
I found that when a filter function is used to process the input data
before it is written to the table, the maximum length limit on the table's
columns is not enforced.
The filter function is as follows (input data: 2 fields, output data: 3
fields):
create function FVUA001(varchar, varchar) returns record as $BODY$
    select row($1, $2, null)
$BODY$
language sql volatile
cost 100;
However, even when a field in the input data is longer than the length
declared in the table definition, the data is still inserted into the
table successfully. So I wonder whether this is a PostgreSQL bug.
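For comparison, a plain INSERT in PostgreSQL does enforce the varchar length
modifier through an assignment cast to the column type. The following is a
minimal sketch of the behaviour I would expect from the filter path as well
(the table `t` and the sample values are hypothetical, made up here for
illustration; only the function mirrors the one above):

```sql
-- Hypothetical table with a 2-character limit on each column.
create table t (a varchar(2), b varchar(2), c varchar(2));

-- Same shape as the filter function above: 2 inputs, 3 outputs.
create function fvua001(varchar, varchar) returns record as $BODY$
    select row($1, $2, null)
$BODY$
language sql volatile
cost 100;

-- A direct INSERT enforces the length limit and fails:
insert into t values ('toolong', 'x', null);
-- ERROR:  value too long for type character varying(2)

-- Inserting through the record-returning function should, I believe, hit
-- the same check, since assigning the record's fields to the columns
-- applies the varchar(2) length cast:
insert into t
    select * from fvua001('toolong', 'x')
        as (a varchar, b varchar, c varchar);
```

With pg_bulkload's filter path, by contrast, the over-length value is loaded
without any error, which is what prompted this report.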
Please help me solve this problem. Thanks a lot.