Re: Bad Data back Door - Mailing list pgsql-hackers

From Tom Lane
Subject Re: Bad Data back Door
Date
Msg-id 20303.1349485921@sss.pgh.pa.us
In response to Bad Data back Door  ("David E. Wheeler" <david@justatheory.com>)
Responses Re: Bad Data back Door  (John R Pierce <pierce@hogranch.com>)
Re: Bad Data back Door  ("David E. Wheeler" <david@justatheory.com>)
Re: Bad Data back Door  ("David E. Wheeler" <david@justatheory.com>)
List pgsql-hackers
"David E. Wheeler" <david@justatheory.com> writes:
> I've discovered something a bit disturbing at $work. We're migrating
> (slowly) from Oracle to PostgreSQL, and in some cases are using
> oracle_fdw to copy data over. Alas, there are a fair number of text
> values in the Oracle database that, although the database is UTF-8,
> are actually something else (CP1252 or Latin1). When we copy from an
> oracle_fdw foreign table into a PostgreSQL table, PostgreSQL does not
> complain, but ends up storing the mis-encoded strings, even though
> the database is UTF-8.

> I assume that this is because the foreign table, as a table, is
> assumed by the system to have valid data, and therefore additional
> character encoding validation is skipped, yes?

Probably not so much "assumed" as "nobody thought about it".  In
e.g. plperl we expend the cycles to do encoding validity checking on
*every* string entering the system from Perl.  I'm not sure why foreign
tables ought to get a pass on that, especially when you consider the
communication overhead that the encoding check would be amortized
against.

Now, having said that, I think it has to be the responsibility of the FDW
to apply any required check ... which makes this a bug report against
oracle_fdw, not the core system.  (FWIW, contrib/file_fdw depends on the
COPY code, which will check encoding.)
        regards, tom lane
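
For illustration, a sketch of the kind of per-string check being discussed,
using the core pg_verifymbstr() routine (the same validation the COPY path
relies on). The helper function and the suggestion to call it from the FDW's
IterateForeignScan path are illustrative assumptions, not oracle_fdw code:

    #include "postgres.h"
    #include "mb/pg_wchar.h"

    /*
     * Hypothetical helper an FDW could call on each text value fetched
     * from the remote side, before building the returned tuple (e.g. in
     * its IterateForeignScan callback).  With noError = false,
     * pg_verifymbstr() raises an error if the bytes are not valid in the
     * current server encoding, instead of letting mis-encoded data
     * through silently.
     */
    static void
    check_remote_string(const char *str, int len)
    {
        pg_verifymbstr(str, len, false);
    }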


