> On 27/02/2023 13:13 CET Laurenz Albe <laurenz.albe@cybertec.at> wrote:
>
> I'd be curious to know how the customer managed to do that.
> Perhaps there is a loophole in PostgreSQL that needs to be fixed.
Probably via some data access layer rather than directly through Postgres. It's
easy to reproduce with psycopg:
    import psycopg

    with psycopg.connect() as con:
        con.execute('create temp table jsontab (jsoncol json)')
        con.execute(
            'insert into jsontab (jsoncol) values (%s)',
            [psycopg.types.json.Json('\0')],
        )

        # json accepts the \u0000 escape and stores it verbatim
        with con.execute('select jsoncol from jsontab') as cur:
            print(cur.fetchall())

        # casting to jsonb rejects it
        try:
            with con.execute('select jsoncol::jsonb from jsontab') as cur:
                pass
            raise AssertionError("jsonb should fail")
        except psycopg.errors.UntranslatableCharacter:
            pass
Another reason to prefer jsonb over json: it rejects such inputs right away.
The documentation states that json stores an exact copy of the input text,
whereas jsonb performs the extra processing that rejects \u0000.
Of course the OP now has to deal with the json data already stored. It can be
sanitized by removing all null character escape sequences:
    update jsontab
    set jsoncol = replace(jsoncol::text, '\u0000', '')::json
    where strpos(jsoncol::text, '\u0000') > 0;
But the data access layer (or whatever got the json into the database) must be
fixed as well to reject or sanitize those inputs in the future.
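For the data access layer, one option is to sanitize decoded JSON values
client-side before they are serialized for the insert. A minimal sketch (the
function names are my own, not part of any API; it simply strips NUL characters
from every string in the structure):

    import json

    def sanitize_json_value(value):
        # Recursively drop NUL characters from all strings in a decoded JSON value.
        if isinstance(value, str):
            return value.replace('\0', '')
        if isinstance(value, list):
            return [sanitize_json_value(v) for v in value]
        if isinstance(value, dict):
            return {sanitize_json_value(k): sanitize_json_value(v)
                    for k, v in value.items()}
        return value

    def sanitize_json_text(text):
        # Remove \u0000 escapes from a JSON document before it reaches the database.
        return json.dumps(sanitize_json_value(json.loads(text)))

Whether dropping the NUL (as above and in the update statement) or rejecting
the input outright is appropriate depends on the application; rejecting is the
safer default when the escape indicates corrupted upstream data.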
--
Erik