Thread: Bug report - incorrect value displayed in jsonb column for large numbers
Hello,
I hope this is the correct place for a bug report.
I have found a bug in pgAdmin: if a jsonb document contains a field with a large numeric value (perhaps one exceeding JavaScript's Number.MAX_SAFE_INTEGER?), the value displayed in the column is incorrectly rounded. The whole issue is described at https://stackoverflow.com/questions/55491006/incorrect-insert-of-bigint-into-jsonb-column-when-using-pgadmin-4.
- PGAdmin4 version 3.6
- Ubuntu 18.04.2 LTS
- desktop mode
- repro steps:
0) open query tool in some db
1) create table:
CREATE TABLE document_wrapper
(
  id integer NOT NULL,
  document jsonb NOT NULL,
  CONSTRAINT document_pkey PRIMARY KEY (id)
)
WITH (OIDS = FALSE)
TABLESPACE pg_default;
2) insert value
insert into document_wrapper(id, document) values(-8, '{"id":101861191707868275}');
3) select value
SELECT document FROM document_wrapper where id = -8;
4) Observe incorrect result (note 0 at the end):
{ "id": 101861191707868270 }
If more info is needed please contact me
Michał Iwańczuk
Michał Iwańczuk <miwanczuk7@gmail.com> writes:
> 1) create table:
> CREATE TABLE document_wrapper(
>   id integer NOT NULL,
>   document jsonb NOT NULL,
>   CONSTRAINT document_pkey PRIMARY KEY (id))
> WITH (OIDS = FALSE)
> TABLESPACE pg_default;
> 2) insert value
> insert into document_wrapper(id, document) values(-8,
> '{"id":101861191707868275}');
> 3) select value
> SELECT document FROM document_wrapper where id = -8;
> 4) Observe incorrect result (note 0 at the end):
> { "id": 101861191707868270 }

Works fine for me in psql, so whatever the problem is here, it's not in the Postgres server. You should try asking in the pgadmin lists, instead. (I'm rather surprised to hear that pgadmin does anything special with jsonb values, but apparently it does ...)

			regards, tom lane
Re: Bug report - incorrect value displayed in jsonb column for large numbers
From: Jerry Sievert
Hi,
You might note that JSON serialization in most language bindings adheres to IEEE 754-2008, which defines a number as a 64-bit double-precision binary value. When you attempt to use a number that large in a language that follows that standard (as defined in ECMA-262, for instance), it will be rounded to what you are seeing, because it exceeds JavaScript's Number.MAX_SAFE_INTEGER. You can see that result in JavaScript:
let a = 101861191707868275;
console.log(a);
101861191707868270
So, any language or binding that attempts to deserialize this value will round to 101861191707868270.
JSONB, on the other hand, stores numbers internally as a NUMERIC type and is therefore not subject to these limitations; it outputs a number that, while not conforming to that standard, is the actual representation of the number.
There is a BigInt type, now available in JavaScript, that can correctly store integers as large as you are expecting and is portable across multiple other languages and bindings, but it does not currently have a safe serialization to, or deserialization from, JSON.
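A quick illustration of both halves of that point (my own sketch, not from the thread): a BigInt holds the value exactly, but JSON.stringify refuses to serialize it:

```javascript
// BigInt literals (the trailing "n") are arbitrary-precision integers,
// so the value from the bug report survives unchanged.
const big = 101861191707868275n;
console.log(big.toString()); // prints 101861191707868275 -- no rounding

// But there is no JSON serialization for BigInt: per ECMA-262,
// JSON.stringify throws a TypeError when it encounters one.
try {
  JSON.stringify({ id: big });
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```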
You can follow that discussion at https://github.com/tc39/proposal-bigint/issues/24 which includes a lot of historical context as well as links to other discussions that are taking place on how to best deal with larger numbers in other languages.
Until then, you will likely need to deal with the possibility of pg serializing values that cannot be deserialized correctly. You can usually work around this behavior by instead using the pg operators #> or #>> and casting the result as NUMERIC.
It is up to the pg community to decide whether this serialization/deserialization of numbers outside of those limitations is a bug or not, as there will likely be consequences to any change.
I will note that the JSON/JSONB documentation for pg states that it follows RFC 7159; the number-specific section is at: https://tools.ietf.org/html/rfc7159#section-6
This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754-2008 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available. Note that when such software is used, numbers that are integers and are in the range [-(2**53)+1, (2**53)-1] are interoperable in the sense that implementations will agree exactly on their numeric values.
Here you can see that the RFC has an opinion, but does not set specific limits; it just notes that there may be interoperability problems for large values.
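The interoperable range the RFC describes, [-(2**53)+1, (2**53)-1], is exactly JavaScript's safe-integer range, which is easy to check directly (my own illustration, not from the thread):

```javascript
// The RFC's interoperable bound coincides with Number.MAX_SAFE_INTEGER.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1);  // true

// 2^53 - 1 is representable exactly; the value from the bug report is not.
console.log(Number.isSafeInteger(9007199254740991));   // true
console.log(Number.isSafeInteger(101861191707868275)); // false
```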
Hope that helps.