Eliot Gable-4 wrote
> I advocated creating a separate mapping table which
> maps the ID of these records to the other ID we are searching for and
> performing a JOIN on the two tables with appropriate foreign key
> relationships and indices. However, I was asked to instead put the list
> into a single column on each row to reduce implementation complexity.
>
> Assuming the list of IDs is in a column on each row as TEXT in the format
> of a JSON array, what is the best way to index the column so I can quickly
> find the rows with the given ID?
I recommend benchmarking two implementations:
1) id_xref integer[] -- on the same table, use "search_id = ANY(id_xref)"
as the WHERE condition
2) Your multi-table solution, but use "EXISTS (SELECT 1 FROM xref_master
WHERE search_id = id_xref)"
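A minimal sketch of both options (the table name "records" and the index
names are assumptions for illustration; only id_xref and xref_master come
from the thread). Note that "search_id = ANY(id_xref)" on an array column
is not itself indexable, but a GIN index supports the equivalent
containment test with @>:

```sql
-- Option 1: integer[] cross-reference column on the same table.
CREATE TABLE records (
    id      integer   PRIMARY KEY,
    id_xref integer[] NOT NULL DEFAULT '{}'
);
CREATE INDEX records_id_xref_gin ON records USING gin (id_xref);

-- Equivalent lookups; the containment form can use the GIN index.
SELECT * FROM records WHERE 42 = ANY(id_xref);
SELECT * FROM records WHERE id_xref @> ARRAY[42];

-- Option 2: separate mapping table with a plain btree index.
CREATE TABLE xref_master (
    record_id integer NOT NULL REFERENCES records (id),
    id_xref   integer NOT NULL,
    PRIMARY KEY (record_id, id_xref)
);
CREATE INDEX xref_master_id_xref_idx ON xref_master (id_xref);

SELECT r.*
FROM records r
WHERE EXISTS (SELECT 1 FROM xref_master x
              WHERE x.record_id = r.id AND x.id_xref = 42);
```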
And I'd politely respond that implementation complexity matters somewhat
less than performance in an embedded system - not that either of these
solutions is particularly complex, and both can readily be encapsulated in
functions to hide any such complexity from the application.
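As a sketch of that encapsulation (the function name, and a "records"
table with an integer[] column id_xref, are assumptions for illustration):

```sql
-- Hypothetical wrapper: the application calls find_records(42) and never
-- sees whether the lookup uses an array column or a join table. Swapping
-- implementations later only means redefining this one function.
CREATE FUNCTION find_records(search_id integer)
RETURNS SETOF records
LANGUAGE sql STABLE AS $$
    SELECT * FROM records WHERE id_xref @> ARRAY[search_id];
$$;
```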
I would not introduce the added indirection of storing the values as a
single JSON array. That is especially true if the IDs are integers, but it
holds even if you represent them as text.
The fact that you want to use LIKE/regex confuses me, but that may be
because you are limiting yourself to text. Most cross-reference searches
know the exact ID being looked for, so pattern matching is not required...
David J.
--
View this message in context:
http://postgresql.1045698.n5.nabble.com/JSON-vs-Text-Regexp-Index-Searching-tp5793472p5793492.html
Sent from the PostgreSQL - general mailing list archive at Nabble.com.