I found this article:
It seems I should modify this:
uint8 t_hoff;
and replace it with something like uint32 t_hoff; or uint64 t_hoff;
And perhaps I should modify this too?
The fix is easy enough; adding
v_hoff = LLVMBuildZExt(b, v_hoff, LLVMInt32Type(), "");
fixes the issue for me.
If that is the case, I am not sure what kind of modification we should make.
I feel I should explain why we create these huge tables: basically, we want to process big matrices for machine learning.
Using tables with classic columns lets us write very clear code. If we had to start using arrays as columns, things would become complicated and unintuitive (besides, some columns already store vectors as arrays...).
We could use JSONB (we do, but for JSON documents). The problem is that storing large numbers of JSONB columns creates performance issues compared with normal tables.
Since almost everybody is applying ML to their products, perhaps other companies would also be interested in a version of Postgres that can deal with tables with thousands of columns?
I did not find any ready-to-use Postgres package like that, though.
Pablo