Re: How To: A large [2D] matrix, 100,000+ rows/columns - Mailing list pgsql-general

From: Wim Bertels
Subject: Re: How To: A large [2D] matrix, 100,000+ rows/columns
Date:
Msg-id: 4bb77358282da0509415de05079f118ef73f7033.camel@ucll.be
In response to: Re: How To: A large [2D] matrix, 100,000+ rows/columns (Joe Conway <mail@joeconway.com>)
Responses: Re: How To: A large [2D] matrix, 100,000+ rows/columns (Erik Wienhold <ewie@ewie.name>)
List: pgsql-general
Joe Conway wrote on Fri 09-06-2023 at 09:16 [-0400]:
> On 6/8/23 22:17, Pat Trainor wrote:
> > I need to have a very large matrix to maintain & query, and if not 
> > (1,600 column limit), then how could such data be broken down to
> > work?
> 
>   100,000 rows *
>   100,000 columns *
>   8 bytes (assuming float8)
> = about 80 GB per matrix if I got the math correct.

Based on my personal experience, I would not use Postgres in a case
where you need this many columns. You can work around it with JSON,
for example, but that will likely end up being less easy to work with.
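
A minimal sketch of that JSON workaround, just to make it concrete
(the table and column names here are mine, not from this thread):
store each matrix row as a single jsonb array, so the 1,600-column
limit never comes into play.

  -- one tuple per matrix row; the 100,000 cell values live in a
  -- single jsonb array instead of 100,000 real columns
  CREATE TABLE matrix_json (
      row_nr integer PRIMARY KEY,
      vals   jsonb   NOT NULL    -- e.g. '[0.1, 2.5, ...]'
  );

  -- read one cell: element 42 (0-based) of matrix row 7;
  -- ->> returns text, so cast back to float8
  SELECT (vals ->> 42)::float8 AS cell
  FROM matrix_json
  WHERE row_nr = 7;

Note that reading a single cell this way still detoasts the whole
100,000-element array, which is part of why it ends up being less
easy to work with.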

As Joe replied, R or Python are probably a better fit, or another
database that can easily handle a lot of columns; Postgres is a great
database, but not when you need a lot of columns.
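
For completeness, the usual relational answer to "how could such data
be broken down" is a long/coordinate layout with one tuple per cell
(again only a sketch with hypothetical names, not something proposed
in this thread):

  -- one tuple per matrix cell instead of one column per matrix column
  CREATE TABLE matrix_cell (
      row_nr integer NOT NULL,
      col_nr integer NOT NULL,
      val    float8  NOT NULL,
      PRIMARY KEY (row_nr, col_nr)
  );

  -- fetch a single cell
  SELECT val FROM matrix_cell WHERE row_nr = 7 AND col_nr = 42;

The catch is volume: a dense 100,000 x 100,000 matrix means 10^10
tuples, with per-tuple overhead on top of the ~80 GB of raw data, so
this layout really only pays off when the matrix is sparse.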

(As you noted: there might be another storage backend for Postgres
that handles this better (or there may be one in the future), but I
don't think one exists. There is also the page size, for which 8 kB
is provisioned by default, and a tuple has to fit into one page, so
that is the first bottleneck (you can change this value if you
compile Postgres yourself, via configure's --with-blocksize option):
https://www.postgresql.org/docs/current/limits.html )

Wim


