On Aug 12, 2008, at 1:15 PM, Jeff Gentry wrote:
> Hi there ...
>
> I recently discovered that there is a hard cap on the number of
> columns, at 1600. I also understand that it is generally unfathomable
> that anyone would ever feel limited by that number ... however I've
> managed to bump into it myself and was looking to see if anyone had
> advice on how to manage the situation.
>
> As a bit of background, we have a Postgres database to manage
> information revolving around genomic datasets, including the datasets
> themselves. The actual data is treated in other applications as a
> matrix, and while it has made the DB design sub-optimal, the model
> worked to just stash the entire matrix in the DB (the rest of the
> design is proper, but storing these matrices straight up is
> unorthodox ... for the convenience of having everything in the same
> storage unit as all of the other information, it has been worth the
> extra headache and potential performance dings).
What operations do you perform on the data? If it's just store and
retrieve, could you serialize it into a bytea (or xml) field?
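To illustrate the idea, here is a minimal Python sketch of serializing a
matrix into a single value suitable for a bytea column. The `datasets`
table, its `matrix` column, and the psycopg2 calls in the comments are
assumptions for the example, not part of your schema:

```python
import pickle

# Hypothetical matrix (list of lists); the real data would be a genomic
# matrix with far more columns than Postgres's 1600-column limit.
matrix = [[0.1, 0.2, 0.3],
          [0.4, 0.5, 0.6]]

# Serialize the whole matrix into one bytes value, so it fits in a
# single bytea column regardless of how many "columns" the matrix has.
blob = pickle.dumps(matrix, protocol=pickle.HIGHEST_PROTOCOL)

# With a driver such as psycopg2, the insert would look roughly like:
#   cur.execute("INSERT INTO datasets (id, matrix) VALUES (%s, %s)",
#               (dataset_id, psycopg2.Binary(blob)))
# and retrieval is the inverse:
#   cur.execute("SELECT matrix FROM datasets WHERE id = %s", (dataset_id,))
#   matrix = pickle.loads(bytes(cur.fetchone()[0]))

# Deserializing recovers the original structure exactly.
restored = pickle.loads(blob)
assert restored == matrix
```

The trade-off is that the matrix becomes opaque to SQL: you can no
longer query individual cells server-side, only store and retrieve the
whole thing.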
Cheers,
Steve