At 07:13 AM 19-02-2000 -0800, Nicholas Negulescu wrote:
>This message was sent from Geocrawler.com by "Nicholas Negulescu"
><negulen@eou.edu>
>Be sure to reply to that address.
>
>Hi, I'm looking for a solution to the 8k limit on
>text in Postgres. I want the text to be
>searchable, so a large object is out. Does
>anybody have a method to split text larger than
>8k and insert it into a multiple-row entry...
It's probably better to do it in a different way. Searching large amounts
of text sequentially can be very time-consuming.
Maybe you could create a dictionary table of words (and possibly short
phrases), then a link table between the dictionary table and the actual
BLOB text table. You could even put "points" in the link table and sort
by points. That way you would at least get an indexed search. I haven't
tried this out, so I welcome any comments or suggestions.
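To make the idea concrete, here is a rough sketch of the schema I have in
mind - all the table and column names are my own invention, and I haven't
actually run this:

```sql
-- Dictionary of words (and maybe short phrases)
CREATE TABLE words (
    word_id int PRIMARY KEY,
    word    text
);

-- The actual documents; the big text lives elsewhere (BLOB or file)
CREATE TABLE documents (
    doc_id   int PRIMARY KEY,
    body_ref text   -- large object id or file name
);

-- Link table with "points" for ranking
CREATE TABLE doc_words (
    doc_id  int,
    word_id int,
    points  int,
    PRIMARY KEY (doc_id, word_id)
);

CREATE INDEX doc_words_word_idx ON doc_words (word_id);

-- Indexed, ranked lookup of documents containing a word:
SELECT d.doc_id
FROM words w, doc_words dw, documents d
WHERE w.word = 'postgres'
  AND dw.word_id = w.word_id
  AND d.doc_id = dw.doc_id
ORDER BY dw.points DESC;
```

The index on doc_words(word_id) is what turns the search into an index
scan instead of a sequential scan over the big text.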
So far I haven't gathered up the courage to use Postgres BLOBs myself :). I
resorted to storing big stuff in the file system for various reasons -
BLOBs don't look easy to use yet, and there seem to be "gotchas" here and
there. I'm also not sure whether the 2GB file-size limit on Linux applies
to a whole table as well - so far it has not been easy to find
documentation on the limits of Postgres. Transaction control is not as
neat with files, but I guess I could run an application-level "VACUUM" to
clean out leftover files.
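For what it's worth, the application-level "VACUUM" I have in mind is just
a periodic sweep that deletes files on disk that no database row
references any more. A minimal sketch (the function names and layout are
hypothetical, and it assumes nothing is writing new files while it runs):

```python
import os

def orphaned_files(storage_dir, referenced_names):
    """Return paths in storage_dir whose names no database row references."""
    referenced = set(referenced_names)
    return [os.path.join(storage_dir, name)
            for name in sorted(os.listdir(storage_dir))
            if name not in referenced]

def app_level_vacuum(storage_dir, referenced_names):
    """Delete every orphaned file; run only while no writer is active."""
    for path in orphaned_files(storage_dir, referenced_names):
        os.remove(path)
```

In practice referenced_names would come from a SELECT over the table that
stores the file names, and you'd want to skip files newer than some grace
period so you don't delete a file whose INSERT hasn't committed yet.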
Cheerio,
Link.