On 8/17/23 07:35, Sai Teja wrote:
> Hi Team,
>
> I also tried PostgreSQL Large Objects, following this link, to store
> and retrieve large files (since bytea was not working):
> https://www.postgresql.org/docs/current/largeobjects.html
>
> But even now I am unable to fetch the data all at once from a large
> object:
>
> select lo_get(oid);
>
> Here I get the same error message.
>
> But if I use: select data from pg_largeobject where loid = 49374
> then I can fetch the data, but only page by page (the data is split
> into rows of 2KB each).
>
> So how can I fetch the data in a single step, rather than page by
> page, without any error?
>
> I'm also wondering how applications store huge amounts of data, in
> the GB range. I know PostgreSQL sets a 1GB limit on each field. Given
> that limit, how are such situations handled? I'd like to understand
> this for real-world scenarios.
>
> We need to store and retrieve large content (a huge volume of data).
> Currently this is not possible for us because of PostgreSQL's
> field-size limit.
>
> I would appreciate your insights and suggestions to help me resolve
> this issue.
>
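
On the lo_get() failure: lo_get(oid) materializes the whole large
object as a single bytea, so anything close to the 1GB bytea ceiling
fails the same way your bytea column did. Large objects themselves can
grow to 4TB precisely because they are meant to be read and written in
chunks rather than in one call. Below is a minimal sketch of a chunked
read using pgjdbc's LargeObject API; the connection URL, credentials,
and output file are placeholders, and OID 49374 is taken from your
example.

    // Stream a large object to a local file in fixed-size chunks
    // instead of pulling the whole value into one bytea with lo_get().
    import java.io.FileOutputStream;
    import java.io.OutputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObject;
    import org.postgresql.largeobject.LargeObjectManager;

    public class LoRead {
        public static void main(String[] args) throws Exception {
            // Placeholder URL/credentials; OID 49374 is from your example.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "secret")) {
                conn.setAutoCommit(false); // LO calls must run in a transaction
                LargeObjectManager lom =
                        conn.unwrap(PGConnection.class).getLargeObjectAPI();
                LargeObject lo = lom.open(49374, LargeObjectManager.READ);
                try (OutputStream out = new FileOutputStream("payload.bin")) {
                    byte[] buf = new byte[64 * 1024]; // 64KB per round trip
                    int n;
                    while ((n = lo.read(buf, 0, buf.length)) > 0) {
                        out.write(buf, 0, n);
                    }
                } finally {
                    lo.close();
                }
                conn.commit();
            }
        }
    }

If you need to stay in plain SQL, lo_get(loid, offset, length) returns
a slice of the object, so you can loop over offsets yourself instead of
selecting pg_largeobject pages directly.
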
My first attempt at handling large payloads was to use a Java Selector
directly in my app. That worked, but manually chunking the data was
tricky. I switched to Tomcat, which handles large HTTP(S) payloads
seamlessly.
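
For serving that data over HTTP, the same chunked read can be wired
straight into the response stream. This is a hypothetical sketch for
Tomcat 10+ (jakarta.* namespace; older Tomcats use javax.* imports);
the servlet name, request parameter, and credentials are my
placeholders, not from any existing app:

    import jakarta.servlet.http.HttpServlet;
    import jakarta.servlet.http.HttpServletRequest;
    import jakarta.servlet.http.HttpServletResponse;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import org.postgresql.PGConnection;
    import org.postgresql.largeobject.LargeObject;
    import org.postgresql.largeobject.LargeObjectManager;

    public class LoDownloadServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            long oid = Long.parseLong(req.getParameter("oid")); // e.g. 49374
            resp.setContentType("application/octet-stream");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "secret")) {
                conn.setAutoCommit(false); // LO calls need a transaction
                LargeObjectManager lom =
                        conn.unwrap(PGConnection.class).getLargeObjectAPI();
                LargeObject lo = lom.open(oid, LargeObjectManager.READ);
                OutputStream out = resp.getOutputStream();
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = lo.read(buf, 0, buf.length)) > 0) {
                    out.write(buf, 0, n); // Tomcat streams each chunk out
                }
                lo.close();
                conn.commit();
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }
    }

Because the loop writes 64KB at a time, neither the application nor the
client ever has to hold the whole payload in memory.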