I learned that the field size for a bytea column is limited to 1 GB in PostgreSQL. How can we increase this limit? We need to store a high volume of data in each row of a table.
I am just running a select query to fetch the result.
Query: select id, content_data, name from table_name
Here content_data is a bytea column holding more than 700 MB. Even if I run this query in a DB client such as pgAdmin or DBeaver, I hit the same error. The same query is also called from Java.
So I don't think Java is the issue, since I can insert the data successfully. The problem is only with fetching the data, and only for the specific rows that hold a huge volume of data.
On 8/14/23 09:29, Sai Teja wrote:
> Could anyone please suggest any ideas to resolve this issue.
>
> I have increased the below parameters but I'm still getting the same error:
> work_mem, shared_buffers
>
> Out of 70k rows in the table, only the few rows of large size (700 MB) hit the issue. I am unable to fetch the data for those particular rows.
>
> Would be appreciated if anyone could share insights.
>
> Thanks,
> Sai

Are you using Java? There's an upper limit on array size, hence also on String length. You'll likely need to process the output in chunks.
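As a minimal JDBC sketch of that chunked approach, using the table and columns from the query above (table_name, id, content_data); the connection URL, credentials, target id, and the 32 MB chunk size are placeholders you would need to adapt. It reads the value with substring() on the bytea, so the driver only ever materializes one chunk at a time instead of the whole 700 MB value:

import java.io.FileOutputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChunkedByteaFetch {

    // Read the bytea in 32 MB pieces instead of one huge byte[].
    private static final int CHUNK_SIZE = 32 * 1024 * 1024;

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/mydb";   // placeholder
        long id = 42L;                                           // the problem row

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             OutputStream out = new FileOutputStream("content_data.bin")) {

            // Total size of the bytea value, so we know how many chunks to read.
            int total;
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT octet_length(content_data) FROM table_name WHERE id = ?")) {
                ps.setLong(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    total = rs.getInt(1);
                }
            }

            // substring() on bytea is 1-based; fetch CHUNK_SIZE bytes per round trip.
            String sql = "SELECT substring(content_data FROM ? FOR ?) "
                       + "FROM table_name WHERE id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int offset = 1; offset <= total; offset += CHUNK_SIZE) {
                    ps.setInt(1, offset);
                    ps.setInt(2, CHUNK_SIZE);
                    ps.setLong(3, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next()) {
                            out.write(rs.getBytes(1));   // at most CHUNK_SIZE bytes per chunk
                        }
                    }
                }
            }
        }
    }
}

With this pattern each ResultSet holds only one chunk, so the client never has to build the full value as a single byte array or String.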