Thanks for the answer. I agree that if you want to work with large values, you should make sure you have enough memory: if you call PQexec with a plain SELECT, fetch the value with PQgetvalue, and then run out of memory, that is your own fault.
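For concreteness, the pattern I mean is roughly this (table, column, and key are invented for illustration; connection setup and error handling omitted):

    #include <stdio.h>
    #include <libpq-fe.h>

    /* Naive fetch: libpq materializes the entire result, including the
       whole multi-hundred-MB value (roughly doubled in size by the hex
       escaping of bytea), in client memory before PQgetvalue can see
       any of it. */
    static void fetch_all_at_once(PGconn *conn)
    {
        PGresult *res = PQexec(conn,
            "SELECT data FROM my_schema.my_table WHERE id = 1");
        if (PQresultStatus(res) == PGRES_TUPLES_OK && PQntuples(res) == 1)
            fwrite(PQgetvalue(res, 0, 0), 1, PQgetlength(res, 0, 0), stdout);
        PQclear(res);
    }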
But since I know there may not be enough memory, I use COPY in the query together with the PQgetCopyData function. I thought that function was designed for reading data in portions, by analogy with PQputCopyData, which works fine. A sketch of what I tried is below.
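Roughly what I tried, again with made-up table and column names:

    #include <stdio.h>
    #include <libpq-fe.h>

    /* Stream the row out with COPY.  PQgetCopyData hands back one whole
       data row per call, allocated with malloc, so a single-row,
       single-column 300+ MB bytea still arrives as one huge buffer. */
    static void fetch_via_copy(PGconn *conn, FILE *out)
    {
        PGresult *res = PQexec(conn,
            "COPY (SELECT data FROM my_schema.my_table WHERE id = 1) "
            "TO STDOUT");
        if (PQresultStatus(res) == PGRES_COPY_OUT)
        {
            char *buf;
            int   len;
            while ((len = PQgetCopyData(conn, &buf, 0)) > 0)
            {
                fwrite(buf, 1, len, out);
                PQfreemem(buf);
            }
            /* len == -1: COPY finished; len == -2: error
               (see PQerrorMessage). */
        }
        PQclear(res);
        while ((res = PQgetResult(conn)) != NULL)
            PQclear(res);
    }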
I found a workaround: I run "SELECT substring(%s from %d for %d) as chunk from %s.%s WHERE %s = %s" in a loop and decode each chunk with PQunescapeBytea (sketched below). But it looks odd and takes longer.
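The loop looks roughly like this (chunk size and names are again placeholders; error handling trimmed):

    #include <stdio.h>
    #include <libpq-fe.h>

    /* Workaround: fetch the bytea in fixed-size slices.  substring() on
       bytea is 1-based, so the offset starts at 1. */
    static void fetch_in_chunks(PGconn *conn, FILE *out)
    {
        const int chunk = 1024 * 1024;          /* 1 MB per round trip */
        for (long offset = 1; ; offset += chunk)
        {
            char sql[256];
            snprintf(sql, sizeof sql,
                     "SELECT substring(data from %ld for %d) AS chunk "
                     "FROM my_schema.my_table WHERE id = 1",
                     offset, chunk);
            PGresult *res = PQexec(conn, sql);
            if (PQresultStatus(res) != PGRES_TUPLES_OK)
            {
                PQclear(res);
                break;
            }
            size_t rawlen;
            unsigned char *raw = PQunescapeBytea(
                (const unsigned char *) PQgetvalue(res, 0, 0), &rawlen);
            PQclear(res);
            if (raw == NULL || rawlen == 0)     /* past the end of the value */
            {
                PQfreemem(raw);
                break;
            }
            fwrite(raw, 1, rawlen, out);
            PQfreemem(raw);
        }
    }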
P.S. A correction: my file is 300+ MB, not 700+ MB.
PG Bug reporting form <noreply@postgresql.org> writes:
> Hello, I need your help. The essence of the problem is that I am trying to
> download a bytea field from a table row. The size of the data in the field
> is about 700 MB. In response, I receive an out-of-memory error.
I don't see this as a Postgres bug. If you want to work with values
that large, you'd better have plenty of memory available.
regards, tom lane