Ilya Knyazev <knuazev@gmail.com> writes:
> But I know that there may not be enough memory, so I use the "copy" keyword
> in the query and the PQgetCopyData function. I thought that this function
> was designed for portioned work. By analogy with the PQputCopyData
> function, which works fine.
Its documentation is fairly clear, I thought:
	Attempts to obtain another row of data from the server during a
	<command>COPY</command>. Data is always returned one data row at
	a time; if only a partial row is available, it is not returned.
If you need to work with data values that are large enough to risk
memory problems, I think "large objects" are the best answer. Their
interface is a bit clunky, but it's at least designed to let you
both read and write by chunks.
			regards, tom lane