Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq - Mailing list pgsql-admin

From: Cory Nemelka
Subject: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq
Msg-id: CAMe5Gn0C5r75WLfh1cmUXYr23bSZ+kaddo6Z8HwY0zsUsz1+UA@mail.gmail.com
In response to: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq (Cory Nemelka <cnemelka@gmail.com>)
Responses: Re: [ADMIN] Processing very large TEXT columns (300MB+) using C/libpq (Geoff Winkless <pgsqladmin@geoff.dj>)
List: pgsql-admin
All I am doing is iterating through the characters, so I know it isn't my code.
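
For reference, here is roughly the shape of what I'm doing (the connection string, table, and column names are placeholders, not my actual code):

#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    /* placeholder connection string */
    PGconn *conn = PQconnectdb("dbname=mydb");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    /* placeholder query: one row, one very large text column */
    PGresult *res = PQexec(conn, "SELECT big_text FROM big_table WHERE id = 1");
    if (PQresultStatus(res) != PGRES_TUPLES_OK || PQntuples(res) != 1) {
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return 1;
    }

    const char *val = PQgetvalue(res, 0, 0);
    int len = PQgetlength(res, 0, 0);   /* length cached once; calling
                                           strlen(val) in the loop condition
                                           instead would be quadratic */
    long newlines = 0;
    for (int i = 0; i < len; i++) {
        if (val[i] == '\n')             /* stand-in for the per-character work */
            newlines++;
    }
    printf("%ld newlines in %d bytes\n", newlines, len);

    PQclear(res);
    PQfinish(conn);
    return 0;
}

The length is fetched once with PQgetlength rather than recomputed inside the loop, which is the only place I could imagine the iteration itself going quadratic.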

--cnemelka

On Fri, Oct 20, 2017 at 9:14 AM, Cory Nemelka <cnemelka@gmail.com> wrote:
Yes, but I should be able to read them much faster. The psql client can display an 11MB column in a little over a minute, while in C, using the libpq library, it takes over an hour.
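
One experiment I've been considering (a rough sketch only; the table, column, and id are made up) is pulling the value down in fixed-size chunks with substring(), to see whether the time goes into the transfer itself or into my processing of it:

#include <stdio.h>
#include <libpq-fe.h>

/* Fetch one large text column in fixed-size chunks via substring().
   Table, column, and id are placeholders. */
int main(void)
{
    PGconn *conn = PQconnectdb("dbname=mydb");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    const char *sql =
        "SELECT substring(big_text FROM $1::int FOR $2::int) "
        "FROM big_table WHERE id = 1";
    const int chunk = 1024 * 1024;      /* 1MB per round trip */
    long offset = 1;                    /* substring() is 1-based */
    long total = 0;

    for (;;) {
        char off_s[32], len_s[32];
        snprintf(off_s, sizeof off_s, "%ld", offset);
        snprintf(len_s, sizeof len_s, "%d", chunk);
        const char *params[2] = { off_s, len_s };

        PGresult *res = PQexecParams(conn, sql, 2, NULL, params,
                                     NULL, NULL, 0);
        if (PQresultStatus(res) != PGRES_TUPLES_OK || PQntuples(res) != 1) {
            fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
            PQclear(res);
            break;
        }

        int got = PQgetlength(res, 0, 0);
        /* ... process PQgetvalue(res, 0, 0) here ... */
        total += got;
        PQclear(res);

        if (got < chunk)                /* short (or empty) chunk: done */
            break;
        offset += chunk;
    }

    printf("read %ld bytes\n", total);
    PQfinish(conn);
    return 0;
}

Timing each round trip separately should make it clearer whether the cost is server-side (detoasting/transfer) or in the client loop.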

Does anyone have experience with the same issue who can help me resolve it?

--cnemelka

On Thu, Oct 19, 2017 at 5:20 PM, Aldo Sarmiento <aldo@bigpurpledot.com> wrote:
I believe large columns get put into a TOAST table. The max page size is 8kB, so with a value that size you'll have lots of TOAST pages per row that have to be fetched and reassembled: https://www.postgresql.org/docs/9.5/static/storage-toast.html
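
If you want to see how much of that value actually lives in the TOAST table, something along these lines should show it (the table name is just an example):

#include <stdio.h>
#include <libpq-fe.h>

/* Compare heap size vs. TOAST size for one table ("big_table" is
   only an example name). */
int main(void)
{
    PGconn *conn = PQconnectdb("dbname=mydb");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    PGresult *res = PQexec(conn,
        "SELECT pg_size_pretty(pg_relation_size('big_table')), "
        "       pg_size_pretty(pg_relation_size(reltoastrelid::regclass)) "
        "FROM pg_class WHERE relname = 'big_table'");
    if (PQresultStatus(res) == PGRES_TUPLES_OK && PQntuples(res) == 1)
        printf("heap: %s, toast: %s\n",
               PQgetvalue(res, 0, 0), PQgetvalue(res, 0, 1));
    else
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));

    PQclear(res);
    PQfinish(conn);
    return 0;
}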


On Thu, Oct 19, 2017 at 2:03 PM, Cory Nemelka <cnemelka@gmail.com> wrote:
I am getting very poor performance using libpq to process very large TEXT columns (300MB+). I suspect it is I/O related but can't be sure.

Has anyone had experience with the same issue who can help me resolve it?

--cnemelka


