List,
Can anyone suggest where the below error comes from, given I'm attempting to load HTTP access log data with reasonably
small row and column value lengths?
logs=# COPY raw FROM '/path/to/big/log/file' DELIMITER E'\t' CSV;
ERROR:  out of memory
DETAIL:  Cannot enlarge string buffer containing 1073712650 bytes by 65536 more bytes.
CONTEXT:  COPY raw, line 613338983
It was suggested in #postgresql that I'm hitting the 1GB MaxAllocSize limit, but I would have thought that would only
be a constraint on large values in individual columns, or on whole rows. It's worth noting that this happens after 613
million rows have already been loaded (somewhere around 100GB of data), and that I'm running this COPY after the
"CREATE TABLE raw ..." in a single transaction.
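For reference, the whole load is roughly the following (the actual column list is elided):

BEGIN;
CREATE TABLE raw (...);    -- column definitions omitted here
COPY raw FROM '/path/to/big/log/file' DELIMITER E'\t' CSV;
COMMIT;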
I've looked at line 613338983 in the file being loaded (+/- 10 rows) and can't see anything out of the ordinary.
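In case it helps, this is roughly how I pulled out those surrounding rows (line 613338983 ± 10) and checked their lengths:

sed -n '613338973,613338993p' /path/to/big/log/file \
    | awk '{ print NR, length($0) }'    # offset within the 21-row slice, and that row's length

Nothing in there looks unusually long.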
Disclaimer: I know nothing of PostgreSQL's internals, so please be gentle!
Regards,
Tom