Hi folks!
I tried to insert 1,000,000 records into the following table:
--------------------------------------------------
zakal=# \d teste;
 Column |          Type          | Modifiers
--------+------------------------+-----------
 codigo | bigint                 | not null
 nome   | character varying(100) |
--------------------------------------------------
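For reference, the table was created with something like this (reconstructed from the \d output above):
--------------------------------------------------
CREATE TABLE teste (
    codigo bigint NOT NULL,
    nome   character varying(100)
);
--------------------------------------------------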
and I got these errors:
--------------------------------------------------
zakal$ psql -c "copy teste from 'teste.dat' using delimeters '|'"
ERROR: parser: parse error at or near "delimeters"
zakal$ psql -c "copy teste from 'teste.dat' using delimiters '|'"
ERROR: COPY command, running in backend with effective uid 504, could not open file 'teste.dat' for reading. Errno = No such file or directory (2).
zakal$ pwd
/home/zakal/tmp
zakal$ psql -c "copy teste from '`pwd`/teste.dat' using delimiters '|'"
DEBUG: copy: line 27536, XLogWrite: new log file created - consider increasing WAL_FILES
DEBUG: copy: line 93146, XLogWrite: new log file created - consider increasing WAL_FILES
DEBUG: recycled transaction log file 0000000000000000
ERROR: copy: line 164723, Bad int8 external representation "16722"
zakal$
zakal$ DEBUG: recycled transaction log file 0000000000000001
----------------------------------------------------------------------
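The COPY itself died at line 164723 of teste.dat. For the record, each line of that file is supposed to be a codigo and a nome separated by '|'; the sample value below is made up:
--------------------------------------------------
16722|Fulano de Tal
--------------------------------------------------
so I can only suppose that line has some junk (a quote or a control character) stuck to the codigo field.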
The transaction log had overflowed.
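If I read those DEBUG messages right, the cure for that part is the wal_files setting in postgresql.conf; I assume something like the following (the value is a guess on my part, not tested):
--------------------------------------------------
# postgresql.conf
wal_files = 8   # preallocate more WAL segments so XLogWrite
                # does not have to create them mid-COPY
--------------------------------------------------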
OK, this was only a test; I wanted to see what would happen.
But tell me, you great PostgreSQL DBAs: what is the best way to
insert a large amount of data?
Is there a way to turn off the log?
Is there a way to commit every 100 records? (A sketch of what I mean is below.)
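Since COPY loads everything in one transaction, what I have in mind for that last question is plain INSERTs batched like this (just a sketch; the values are made up):
--------------------------------------------------
-- one COMMIT for every 100 rows
BEGIN;
INSERT INTO teste VALUES (1, 'nome 1');
-- ... 98 more rows ...
INSERT INTO teste VALUES (100, 'nome 100');
COMMIT;
BEGIN;
INSERT INTO teste VALUES (101, 'nome 101');
-- ... and so on
COMMIT;
--------------------------------------------------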
regards,
..............................................
A Question...
Since before your sun burned hot in space and before your race was born, I
have awaited a question.
Elielson Fontanezi
DBA Technical Support - PRODAM
+55 11 5080 9493