1 million data insertion - Mailing list pgsql-general

From Elielson Fontanezi
Subject 1 million data insertion
Date
Msg-id A799F7647794D311924A005004ACEA97080DDEC4@cprodamibs249.prodam
Responses Re: 1 million data insertion  (Andrew Sullivan <andrew@libertyrms.info>)
Re: 1 million data insertion  (Chris Albertson <chrisalbertson90278@yahoo.com>)
List pgsql-general
Hi fellows!

    I have tried to insert 1,000,000 records into the following table:

--------------------------------------------------
zakal=# \d teste;
 codigo | bigint                 | not null
 nome   | character varying(100) |
--------------------------------------------------

    and I got these errors:

--------------------------------------------------
zakal$ psql -c "copy teste from 'teste.dat' using delimeters '|'"
ERROR:  parser: parse error at or near "delimeters"
ERROR:  parser: parse error at or near "delimeters"
zakal$ psql -c "copy teste from 'teste.dat' using delimiters '|'"
ERROR:  COPY command, running in backend with effective uid 504, could not open file 'teste.dat' for reading.  Errno = No such file or directory (2).
ERROR:  COPY command, running in backend with effective uid 504, could not open file 'teste.dat' for reading.  Errno = No such file or directory (2).
zakal$ pwd
/home/zakal/tmp
zakal$ psql -c "copy teste from '`pwd`/teste.dat' using delimiters '|'"
DEBUG:  copy: line 27536, XLogWrite: new log file created - consider increasing WAL_FILES
DEBUG:  copy: line 93146, XLogWrite: new log file created - consider increasing WAL_FILES
DEBUG:  recycled transaction log file 0000000000000000


ERROR:  copy: line 164723, Bad int8 external representation "16722"
ERROR:  copy: line 164723, Bad int8 external representation "16722"
zakal$
zakal$
zakal$ DEBUG:  recycled transaction log file 0000000000000001
----------------------------------------------------------------------
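If I read the second error correctly, a server-side COPY is executed by the backend process, so a relative file name is resolved on the server and the file must be readable by the user the backend runs as (uid 504 here), which is why only the absolute path got further. I suppose psql's client-side \copy, which reads the file on the client and streams it to the server, would be another option, assuming the 7.x \copy syntax mirrors COPY:

--------------------------------------------------
zakal$ psql -c "\copy teste from 'teste.dat' using delimiters '|'"
--------------------------------------------------

The last error looks like a problem in the data file itself; the value 16722 seems numeric, so I would first inspect line 164723 for stray characters, for example with:

--------------------------------------------------
zakal$ sed -n '164723p' teste.dat | cat -v
--------------------------------------------------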

The transaction log (WAL) overflowed: new log segments had to be created on the fly during the COPY, and the server hints at increasing WAL_FILES.
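If those DEBUG hints mean what I think, pre-allocating more WAL segments before a big load could avoid creating them mid-COPY. Assuming wal_files (the setting the hint refers to) is a postgresql.conf parameter on this 7.x server, the change would be something like the following, where the value 8 is only a guess:

--------------------------------------------------
# postgresql.conf (sketch only; probably needs a postmaster restart)
wal_files = 8    # WAL segments pre-allocated at checkpoint time
--------------------------------------------------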

    Ok, this was a test. I'd like to know what would happen.
    But, from you, great PostgreSQL DBAs: what is the best way to insert a large amount of data?
    Is there a way to turn off the log?
    Is there a way to commit every 100 records (something like the sketch below)?
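By batching I mean something like splitting the input file and loading each piece with its own psql call, since every psql -c invocation commits as a separate transaction; a rough sketch, with the chunk size and file names chosen only for illustration:

--------------------------------------------------
zakal$ split -l 100000 teste.dat chunk_
zakal$ for f in chunk_*; do
>   psql -c "copy teste from '`pwd`/$f' using delimiters '|'" || break
> done
--------------------------------------------------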

regards,

..............................................
A Question...
Since before your sun burned hot in space and before your race was born, I
have awaited a question.

Elielson Fontanezi
DBA Technical Support - PRODAM
+55 11 5080 9493


