Thread: 1 million record insertion

1 million record insertion

From: Elielson Fontanezi
Hi fellows!

    I have tried to insert 1,000,000 records into the following table:

--------------------------------------------------
zakal=# \d teste;
 codigo | bigint                 | not null
 nome   | character varying(100) |
--------------------------------------------------

    and I got these errors:

--------------------------------------------------
zakal$ psql -c "copy teste from 'teste.dat' using delimeters '|'"
ERROR:  parser: parse error at or near "delimeters"
ERROR:  parser: parse error at or near "delimeters"
zakal$ psql -c "copy teste from 'teste.dat' using delimiters '|'"
ERROR:  COPY command, running in backend with effective uid 504, could not
open file 'teste.dat' for reading.  Errno = No such file or directory (2).
ERROR:  COPY command, running in backend with effective uid 504, could not
open file 'teste.dat' for reading.  Errno = No such file or directory (2).
zakal$ pwd
/home/zakal/tmp
zakal$ psql -c "copy teste from '`pwd`/teste.dat' using delimiters '|'"
DEBUG:  copy: line 27536, XLogWrite: new log file created - consider
increasing WAL_FILES
DEBUG:  copy: line 93146, XLogWrite: new log file created - consider
increasing WAL_FILES
DEBUG:  recycled transaction log file 0000000000000000


ERROR:  copy: line 164723, Bad int8 external representation "16722"
ERROR:  copy: line 164723, Bad int8 external representation "16722"
zakal$
zakal$
zakal$ DEBUG:  recycled transaction log file 0000000000000001
----------------------------------------------------------------------

The transaction log overflowed.
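The "Bad int8 external representation" error means the bigint field on that
line did not parse as an integer, typically because of stray bytes (trailing
whitespace, a carriage return from a DOS-format file, and so on). As an aside,
one way to look at the raw bytes of the offending line (a sketch using
standard tools, not output from the thread):

--------------------------------------------------
zakal$ sed -n '164723p' teste.dat | od -c | head
--------------------------------------------------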

    OK, this was a test; I wanted to see what would happen.
    But, from you great PostgreSQL DBAs, what is the best way to
insert a large amount of data?
    Is there a way to turn off the log?
    Is there a way to commit every 100 records?
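(For concreteness, "commit every 100 records" could look something like the
sketch below, which turns the data file into a SQL script with a BEGIN/COMMIT
pair around every 100 INSERTs. The column names come from the table above;
everything else is illustrative, and it assumes the nome values contain no
single quotes.)

--------------------------------------------------
zakal$ awk -F'|' -v q="'" 'NR%100==1{print "BEGIN;"} {print "INSERT INTO teste (codigo, nome) VALUES (" $1 ", " q $2 q ");"} NR%100==0{print "COMMIT;"} END{if(NR%100)print "COMMIT;"}' teste.dat > teste.sql
zakal$ psql -f teste.sql
--------------------------------------------------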

regards,

..............................................
A Question...
Since before your sun burned hot in space and before your race was born, I
have awaited a question.

Elielson Fontanezi
DBA Technical Support - PRODAM
+55 11 5080 9493



Re: 1 million record insertion

From: Marc Spitzer
On Fri, Jul 26, 2002 at 03:34:12PM -0300, Elielson Fontanezi wrote:
>     OK, this was a test; I wanted to see what would happen.
>     But, from you great PostgreSQL DBAs, what is the best way to
> insert a large amount of data?
>     Is there a way to turn off the log?
>     Is there a way to commit every 100 records?
> 
> regards,

In relatively small chunks: do 100 transactions of 10,000 records each and
you should be fine.
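A minimal sketch of that, reusing the COPY syntax from the thread: split the
file into 10,000-line pieces and load each piece with its own psql call. Each
psql -c runs as a separate transaction, so a bad row only aborts the chunk
that contains it (file names here are illustrative):

--------------------------------------------------
zakal$ split -l 10000 teste.dat teste_part_
zakal$ for f in teste_part_*; do psql -c "copy teste from '`pwd`/$f' using delimiters '|'" || break; done
zakal$ rm teste_part_*
--------------------------------------------------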

marc



Re: [GENERAL] 1 million record insertion

From: Chris Albertson
>
>     OK, this was a test; I wanted to see what would happen.
>     But, from you great PostgreSQL DBAs, what is the best way to
> insert a large amount of data?
>     Is there a way to turn off the log?
>     Is there a way to commit every 100 records?

Yes, "COPY" actually does an append.  So just do what you
do now 10,000 times for 100 records.  It's a bit safer.
I've done 1M recod COPYsmany times on a low-end PC, no trouble

Put the log file someplace with more room.  You should be able to
run for a month without worrying about log files overfilling.
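If it is the transaction log (pg_xlog) that is running out of room, one
common approach is to move that directory to a bigger filesystem and leave a
symlink behind, with the server stopped. A sketch, assuming PGDATA points at
the data directory and /bigdisk is a hypothetical roomier mount:

--------------------------------------------------
zakal$ pg_ctl stop
zakal$ mv $PGDATA/pg_xlog /bigdisk/pg_xlog
zakal$ ln -s /bigdisk/pg_xlog $PGDATA/pg_xlog
zakal$ pg_ctl start
--------------------------------------------------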

Logging is likely controlled from the startup script, maybe in
/etc/rc.d; the details depend on your OS.
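If instead it is the server's message log, a typical startup-script line that
sends it to a roomier location might look like this (the paths and the
postgres account name are illustrative; pg_ctl's -l option names the file the
server log is appended to):

--------------------------------------------------
su - postgres -c "pg_ctl start -D /usr/local/pgsql/data -l /bigdisk/pgsql/postmaster.log"
--------------------------------------------------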


=====
Chris Albertson
  Home:   310-376-1029  chrisalbertson90278@yahoo.com
  Cell:   310-990-7550
  Office: 310-336-5189  Christopher.J.Albertson@aero.org
