Re: Import Database - Mailing list pgsql-general

From Brett W. McCoy
Subject Re: Import Database
Msg-id Pine.LNX.4.30.0102051516040.30791-100000@chapelperilous.net
In response to Import Database  ("Matt" <matthewf9@aol.com (nospam)>)
List pgsql-general
On Mon, 29 Jan 2001, Matt wrote:

> I am trying to find out whether importing a very large delimited text file
> is faster with postgresql or mysql (with mysqlimport). Each night the
> transaction system we use produces a text file of the day's activities,
> which must be loaded into a database. Speed is very important: mysqlimport
> takes less than an hour, but sometimes crashes. Is postgresql likely to be
> faster or slower at importing such vast amounts of data?

How much data are you talking about?  Megabytes?  Gigabytes?

PostgreSQL will load fairly fast if you turn off fsync and delete your
indexes, then rebuild them after the import.  I haven't played with large
imports on the newer Postgres, but a couple of years ago I was importing
millions of rows into 6.5 on a lowly Pentium 200, with no indexes and with
fsync turned off.  I had to load each table separately (each one was
several million rows of plain old delimited text), and they loaded
fairly quickly -- maybe 10 or 15 minutes, just using the COPY command
inside of psql.  With fsync on and indexes in place, it took *hours* to
load, and it basically slowed the server to a crawl because of the I/O
overhead.
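For what it's worth, the pattern above looks roughly like the following in
psql.  This is only a sketch: the table name, index name, column, file path,
and delimiter are all made up for illustration, and fsync is a server-wide
setting (disabled at server startup, e.g. the old "postmaster -F" flag, or
fsync in the server configuration), not something you toggle per session.

```sql
-- Hypothetical table/index/path; substitute your own.
-- Start the server with fsync disabled before running this.

DROP INDEX daily_activity_idx;       -- drop indexes before the bulk load

-- Load the delimited text file server-side.  Older releases (6.x/7.x)
-- spell the delimiter clause "USING DELIMITERS '|'".
COPY daily_activity FROM '/tmp/activity.txt' USING DELIMITERS '|';

CREATE INDEX daily_activity_idx      -- rebuild indexes after the load,
    ON daily_activity (activity_date);  -- one pass instead of per-row updates
```

Rebuilding the index in one pass after COPY is much cheaper than updating
it row by row during the load, which is where most of the I/O overhead
described above comes from.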

-- Brett


