Re: Importing *huge* mysql database into pgsql - Mailing list pgsql-general

From: Chris
Subject: Re: Importing *huge* mysql database into pgsql
Msg-id: 45EE124E.2030007@gmail.com
In response to: Importing *huge* mysql database into pgsql  (".ep" <erick.papa@gmail.com>)
List: pgsql-general

.ep wrote:
> Hello,
>
> I would like to convert a mysql database with 5 million records (and
> growing) to a pgsql database.
>
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.


If you can convert the database schema, then in mysql do a dump of the
tables like this:

select * from table into outfile '/tmp/filename';

(see http://dev.mysql.com/doc/refman/4.1/en/select.html)
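
For example (the table name "customers" is made up, and the fields/lines
clauses just spell out MySQL's defaults so you can see what the output
file will look like):

select * from customers
into outfile '/tmp/customers.txt'
fields terminated by '\t'    -- tab between columns (the default)
lines terminated by '\n';    -- newline between rows (the default)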

and then import it into postgres like this:

\copy table from '/tmp/filename'

(see http://www.postgresql.org/docs/8.2/interactive/sql-copy.html)
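
Matching the export above (again, "customers" is made up), no extra
options are needed: COPY's text format defaults to tab delimiters and
\N for NULL, which is exactly what INTO OUTFILE writes:

\copy customers from '/tmp/customers.txt'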

That's much better because it creates a tab-delimited file (matching
COPY's default text format) which postgres can load in one go.

Using complete inserts to do the conversion is horribly slow because,
by default, postgres runs each insert in its own transaction - so
either wrap batches of inserts inside a single transaction, or do a
copy as above (copy is best).
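
If you do end up with an insert-style dump, batching looks like this
(table and values are made up):

begin;
insert into customers (id, name) values (1, 'aaa');
insert into customers (id, name) values (2, 'bbb');
-- ... many more inserts, committed together instead of one at a time
commit;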

--
Postgresql & php tutorials
http://www.designmagick.com/
