Re: Importing *huge* mysql database into pgsql - Mailing list pgsql-general

From: Richard Huxton
Subject: Re: Importing *huge* mysql database into pgsql
Date:
Msg-id: 45ED808A.1060007@archonet.com
In response to: Importing *huge* mysql database into pgsql (".ep" <erick.papa@gmail.com>)
List: pgsql-general
.ep wrote:
> Hello,
>
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.

And where's the huge database? Five million rows isn't particularly big.

> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.

Well, there's not much of an alternative to exporting from one system
and importing to another. If you do find a better way, patent it!

This is probably a sensible place to start for converting schemas:
   http://pgfoundry.org/projects/mysql2pgsql

Then you'll face two problems:
1. Invalid data in your mysql dump (e.g. dates like 0000-00-00 - see the
   filter sketch below)
2. MySQL-specific usage in your application
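
For point 1, a small streaming filter is usually enough. A minimal sketch
(the script name is made up, and it assumes that replacing the zero dates
with NULL is acceptable for your data):

   #!/usr/bin/perl
   # fix_dates.pl - replace MySQL's zero dates with NULL while streaming a dump
   use strict;
   use warnings;

   while (my $line = <STDIN>) {
       # '0000-00-00' and '0000-00-00 00:00:00' aren't valid dates in PostgreSQL
       $line =~ s/'0000-00-00(?: 00:00:00)?'/NULL/g;
       print $line;
   }

Run the data dump through it before handing it to psql, e.g.
"mysqldump mydb | ./fix_dates.pl > data.sql" (commands illustrative).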

After that, you might want to examine any performance issues (places where
your application code has been tuned to work well with MySQL but not
necessarily with PG).

Shouldn't be more than a day's work, maybe just half a day. I like to
build these things up as sets of perl scripts; that way, when I notice
"one more thing" I can re-run the scripts from wherever the problem was.

Oh - if you come up with any improvements in mysql2pgsql then let the
developers know - I'm sure they'll be interested.

Good luck!
--
   Richard Huxton
   Archonet Ltd
