Re: Importing *huge* mysql database into pgsql - Mailing list pgsql-general

From Harald Fuchs
Subject Re: Importing *huge* mysql database into pgsql
Msg-id pur6s2mg2w.fsf@srv.protecting.net
In response to Importing *huge* mysql database into pgsql  (".ep" <erick.papa@gmail.com>)
List pgsql-general
In article <1173191066.416664.320470@n33g2000cwc.googlegroups.com>,
".ep" <erick.papa@gmail.com> writes:

> Hello,
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.

> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.

> Is there anything else?

If you really want to convert a *huge* MySQL database (and not your
tiny 5M-record thingie), I'd suggest "mysqldump -T". This creates, for
each table, a .sql file containing just the DDL and a .txt file
containing the tab-separated data.
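A minimal sketch of the dump step; the user name "myuser", database
"mydb", and dump directory are placeholders:

```shell
# Hypothetical connection details; adjust to your setup.
DUMPDIR=/tmp/mydump
mkdir -p "$DUMPDIR"
# -T (long form: --tab) makes the MySQL *server* write one .txt file per
# table via SELECT ... INTO OUTFILE, so DUMPDIR must be writable by the
# server process and the client user needs the FILE privilege.
command -v mysqldump >/dev/null \
    && mysqldump --user=myuser --password --tab="$DUMPDIR" mydb \
    || echo "mysqldump not available here"
```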

Then edit each .sql file:
* Fix type and index definitions etc. (e.g. AUTO_INCREMENT columns become serial)
* Append a "COPY thistbl FROM 'thispath/thistbl.txt';" -- the path must
  be absolute, because COPY ... FROM reads the file on the server side

Then run all .sql files with psql, in an order dictated by foreign keys.
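The append step can be scripted instead of done by hand. A sketch,
using a made-up table "orders" as a stand-in for a real dump (in
practice you'd fix the types first, then loop over the real files):

```shell
DUMPDIR=/tmp/mydump
mkdir -p "$DUMPDIR"
# Stand-in for a mysqldump-generated DDL file, already converted to
# PostgreSQL types:
cat > "$DUMPDIR/orders.sql" <<'EOF'
CREATE TABLE orders (
    id integer PRIMARY KEY,
    amount numeric(10,2)
);
EOF
# Append a COPY so each script both creates and loads its table.
for f in "$DUMPDIR"/*.sql; do
    tbl=$(basename "$f" .sql)
    printf "COPY %s FROM '%s/%s.txt';\n" "$tbl" "$DUMPDIR" "$tbl" >> "$f"
done
# Then, in an order that satisfies the foreign keys, e.g.:
# psql mydb -f "$DUMPDIR/customers.sql"
# psql mydb -f "$DUMPDIR/orders.sql"
```

Note that COPY ... FROM runs as the server and needs a path readable by
the postgres user; if the dump files live on the client machine, use
psql's \copy instead, which reads them client-side.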
