Hello,
You'll probably want to create the tables from scratch in PostgreSQL, because
the datatypes and table creation syntax in the two databases are pretty
different. Then dump just the data from MySQL and load the dump files into
PostgreSQL.
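For example, a MySQL table like this (a made-up table; your own column names
and types will of course differ):

CREATE TABLE users (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(50),
    created DATETIME
);

would usually be rewritten along these lines for PostgreSQL:

CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(50),
    created TIMESTAMP
);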
Mysqldump provides several command-line switches for tweaking how the table
dump is formatted. You might want to do something like:
mysqldump -ct <database> <table> > <datafile>
This will output just the data (no table creation info) as full INSERT
statements of the following form:
INSERT INTO mytable(col1, col2, col3) VALUES('1','2','3');
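For instance, if your database were called mydb and the table orders (made-up
names, substitute your own), the dump step would be:

mysqldump -ct mydb orders > orders.sql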
As long as the column names are the same and the general datatype
(string, integer, etc.) is the same, it should load pretty easily.
To load the data, don't use pg_dump; just do
psql -f <datafile> <database>
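Continuing the made-up example above, that would be:

psql -f orders.sql mydb

psql keeps going after a failed INSERT and just prints an error for that
statement, so scan the output for ERROR lines to catch rows with datatype
problems.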
Good luck!
Norm
On Wednesday 01 November 2000 03:34 am, you wrote:
> Hello everybody,
>
> We have a MySQL database, and we're planning to migrate everything to a
> PostgreSQL database. We must migrate all the data to a development system,
> in order to get real performance numbers. The problem is that we don't know
> how to convert the files generated by mysqldump in order to import them with
> pg_dump (and end up with the same tables and data).
>
> Has anybody done this before?
> Is there any script to automate the process?
>
> Please send the answer with a CC to my personal address
> (alberto@cometatech.com), since I'm not yet subscribed to the mailing
> list.
>
> Thank you very much in advance for your help.