Getting Mysql data into Postgres: least painful methods? - Mailing list pgsql-general

I'm wondering if anyone can point me towards a good method for moving MySQL data into Postgres. I've done some web searching and found documentation from various years, but it's not clear what's current or what works best. Much of what I found is either flame-war material (why Postgres is better) or old and seemingly involved and complex.

Here's the fuller description of what I'm trying to do.  I've got a dataset (a UMLS Metathesaurus subset) that I need to get into a Postgres database.  It's all reference data, and so will be read-only.  There are no functions or logic involved.  I anticipate having to update it at least quarterly, so I'd like to get to a well-grooved import process.

The data as distributed is available in Oracle or MySQL formats.  (I already gave them my two cents about including Postgres.)  I did see some information about modifying the MySQL distribution files to make them Postgres-compatible, but I thought (perhaps foolishly) it would be easier to load them into MySQL, and from there export them to Postgres.

A recurring idea seemed to be to use:

mysqldump -v --compatible=postgresql umls_test > dumpfile.sql

followed by

sed -i "s/\\\'/\'\'/g" dumpfile.sql

but that didn't bring me much success.  I figure this has to be a fairly common need, and hopefully by 2013 there's an easy solution.  Thanks in advance!
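P.S. In case it helps anyone else reading this: my understanding is that the sed step above is meant to rewrite MySQL's backslash-escaped single quotes (\') into the doubled-quote form ('') that standard SQL and Postgres expect. A tiny sketch of that rewrite on a made-up sample file (the table and file names here are just for illustration), assuming GNU sed — the -i flag behaves differently on BSD/macOS:

```shell
# Sample line in the style mysqldump emits (MySQL escapes embedded quotes as \')
echo "INSERT INTO terms VALUES ('it\\'s a term');" > dump_sample.sql

# Rewrite MySQL-style \' escapes into SQL-standard doubled quotes ''
sed -i "s/\\\\'/''/g" dump_sample.sql

cat dump_sample.sql
# INSERT INTO terms VALUES ('it''s a term');
```

After that kind of cleanup, the idea is presumably to feed the dump to psql (e.g. `psql -d umls_test -f dumpfile.sql`), though as noted above that alone hasn't been enough in my case.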

Ken

--
AGENCY Software  
A data system that puts you in control
(253) 245-3801
