Re: Switching Database Engines - Mailing list pgsql-general

From: Steve Atkins
Subject: Re: Switching Database Engines
Msg-id: B28D4519-8AC8-4364-B664-29BE05DA9EF5@blighty.com
In response to: Switching Database Engines (Carlos Mennens <carlos.mennens@gmail.com>)
List: pgsql-general
On Apr 26, 2011, at 10:24 AM, Carlos Mennens wrote:

> We've been using a Wiki server at the office for years. It was
> originally configured to use MySQL, and finally, after 8+ years,
> we're moving the Wiki to a new hardware platform. The catch is that
> the Wiki software (MediaWiki) is the only thing still tied to and
> using MySQL, which we want to decommission, but we've been using it
> for years, so I'm worried we will lose the data. I've done some
> Googling to find out how I can take the MySQL database dump and
> successfully import it into my new PostgreSQL database, but I don't
> know how practical or recommended this process is. I found sites
> like the following:

It's certainly possible to dump a mysql database and import it into
postgresql, without too much difficulty in most cases. The usual
problem with porting the data is bad data that mysql happily
accepted but that postgresql will reject on import.
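
For what it's worth, the mechanical side looks roughly like this
(the names and credentials below are made up, and the dump will
almost certainly need some hand-editing before postgresql accepts it):

  # Dump from mysql in a vaguely postgresql-flavoured format.
  # --compatible=postgresql only gets you part of the way there.
  mysqldump --compatible=postgresql --skip-extended-insert \
      -u wikiuser -p wikidb > wikidb.sql

  # After fixing up the dump by hand, load it into an empty
  # postgresql database.
  psql -U wikiuser -d wikidb -f wikidb.sql

  # Typical "bad data" to hunt for on the mysql side first: zero
  # dates, which postgresql rejects outright. (some_table and
  # some_date_col are placeholders.)
  mysql -u wikiuser -p wikidb \
      -e "SELECT COUNT(*) FROM some_table WHERE some_date_col = '0000-00-00'"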

Changing the app itself to support postgresql is usually the bigger
problem.

>
> http://en.wikibooks.org/wiki/Converting_MySQL_to_PostgreSQL

At a quick glance, that doesn't look like a great resource. It
suggests using "password" rather than "md5" authentication, amongst
other things. There are also some obvious thinkos or copy/paste
problems (suggesting that '=' in mysql is equivalent to '<>' in
postgresql, for instance). While much of what it says looks
reasonable, I wouldn't rely on it.
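
For reference, "password" sends the password over the wire in
cleartext while "md5" sends a hash, so in pg_hba.conf you'd normally
want something along these lines (the database/user/address values
here are just an example):

  # TYPE  DATABASE  USER  ADDRESS       METHOD
  host    all       all   127.0.0.1/32  md5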

http://www.mediawiki.org/wiki/Manual:PostgreSQL is a better
place to look, perhaps.

Most of the core mediawiki code runs OK with postgresql, but most
addons don't.

You definitely want to set up a "test" wiki instance, running on
postgresql - that's the first thing to do regardless of how you
migrate the data.
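
Something like this is usually enough on the postgresql side before
you run the mediawiki installer (the names are just examples):

  # Create a dedicated role and database for the test wiki.
  createuser -S -D -R -P wikiuser
  createdb -O wikiuser -E UTF8 wikidb

  # Some mediawiki versions also want plpgsql installed in the
  # database; skip this if the installer doesn't complain.
  createlang plpgsql wikidb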

Then doing an XML dump from your existing mediawiki
instance and importing it into your test instance will give
you an idea of how well that will work. If that's good enough,
you don't need to care about the underlying database.
There are several ways to import xml dumps, with different
tradeoffs - check the mediawiki docs.
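
If you go that route, the maintenance scripts that ship with
mediawiki are the simplest option (run from each wiki's directory;
double-check the options against your version):

  # On the old, mysql-backed wiki: dump all pages with full history.
  php maintenance/dumpBackup.php --full > pages.xml

  # On the new, postgresql-backed test wiki: import the dump, then
  # rebuild the derived tables.
  php maintenance/importDump.php pages.xml
  php maintenance/rebuildrecentchanges.php

Bear in mind that the XML dump only covers page content; user
accounts, preferences and uploaded files have to be moved separately.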

> Can you guys tell me if this is something that will work? I don't
> mean the exact link above, but just in general: taking a database
> from MySQL and successfully migrating it for PostgreSQL use?
>
> From what I can see in the MySQL database, there appear to be 43
> tables with lots of column data and who knows what else:

Cheers,
  Steve

