Re: Best practice to load a huge table from ORACLE to PG - Mailing list pgsql-performance

From Adonias Malosso
Subject Re: Best practice to load a huge table from ORACLE to PG
Msg-id 8a5d3c890804281437u6bd655b3k9a38a77a49dc5552@mail.gmail.com
In response to Re: Best practice to load a huge table from ORACLE to PG  ("Jonah H. Harris" <jonah.harris@gmail.com>)
Responses Re: Best practice to load a huge table from ORACLE to PG
List pgsql-performance
Jonah,

Thank you for the answer. Good to know about this EnterpriseDB feature.

I'll go ahead with pgloader.

Regards,

Adonias Malosso

On Sat, Apr 26, 2008 at 10:14 PM, Jonah H. Harris <jonah.harris@gmail.com> wrote:
On Sat, Apr 26, 2008 at 9:25 AM, Adonias Malosso <malosso@gmail.com> wrote:
> I'd like to know the best practice for loading a 70 million row,
> 101-column table from Oracle to PostgreSQL.

The fastest and easiest method would be to dump the data from Oracle
into CSV/delimited format using something like ociuldr
(http://www.anysql.net/en/ociuldr.html) and load it back into PG using
pg_bulkload (which is a helluva lot faster than COPY).  Of course, you
could try other things as well... such as setting up generic
connectivity to PG and inserting the data into a PG table over the
database link.
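The dump-and-reload path above can be sketched as a short shell pipeline. The ociuldr options, credentials, and table/file names below are illustrative assumptions rather than tested invocations; plain COPY via psql is shown as the simpler (if slower) alternative when pg_bulkload is not installed:

```shell
# Illustrative sketch only -- the connection string, table name, and
# ociuldr flags are assumptions; check each tool's documentation.

# 1. Unload the table from Oracle into a delimited file with ociuldr
#    (exact option names vary by ociuldr version).
ociuldr user=scott/tiger@orcl \
        query="SELECT * FROM bigtable" \
        field=, file=bigtable.csv

# 2. Load into PostgreSQL. pg_bulkload (driven by its control file) is
#    the fast path; a plain client-side COPY via psql also works:
psql -d mydb -c "\copy bigtable FROM 'bigtable.csv' WITH CSV"
```

With 70 million rows it is worth splitting the dump into several files and loading them in parallel sessions, and dropping or deferring indexes on the target table until the load completes.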

Similarly, while I hate to see shameless self-plugs in the community,
the *fastest* method you could use is dblink_ora_copy, contained in
EnterpriseDB's PG+ Advanced Server; it uses an optimized OCI
connection to COPY the data directly from Oracle into Postgres, which
also saves you the intermediate step of dumping the data.

--
Jonah H. Harris, Sr. Software Architect | phone: 732.331.1324
EnterpriseDB Corporation | fax: 732.331.1301
499 Thornall Street, 2nd Floor | jonah.harris@enterprisedb.com
Edison, NJ 08837 | http://www.enterprisedb.com/
