Re: Best practice to load a huge table from ORACLE to PG - Mailing list pgsql-performance

From Tino Wildenhain
Subject Re: Best practice to load a huge table from ORACLE to PG
Msg-id 48162C86.1040405@wildenhain.de
In response to Best practice to load a huge table from ORACLE to PG  ("Adonias Malosso" <malosso@gmail.com>)
List pgsql-performance
Adonias Malosso wrote:
> Hi All,
>
> I'd like to know what's the best practice to LOAD a 70 million row,
> 101-column table
> from ORACLE to PGSQL.
>
> The current approach is to dump the data in CSV and then COPY it to
> Postgresql.
>
Uhm. 101 columns, you say? Sounds interesting. There are data loaders
such as pgloader (http://pgfoundry.org/projects/pgloader/) which could
speed up loading the data compared to a plain COPY of the CSV. I also
wonder how much normalizing the table first would help.
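For reference, the plain-COPY baseline looks roughly like this. This is a
hedged sketch, not the poster's actual commands: the database name
(target_db), table name (bigtable), and file name (bigtable.csv) are all
hypothetical placeholders, and it assumes the Oracle export already
produced a CSV whose columns match the Postgres table definition.

```shell
# Sketch only: "target_db", "bigtable", and "bigtable.csv" are assumed names.
# COPY is fastest when indexes and constraints are created *after* the load,
# so a 70M-row import would typically load into a bare table first.

psql -d target_db <<'SQL'
-- Load the CSV produced by the Oracle export in a single transaction.
\copy bigtable FROM 'bigtable.csv' WITH CSV
SQL
```

pgloader wraps the same COPY protocol but can additionally batch the
input, continue past malformed rows (writing them to a reject file), and
run transformations on columns during the load, which is where the
speedup over a hand-rolled COPY usually comes from.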

Tino
