On Jan 18, 2008 4:14 AM, Dorren <database.replication@gmail.com> wrote:
Terabytes of data: this is a lot of Oracle data to migrate. You would
need high-performance tools capable of handling a heterogeneous
environment.
People have suggested links here, so I will add some that could be
appropriate to your case:
The PostgreSQL loader is limited, by the way. For instance, if your
data contains an end-of-line character inside a field, the load into
PostgreSQL will fail.
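If you generate the flat files yourself, you can work around this by
escaping those characters according to COPY's text format. A rough
sketch in Java (the escaping rules are the documented COPY text-format
ones; the method name is just illustrative):

  // Escape one field for PostgreSQL's COPY text format, so embedded
  // newlines and tabs do not break the load.
  static String escapeCopyField(String field) {
      if (field == null) {
          return "\\N";                  // COPY's default null marker
      }
      return field.replace("\\", "\\\\") // escape backslashes first
                  .replace("\n", "\\n")
                  .replace("\r", "\\r")
                  .replace("\t", "\\t"); // tab is the default field delimiter
  }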
Check this PDF: http://www.wisdomforce.com/dweb/resources/docs/OracleToNetezzaWithFastReader.pdf
A few tools to consider:
FastReader: http://www.wisdomforce.com/dweb/index.php?id=23 -
extracts data from Oracle into ASCII flat files or a pipe and creates
input for the PostgreSQL loader. Many people use it for fast initial
synchronization. FastReader performs a bulk data extract, so terabytes
of data can be migrated in hours.
Database Sync - http://www.wisdomforce.com/dweb/index.php?id=1001 -
also a fast data transfer tool; it operates as change data capture.
It captures the latest transactions and can be used for incremental
data warehouse feeds from OLTP Oracle data. You may want it if you
don't want to move terabytes of data every time, only the changed
data.
Thanks, I'll check out those options. I also have another question in mind.
How well (or how fast) would it be to use Java with JDBC to transfer these terabytes of data from Oracle to PostgreSQL? This worked okay for small datasets, but I'm not sure how it will behave for large data; I've sketched my current approach below.
And would triggers be a reasonable way to keep track of the changes in the Oracle production system?
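For reference, here is roughly what I have been doing, heavily
simplified (the table name, columns, connection strings, and batch
sizes are placeholders, not a tuned setup):

  import java.sql.*;

  public class JdbcCopy {
      static final int FETCH_SIZE = 10000; // rows streamed from Oracle per round trip
      static final int BATCH_SIZE = 10000; // rows per PostgreSQL batch insert

      public static void main(String[] args) throws SQLException {
          try (Connection ora = DriverManager.getConnection(
                   "jdbc:oracle:thin:@//orahost:1521/ORCL", "user", "pass");
               Connection pg = DriverManager.getConnection(
                   "jdbc:postgresql://pghost:5432/mydb", "user", "pass")) {
              pg.setAutoCommit(false); // commit per batch, not per row
              try (Statement src = ora.createStatement();
                   PreparedStatement dst = pg.prepareStatement(
                       "INSERT INTO mytable (id, payload) VALUES (?, ?)")) {
                  src.setFetchSize(FETCH_SIZE); // stream rows instead of buffering them all
                  ResultSet rs = src.executeQuery("SELECT id, payload FROM mytable");
                  int n = 0;
                  while (rs.next()) {
                      dst.setLong(1, rs.getLong(1));
                      dst.setString(2, rs.getString(2));
                      dst.addBatch();
                      if (++n % BATCH_SIZE == 0) {
                          dst.executeBatch();
                          pg.commit();
                      }
                  }
                  dst.executeBatch(); // flush the final partial batch
                  pg.commit();
              }
          }
      }
  }

Even with batching this is a single reader and a single writer, so I
suspect it won't come close to a bulk loader on terabytes.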
Thanks
josh