Re: Adapter update. - Mailing list pgsql-general

From Ow Mun Heng
Subject Re: Adapter update.
Msg-id 1189134390.17218.40.camel@neuromancer.home.net
In response to Re: Adapter update.  (Richard Huxton <dev@archonet.com>)
List pgsql-general
On Wed, 2007-08-22 at 20:41 +0100, Richard Huxton wrote:
> Murali Maddali wrote:
> > This is what I am doing, I am reading the data from SQL Server 2005 and
> > dumping it out to a PostgreSQL 8.2 database.

My 2 cents: I'm doing roughly the same thing, but I'm using Perl and
DBI to do it.

> Fastest way to load data into PG is via COPY, don't know if npgsql
> driver supports that. If not, you'd have to go via a text-file.
>
> Load the data into an import table (TEMPORARY table probably) and then
> just use three queries to handle deletion, update and insertion.
> Comparing one row at a time is adding a lot of overhead.
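For reference, that staging-table merge boils down to three statements,
roughly like this (table and column names here are invented for
illustration, and the import table is assumed to be already loaded via COPY):

-- 1. delete rows that are no longer present in the source
DELETE FROM target
WHERE NOT EXISTS (SELECT 1 FROM import_tmp i WHERE i.id = target.id);

-- 2. update rows that already exist
UPDATE target
SET    val = i.val
FROM   import_tmp i
WHERE  i.id = target.id;

-- 3. insert rows that are new
INSERT INTO target (id, val)
SELECT i.id, i.val
FROM   import_tmp i
WHERE  NOT EXISTS (SELECT 1 FROM target t WHERE t.id = i.id);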

My way of doing it:

1. Pull from SQL Server via DBI into a temp CSV file.
2. Import via \copy into a temp table in PG.
begin transaction
3. Delete duplicate pkey entries in the actual table.
4. Insert the new entries into the actual table.
5. Truncate the temp table.
6. Update a log file.
end transaction.
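
In psql terms, steps 2 through 5 come out roughly like this (table names
are invented; step 1 is just the Perl/DBI pull writing out the CSV, and
step 6 happens back in the script):

\copy temp_load from '/tmp/extract.csv' with csv

begin;
-- [3] delete any rows whose pkey is about to be re-imported
--     (delete-then-insert instead of update, see the note below)
DELETE FROM actual_table
USING  temp_load
WHERE  actual_table.id = temp_load.id;
-- [4] insert the fresh rows
INSERT INTO actual_table
SELECT * FROM temp_load;
-- [5] clear the temp table for the next run
TRUNCATE temp_load;
commit;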

works great..

Note on [3]: all the data are new, so instead of doing an update I
resorted to a delete-then-insert, much like MySQL's mysqlimport --replace
command. (My choice.)
