Re: Load data from a csv file without using COPY - Mailing list pgsql-general

From Ravi Krishna
Subject Re: Load data from a csv file without using COPY
Msg-id em2345975f-0c51-42dd-a35f-ff88715e8bbb@ravi-lenovo
In response to Re: Load data from a csv file without using COPY  (Nicolas Paris <niparisco@gmail.com>)
List pgsql-general
Thanks, all, for replying. I see that I did not explain my requirement fully, so let me
lay it out in detail.

1. Currently we have a legacy app running on DB2/LUW. The application writes to it either via a
    Java program or via custom ETL scripts built on a vendor product.
2. We want to migrate it to PG and eliminate the vendor ETL tool.
3. We now have a catch-22 situation. Should we spend time porting the app to PG without first
    verifying that PG can perform as well as DB2? In other words, if some basic testing rules out PG
    as a good replacement for DB2, why even bother to port? Of course, passing the test would not
    prove conclusively that the app will work just as well, but at least a basic test will tell us
    that we are not on the wrong path.
4. What I am planning is:
    4.a Get a set of large tables exported as pipe-delimited text files.
    4.b Load them into both DB2 and PG on similar hardware.
    4.c Run OLAP queries.
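For step 4.b, a minimal sketch of the PG side might look like the following (the table and file names are placeholders; adjust to the exported schema). psql's \timing gives a rough load-time measurement:

```sql
-- Hypothetical table/file names; adjust to the exported schema.
\timing on

-- \copy runs client-side in psql, so the delimited file only needs to be
-- readable from the client machine, not the database server.
\copy wide_table FROM 'wide_table.del' WITH (FORMAT csv, DELIMITER '|', NULL '')
```

Server-side COPY ... FROM 'file' is an alternative when the file is on the database host and you have superuser (or pg_read_server_files) rights.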

4.b is to test I/O. Our app is sensitive to load times, and some of the tables are really wide.
4.c is to test the maturity of PG in handling complex OLAP SQL. From what I have read, while the
     PG optimizer is very good at OLTP, it is not yet as good at OLAP queries.

I just want to keep the loading method in 4.b comparable for both DB2 and PG. If COPY is the only
way, we will use it, with something comparable on the DB2 side.
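On the DB2 side, one roughly comparable path (a sketch, not a tuned invocation; table and file names are placeholders) is the LOAD utility with a pipe column delimiter for a DEL-format file:

```
# Hypothetical table/file names; MODIFIED BY COLDEL| sets '|' as the column
# delimiter for a DEL (delimited) file. IMPORT is a slower, fully logged
# alternative if LOAD's minimal logging skews the comparison.
db2 "LOAD FROM wide_table.del OF DEL MODIFIED BY COLDEL| INSERT INTO wide_table"
```

Note that LOAD bypasses much of DB2's normal logging, so for an apples-to-apples I/O test against PG's COPY you may want to decide explicitly which semantics you are comparing.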
