Re: Using COPY to import large xml file - Mailing list pgsql-general

From Tim Cross
Subject Re: Using COPY to import large xml file
Date
Msg-id 8736xaseju.fsf@gmail.com
In response to Re: Using COPY to import large xml file  (Anto Aravinth <anto.aravinth.cse@gmail.com>)
Responses Re: Using COPY to import large xml file  (Anto Aravinth <anto.aravinth.cse@gmail.com>)
List pgsql-general
Anto Aravinth <anto.aravinth.cse@gmail.com> writes:

> Thanks a lot. But I ran into a lot of challenges! It looks like the SO data
> contains lots of tabs within itself, so the tab delimiter didn't work for me.
> I thought I could give a special delimiter, but it looks like PostgreSQL's
> COPY allows only one character as the delimiter :(
>
> Sadly, I guess the only way is to insert, or do a thorough serialization of
> my data into something that COPY can understand.
>

The COPY command has a number of options, including setting the character used
as the delimiter - it doesn't have to be a tab. You also need to look at the
logs/output to see exactly why the copy fails.
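For instance, assuming a hypothetical table posts(id, title, body), the
delimiter can be changed to any single character that does not occur in the
data, such as a pipe:

```sql
-- Hypothetical table and file path; the DELIMITER option accepts any
-- single one-byte character, not just the default tab.
COPY posts (id, title, body)
FROM '/tmp/posts.dat'
WITH (FORMAT text, DELIMITER '|');
```

Note that in text format, literal tabs, newlines and backslashes inside a
field must be escaped as \t, \n and \\ respectively - unescaped embedded
tabs are usually why the default tab delimiter appears to fail.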

I'd recommend first pre-processing your input data to make sure it is
'clean' and that all the fields actually match the DDL you used to define
your db tables etc. I'd then select a small subset and try different
parameters to the COPY command until you get the right combination of data
format and copy definition.
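As a sketch of that pre-processing step (my example, not something from this
thread): escape the characters that are special in COPY's text format, so
fields containing embedded tabs or newlines - common in Stack Overflow post
bodies - can still be loaded with the default tab delimiter.

```python
def copy_escape(field: str) -> str:
    """Escape one field for PostgreSQL COPY ... WITH (FORMAT text)."""
    return (field.replace("\\", "\\\\")   # backslash first, so later
                 .replace("\t", "\\t")    # replacements aren't doubled
                 .replace("\n", "\\n")
                 .replace("\r", "\\r"))

def make_copy_line(fields):
    """Join already-escaped fields with the default tab delimiter."""
    return "\t".join(copy_escape(f) for f in fields)

# A field with an embedded tab and newline becomes a single, safe line:
line = make_copy_line(["42", "a\ttitle", "body with\nnewline"])
```

After this transformation each record is exactly one line with exactly one
real tab between fields, which is what COPY's text format expects.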

It may take some effort to find the right combination, but the result is
probably worth it given your data set size, i.e. the difference between
hours and days.

--
Tim Cross


pgsql-general by date:

Previous
From: Jeff Janes
Date:
Subject: Re: DB size growing exponentially when materialized view refreshed concurrently (postgres 9.6)
Next
From: Data Ace
Date:
Subject: Re: PostgreSQL Volume Question