Re: Import csv in PostgreSQL - Mailing list pgsql-performance

From Gabi D
Subject Re: Import csv in PostgreSQL
Date
Msg-id CAC7ivptPV5Uxx=mJ6=h=Y5aDr6ZqVNLg5wPNL_4C2Syk==e8HQ@mail.gmail.com
In response to Import csv in PostgreSQL  (Dinesh Chandra 12108 <Dinesh.Chandra@cyient.com>)
List pgsql-performance
It must be something in your data: either you have ',' (which you specified as your delimiter) in the data itself, or you have end-of-line characters embedded in the data. Try a file with one row only and see what happens. If it's OK, try a few more; possibly the problem lies in some other row. Add more lines until you can see the problem happening, and then identify the problem row(s) / char(s).
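As a rough illustration of that single-row test (the names are placeholders: import_check, my_table and one_row.csv stand in for your real table and a file cut down to the header plus one data row), something along these lines in psql:

    -- load the single-row file into a scratch copy of the table
    CREATE TEMP TABLE import_check (LIKE my_table INCLUDING DEFAULTS);

    -- FORMAT csv respects quoting, so a quoted field may contain the
    -- delimiter or an embedded newline without being split into extra rows
    \copy import_check FROM 'one_row.csv' WITH (FORMAT csv, DELIMITER ',', QUOTE '"', HEADER true)

    -- should return 1; anything more means that single row already
    -- contains stray delimiters or end-of-line characters
    SELECT count(*) FROM import_check;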
Possibly also try using the encoding parameter for the COPY command, or for the query/procedure that you use to write the data out to the file.
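For example (a sketch only; WIN1252 is just a guess at what the exporting system produced, and my_table / data.csv are placeholders):

    \copy my_table FROM 'data.csv' WITH (FORMAT csv, DELIMITER ',', ENCODING 'WIN1252')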


On Mon, Oct 15, 2018 at 8:43 AM Dinesh Chandra 12108 <Dinesh.Chandra@cyient.com> wrote:

Hi,

I have a CSV file with only 400 records.

I have to import it into a DB table. The import itself works, but why is it importing 1047303 rows when only 400 records are present in that file?

Could you please help me with this?

Regards,

Dinesh Chandra
