Thread: COPY command problems
I need help
Has anybody experienced difficulties using the COPY command on large files?
I have a large (250MB) file, and each time I load it one or more records
(fewer than 30 out of roughly 1,079,000) come out corrupted. The number of
corrupted records is not constant (e.g. 1, 30, 7, 23).
p.s. the file is tab/newline-separated, and the corruption appears in the
middle of a record's field (not at the end or beginning)
Regards,
Nikola
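
As a first sanity check, something along these lines (Python; the file name
and column count below are placeholders, not values from the thread) can rule
out a malformed dump by confirming that every record splits into the expected
number of tab-separated fields:

#!/usr/bin/env python
# Sketch: confirm that every record in the dump file has the expected
# number of tab-separated fields before suspecting COPY itself.
# DUMP and EXPECTED_FIELDS are placeholders, not the real values.

DUMP = "data.tab"          # hypothetical path to the 250MB dump
EXPECTED_FIELDS = 12       # hypothetical column count of the target table

bad = 0
total = 0
with open(DUMP, "rb") as f:
    for lineno, line in enumerate(f, 1):
        total = lineno
        fields = line.rstrip(b"\n").split(b"\t")
        if len(fields) != EXPECTED_FIELDS:
            bad += 1
            print("line %d has %d fields" % (lineno, len(fields)))

print("%d malformed records out of %d" % (bad, total))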
"Nikola Ivacic" <nikola@rs-pi.com> writes: > Has somebody experienced difficulties using COPY command on large files. > I have a large (250MB) file and each time I insert records I've got one or = > more (< 30 of cca 1079000) corrupted > records. The number of corrupted records is not constant (i.e. 1, 30, 7, 23= > etc..) I'd bet on flaky hardware --- have you run memory and disk tests? If the COPY data is passing across a network, then network problems are also worthy of suspicion. regards, tom lane
It must be an internal error, for two reasons:

1.) the original file is OK (I checked with grep, and there is no network
    involved)
2.) the corruption has a strange pattern: it substitutes 0x31 with 0x21
    (1 with !), 0x34 with 0x24 (4 with $), and 0x39 with 0x29 (9 with ))

so I guess you are right.

Can you suggest some tools for FreeBSD to test RAM? I think the hard disk
is OK.

p.s. right now I am testing it with a split file

thanks

Nikola

----- Original Message -----
From: "Tom Lane" <tgl@sss.pgh.pa.us>
To: "Nikola Ivacic" <nikola@rs-pi.com>
Cc: <pgsql-sql@postgresql.org>
Sent: Tuesday, December 24, 2002 4:24 PM
Subject: Re: [SQL] COPY command problems

> I'd bet on flaky hardware --- have you run memory and disk tests?  If
> the COPY data is passing across a network, then network problems are
> also worthy of suspicion.
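
The three substitutions listed above all differ from the original byte by a
single cleared bit (0x31 XOR 0x21 = 0x34 XOR 0x24 = 0x39 XOR 0x29 = 0x10),
which is the classic bad-RAM signature. A sketch like the following makes the
pattern visible: re-export the loaded table to a file (COPY ... TO) and print
the XOR of every differing byte against the original dump. File names are
placeholders, and the comparison only makes sense if the re-export preserves
the original row order.

#!/usr/bin/env python
# Compare the original dump against a re-export of the loaded table and
# print the XOR of every differing byte pair. 0x31->0x21, 0x34->0x24 and
# 0x39->0x29 all XOR to 0x10: a single cleared bit, typical of bad memory.
# ORIGINAL and REEXPORT are placeholder paths; rows must come back in the
# same order for a byte-for-byte comparison to be meaningful.

ORIGINAL = "data.tab"       # hypothetical original dump
REEXPORT = "reexport.tab"   # hypothetical COPY ... TO output

with open(ORIGINAL, "rb") as a, open(REEXPORT, "rb") as b:
    offset = 0
    while True:
        x = a.read(1 << 20)
        y = b.read(1 << 20)
        if not x and not y:
            break
        for i in range(min(len(x), len(y))):
            if x[i] != y[i]:
                print("offset %d: %02x -> %02x (xor %02x)"
                      % (offset + i, x[i], y[i], x[i] ^ y[i]))
        offset += len(x)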
http://www.memtest86.com/ may be useful.

regds
mallah.

On Tuesday 24 December 2002 09:25 pm, Nikola Ivacic wrote:
> It must be an internal error, for two reasons:
> 1.) the original file is OK (I checked with grep, and there is no network
>     involved)
> 2.) the corruption has a strange pattern: it substitutes 0x31 with 0x21
>     (1 with !), 0x34 with 0x24 (4 with $), and 0x39 with 0x29 (9 with ))
>
> so I guess you are right.
>
> Can you suggest some tools for FreeBSD to test RAM? I think the hard
> disk is OK.

--
Rajesh Kumar Mallah,
Project Manager (Development)
Infocom Network Limited, New Delhi
phone: +91(11)6152172 (221) (L), 9811255597 (M)
Visit http://www.trade-india.com, India's Leading B2B eMarketplace.