Re: [SQL] Slow Inserts Again - Mailing list pgsql-sql

From pierre@desertmoon.com
Subject Re: [SQL] Slow Inserts Again
Date
Msg-id 19990503140653.7998.qmail@desertmoon.com
In response to Re: [SQL] Slow Inserts Again  ("Frank Morton" <fmorton@base2inc.com>)
List pgsql-sql
Hmmm, I've had problems with punctuation and such when importing large quantities of text into my DB. But I've always
had success using copy. Have you tried using perl to munge your data and escape the appropriate characters?
 
 I've always used the following to import data into a clean DB.
 copy fubar from '/home/pierre/data/fubar.txt' using delimiters ',';
 How are you building your import files? That is, how are you putting your data together?
 For me, simply running the regexes s/'/''/g and s/,/\\,/g on each text field BEFORE I dump it into my data file is
sufficient to allow it to be imported using the copy command.
 
 So...for a table that has three varchar columns, A/B/C my data file might look like:
 However\, I''m here.,Don''t take me seriously.,Hi there!
 The above would be imported correctly. I may be missing something as I just started reading this thread, but I hope
this helps...
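 The munging step described above can be sketched as follows (the post uses Perl; this is an
 illustrative Python equivalent, and the table/file names are just the fubar example from above):

```python
# Escape field text for a comma-delimited COPY data file, as described above:
# double single quotes (s/'/''/g) and backslash-escape the comma delimiter
# (s/,/\,/g), so the file can be loaded with:
#   copy fubar from '/home/pierre/data/fubar.txt' using delimiters ',';
def munge(field):
    field = field.replace("'", "''")   # s/'/''/g
    field = field.replace(",", "\\,")  # s/,/\,/g
    return field

# Three varchar columns A/B/C, matching the example line above.
fields = ["However, I'm here.", "Don't take me seriously.", "Hi there!"]
line = ",".join(munge(f) for f in fields)
print(line)
# -> However\, I''m here.,Don''t take me seriously.,Hi there!
```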
 
 -=pierre

> 
> >> This last attempt, I bracket each insert statement with
> >                               ^^^^^^^^^^^^^^^^^^^^^
> >> "begin;" and "end;".
> >
> >Why _each_?
> >Enclose ALL statements by begin; & end; to insert ALL data
> >in SINGLE transaction:
> 
> This was suggested by someone on the list so that all
> 150,000 inserts would not be treated as one large transaction.
> 
> Like I said before, I have tried all suggestions without success.


