Thread: Question about Scripting in PostgreSQL.

Question about Scripting in PostgreSQL.

From
Nico King
Date:
Here is the problem: I have some key tables that I need
to import some data into. I can't go ahead and
write "INSERT INTO table VALUES (...)" for over 40 different
tables and hundreds of rows and columns.

The reason that I have to write a script to enter the
data into the tables is: what if I have to enter
1000 lines of data into 200 rows?
Here is a piece of my script that works, but not when I
enter, let's say, a char instead of an integer.
=========================================================
copy accounts from stdin using delimiters ',';
1,pass,mac,,,
2,pass2,mac2,ip,test
0,pass2,mac2,ip,test2
\.
=======================================================
P.S.: I have also tried the tab delimiter.

I have written a script to import some data into
my database tables with the delimiter ','. Now my
question is: sometimes the data being sent to my tables
might not match the column's data type, or may be
corrupted, and I receive an error message.
One: how could I prevent that?


Two: how can I proceed with importing the rest of the
data into the next record even though some rows are
corrupted, because I get interrupted as soon as there is
an error in inserting the data?




Re: Question about Scripting in PostgreSQL.

From
Nico King
Date:
Well, let's say that some values get corrupted!
What Postgres does is stop the process; I want it to ignore the error, continue importing the rest of the data into my tables, and send the error to a log file.
How could this be done?
I really need to find a way; any suggestions are welcome. I even thought of guarding the data, but that is still dangerous when dealing with tons of data.
Thanks

Richard Huxton <dev@archonet.com>, pgsql-hackers@postgresql.org wrote:
On Friday 05 September 2003 00:24, Nico King wrote:
[moving this to pgsql-general]
> The reason that I have to write a script to enter the
> data into the tables is: what if I have to enter
> 1000 lines of data into 200 rows?
> Here is a piece of my script that works, but not when I
> enter, let's say, a char instead of an integer.
> =========================================================
> copy accounts from stdin using delimiters ',';
> 1,pass,mac,,,
> 2,pass2,mac2,ip,test
> 0,pass2,mac2,ip,test2
> \.
> =======================================================

Sorry - I don't understand. Assuming your values are the right type for the
columns, that looks OK to me.

> I have written a script to import some data into
> my database tables with the delimiter ','. Now my
> question is: sometimes the data being sent to my tables
> might not match the column's data type, or may be
> corrupted, and I receive an error message.
> One: how could I prevent that?

Don't try to put bad data into the batch. COPY is designed so that if you
automate importing batches of data, the operation isn't left in some half-done
state.
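One way to keep bad data out of the batch is to pre-validate each line before it ever reaches COPY. Below is a rough Python sketch (the thread itself suggests Perl); the five-column layout and the integer first field are assumptions based on the accounts example above, not the real schema:

```python
def split_rows(lines, n_cols=5):
    """Split raw comma-delimited lines into (good, bad) lists.

    A line counts as 'bad' if it has the wrong number of columns or
    its first field is not an integer (the assumed type of column 1).
    """
    good, bad = [], []
    for line in lines:
        fields = line.rstrip("\n").split(",")
        if len(fields) != n_cols or not fields[0].lstrip("-").isdigit():
            bad.append(line)   # write these to a log instead of COPY
        else:
            good.append(line)
    return good, bad
```

The good list can then be fed to COPY unchanged, and the bad list written to a log file for inspection.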

> Two: how can I proceed with importing the rest of the
> data into the next record even though some rows are
> corrupted, because I get interrupted as soon as there is
> an error in inserting the data?

Sounds like you want to write a small Perl script to take your data, strip out
anything obviously bad, and then insert it in batches. If you have a lot of
bad data you can do it one row at a time; if not, transactions of, say, 100 rows
at a time might be better.
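The batch-then-retry idea above can be sketched as follows. This is an illustrative Python version (the reply suggests Perl), and `insert_rows` is a hypothetical callable standing in for whatever performs one transactional COPY or multi-row INSERT:

```python
def import_with_fallback(rows, insert_rows, batch_size=100):
    """Insert rows in batches; on a batch failure, retry that batch
    one row at a time so only the genuinely bad rows are skipped.
    Returns the rows that could not be inserted, for logging."""
    failed = []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            insert_rows(batch)          # one transaction per batch
        except Exception:
            for row in batch:           # batch aborted: retry row by row
                try:
                    insert_rows([row])
                except Exception:
                    failed.append(row)  # bad row: record it, keep going
    return failed
```

With mostly clean data this pays the per-row cost only for batches that actually contain a bad row, and the returned list can be written to a log file.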

--
Richard Huxton
Archonet Ltd
