how to continue after error in batch mode with psql - Mailing list pgsql-novice
From | Urs Rau
---|---
Subject | how to continue after error in batch mode with psql
Date |
Msg-id | 4BD85B49.2010206@uk.om.org
Responses | Re: how to continue after error in batch mode with psql
List | pgsql-novice
I have a nightly process that pulls data into PostgreSQL 8.3.10 from a Progress server running under 8.x. Unfortunately, the Progress db does not enforce record size/length limitations; it does not care if a record of 20 characters contains 21, or 100 for that matter.

We have a simple pipe that takes the Progress data export dump and imports it into PostgreSQL:

su - postgres -c "cat test.csv | psql -X -q test -c \"COPY t_test FROM stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"' ESCAPE AS '\\\"'\""

If any of the fields are over-length, we do get an error message that tells us which row needs fixing:

ERROR:  value too long for type character varying(20)
CONTEXT:  COPY t_test, line 2, column t_comment_c: "'comment 3 that is too long'"

But how do we get psql to run through and continue after an error, import as much as possible of the rest of the import data, and give us error messages about all lines with errors? So I want:

ERROR:  value too long for type character varying(20)
CONTEXT:  COPY t_test, line 2, column t_comment_c: "'comment 3 that is too long'"
ERROR:  value too long for type character varying(20)
CONTEXT:  COPY t_test, line 4, column t_comment_c: "'comment 5 that is too long'"
ERROR:  value too long for type character varying(20)
CONTEXT:  COPY t_test, line 6, column t_comment_c: "'comment 7 that is too long'"
ERROR:  value too long for type character varying(20)
CONTEXT:  COPY t_test, line 8, column t_comment_c: "'comment 9 that is too long'"

I have tried adding the following to psql, but that did not make any difference.
"-v ON_ERROR_STOP" "-v ON_ERROR_STOP=0" "--set "ON_ERROR_STOP" "--set "ON_ERROR_STOP=0" Here is my test database definition, followed by my test.csv data: SET client_encoding = 'UTF8'; SET standard_conforming_strings = on; SET check_function_bodies = false; SET client_min_messages = warning; SET search_path = public, pg_catalog; SET default_tablespace = ''; SET default_with_oids = false; -- -- Name: t_test; Type: TABLE; Schema: public; Owner: postgres; Tablespace: -- CREATE TABLE t_test ( t_record_i integer DEFAULT 0 NOT NULL, t_comment_c character varying(20) ); ALTER TABLE public.t_test OWNER TO postgres; -- -- Data for Name: t_test; Type: TABLE DATA; Schema: public; Owner: postgres -- INSERT INTO t_test VALUES (1, 'comment 1'); -- -- Name: t_table_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres; Tablespace: -- ALTER TABLE ONLY t_test ADD CONSTRAINT t_table_pkey PRIMARY KEY (t_record_i); -- -- Name: t_test; Type: ACL; Schema: public; Owner: postgres -- REVOKE ALL ON TABLE t_test FROM PUBLIC; REVOKE ALL ON TABLE t_test FROM postgres; GRANT ALL ON TABLE t_test TO postgres; and my test.csv data: 2,'comment 2' 3,'comment 3 that is too long' 4,'comment 4' 5,'comment 5 that is too long' 6,'comment 6' 7,'comment 7 that is too long' 8,'comment 8' 9,'comment 9 that is too long' 10,'comment 10' What I am hoping for is to get an error message about the rows 3,5,7 and 9 (and - if I am lucky all the data up to the defined length in the record) and definitely all the data for row 2,4,6,8 and 10 in the t_test table in the test database. What does it take, top get this working? I am a newbie, am I barking up the wrong tree with trying to unset ON_ERROR_STOP? Many thanks for any hints. Regards, -- Urs Rau