query and pg_dump problem on my postgresql 6.5.3/Redhat 6.2 - Mailing list pgsql-general
From | 吴德文 |
---|---|
Subject | query and pg_dump problem on my postgresql 6.5.3/Redhat 6.2 |
Date | |
Msg-id | 200312041501.hB4F1g5r014577@staff.xmu.edu.cn |
Responses | Re: query and pg_dump problem on my postgresql 6.5.3/Redhat 6.2 |
 | Re: query and pg_dump problem on my postgresql 6.5.3/Redhat 6.2 |
List | pgsql-general |
Help!

A few days ago, my PHP page began to complain with this:
------
Warning: PostgreSQL query failed: pqReadData() -- backend closed the
channel unexpectedly. This probably means the backend terminated
abnormally before or while processing the request.
------

The SQL string in the PHP page is:
------
$sql.='Select news_id,title,summary,publish_time,is_html,if_use_url,url,news_pri ';
$sql.='From newses N,classes C ';
$sql.="Where N.class_id = C.class_id AND C.classname='$class' ";
$sql.='Order by publish_time Desc,news_id Desc Limit '.$Nlimit;
------

NOTE: I'm on Redhat 6.2 with PostgreSQL 6.5.3. The database is named "news",
and the table is "newses", which looks like this (dumped with
"pg_dump -s -t newses news"):

CREATE TABLE "newses" (
        "news_id" int4 DEFAULT nextval ( '"newses_news_id_seq"' ) NOT NULL,
        "title" character varying(100) NOT NULL,
        "class_id" int4 NOT NULL,
        "summary" text DEFAULT '',
        "user_id" int4 NOT NULL,
        "url" character varying(100),
        "img_url" character varying(100),
        "publish_time" date NOT NULL,
        "if_show_news" bool DEFAULT bool 'f' NOT NULL,
        "if_use_url" bool DEFAULT bool 'f' NOT NULL,
        "is_html" bool DEFAULT bool 'f' NOT NULL,
        "view_count" int4 DEFAULT 0 NOT NULL,
        "news_pri" int4);
CREATE UNIQUE INDEX "newses_pkey" on "newses" using btree ( "news_id" "int4_ops" );

This table has 243 records, and the max news_id is 253.

Later I found that queries like these fail in psql:

select news_id,title from newses order by news_id desc limit 10;
select count(news_id) from newses;

But these work fine:

select * from newses where news_id < 300;
select count(*) from newses where news_id < 300;
select count(news_id) from newses where news_id < 300;

The simple rule seems to be that whenever I run a query over the whole table
without a condition, I get the same error message mentioned above.

I thought my PostgreSQL should be patched or upgraded, so I began to back up
the database. But I found that pg_dump sometimes does not work on that very
table at all, and sometimes runs for a very long time and then errors out.
The following is the error message from
"pg_dump news -t newses -f newses-data.sql":
------
pqWait() -- connection not open
PQendcopy: resetting connection
SQL query to dump the contents of Table 'newses' did not execute correctly.
After we read all the table contents from the backend, PQendcopy() failed.
Explanation from backend: ''.
The query was: 'COPY "newses" TO stdout;
'.
------

I read the generated file (14M) and found that after the normal records (91K)
there are many characters like these:
------
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
280368896 \N \N 0 \N f f f 0 0
280368896 \N \N 0 \N f f f 0 0
280368896 \N \N 0 \N f f f 0 0
------
and it ends with:
------
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\N \N \N \N \N \N \N \N \N \N \N \N \N
\.
------

It is a nightmare for me now, because I can't get my data back. I have
googled around with no luck. Can anyone help me get the data back and tell me
what is going on?

Yours
Wind Wood
windwood@jingxian.xmu.edu.cn
2003-12-04
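P.S. Since the queries restricted on news_id still work, my rough idea was to
copy the readable rows into a scratch table and dump only that. This is just a
sketch of what I had in mind, not something I am sure is right; the table name
newses_save and the id ranges are only guesses:
------
-- copy the rows that can still be read into a scratch table
-- (the WHERE clause is the same kind of condition that still works in psql)
SELECT * INTO TABLE newses_save FROM newses WHERE news_id < 300;

-- if that also crashes the backend, try smaller ranges to isolate the bad part:
--   SELECT * INTO TABLE newses_save FROM newses WHERE news_id < 100;
--   INSERT INTO newses_save SELECT * FROM newses WHERE news_id >= 100 AND news_id < 200;
--   INSERT INTO newses_save SELECT * FROM newses WHERE news_id >= 200 AND news_id < 300;

-- then dump only the scratch table from the shell:
--   pg_dump news -t newses_save -f newses-save.sql
------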