Thread: server closed the connection unexpectedly
Hi:

I'm getting this error when accessing a table with a certain WHERE condition:

"server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
The connection to the server was lost. Attempting reset: Failed."

I've read through the posts but found no answer to the problem.

When I run VACUUM ANALYZE on the table I get the same error.

I have dropped and re-created the indexes.

Version is
"PostgreSQL 7.2.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.2
20020903 (Red Hat Linux 8.0 3.2-7)"

Any help on how to recover the table would be greatly appreciated.

Thanks, Ruben.
Can you use pg_dump to back up the database and then possibly upgrade it? 7.2 is rather old. I recall reading similar postings on the mailing list recommending that you upgrade to 7.2.4 or later if you must stay on 7.2.

On Mon, 2004-07-05 at 17:48, ruben wrote:
> I'm getting this error when accessing a table with certain WHERE condition:
>
> "server closed the connection unexpectedly
> [...]
Hi Mike:

Thanks for your answer, I'm unable to dump the table:

-bash-2.05b$ pg_dump -Fc -t afected_table database_name -f ./afected_table.dump
pg_dump: connection not open
pg_dump: lost synchronization with server, resetting connection
pg_dump: SQL command to dump the contents of table "afected_table" failed: PQendcopy() failed.
pg_dump: Error message from server: FATAL 1: The database system is starting up

Ruben.

mike g wrote:
> Can you use pg_dump to backup the database and possibly then upgrade the
> db? 7.2 is rather old.
> [...]
Ok,

Other suggestions:

1) Have you done a VACUUM FULL on the table? That should reduce the table size and the resources required.

2) Use psql to dump the table instead of pg_dump. In psql, do a COPY affected_table TO '/file_name'. That will write the table out as a tab-delimited text file, which can later be re-imported using COPY affected_table FROM '/file_name'. COPY BINARY affected_table TO '/file_name' could be considered as well.

3) Create a few smaller tables with the same data definitions, then insert specific sections of the table into each smaller table via
Insert into one_smaller_table
Select * from affected_table where date between X and Y

HTH

On Tue, 2004-07-06 at 10:43, ruben wrote:
> Thanks for your answer, I'm unable to dump the table:
>
> -bash-2.05b$ pg_dump -Fc -t afected_table database_name -f
> ./afected_table.dump
> [...]
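[Editor's note: suggestions 2) and 3) above could look like the following in psql. This is only a sketch: affected_table, the file paths, the date column, and the date bounds are placeholders, and COPY in 7.2 writes server-side files as the superuser.]

```sql
-- 2) Dump the table to a tab-delimited file, then re-import it later
--    (7.2-era COPY syntax: no column list, absolute server-side path):
COPY affected_table TO '/tmp/affected_table.txt';
COPY affected_table FROM '/tmp/affected_table.txt';

-- 3) Copy the data out in smaller sections; if one range hits the
--    damaged block and crashes the backend, the other ranges may
--    still come through intact:
CREATE TABLE one_smaller_table AS
  SELECT * FROM affected_table WHERE false;   -- clone the structure
INSERT INTO one_smaller_table
  SELECT * FROM affected_table
  WHERE date BETWEEN '2004-01-01' AND '2004-03-31';
```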
Hi again Mike:

mike g wrote:
> 1) Have you done a VACUUM FULL on the table? That should reduce the
> table size and the resources required.

The first thing I tried was VACUUM on the table, but I got the same error:

This probably means the server terminated abnormally
before or while processing the request.
The connection to the server was lost. Attempting reset: Failed.

> 2) Use psql to dump the table instead of pg_dump. In psql, do a COPY
> affected_table TO '/file_name'.
> [...]

I could not try this, since I had already solved the problem by saving partial contents of the table, recreating it and re-inserting. In the end I only lost 300 tuples :-(

> 3) Create a few smaller tables with the same data definitions, then
> insert specific sections of the table into each smaller table
> [...]

What worries me most, besides recovering this table, is why this could have happened and how to avoid it. I guess it is due to a hardware problem, and the only solution is a frequent backup.

Thanks for your help, Mike.

Ruben.
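[Editor's note: the partial recovery Ruben describes can be sketched as below. The table name, key column, and range boundaries are hypothetical; the idea is to copy out the ranges that still read cleanly, shrinking any range that crashes the backend until the unreadable rows are isolated.]

```sql
-- Ranges that scan without crashing are copied into a fresh table:
CREATE TABLE recovered AS
  SELECT * FROM afected_table WHERE id < 500000;    -- reads cleanly
INSERT INTO recovered
  SELECT * FROM afected_table WHERE id >= 500100;   -- reads cleanly
-- A range that crashes the backend is split and retried until the
-- bad tuples are cornered; those rows (about 300 here) are lost.
```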