================================================================
POSTGRESQL BUG REPORT TEMPLATE
================================================================
Your name : Adam Palmblad
Your email address : adampalmblad@yahoo.ca
System Configuration
---------------------
Architecture (example: Intel Pentium) : dual AMD Opteron 242 CPUs (AMD64)
Operating System (example: Linux 2.4.18) : Gentoo Linux, kernel 2.6.3-Gentoo-r2, XFS file system
PostgreSQL version (example: PostgreSQL-7.4.2): PostgreSQL-7.4.2 (64-bit compile)
Compiler used (example: gcc 2.95.2) : gcc 3.3.3
Please enter a FULL description of your problem:
------------------------------------------------
We are having a recurring problem with page corruption in our database. We
need to add over 3 million records a day, and we have found that page header
corruption errors start appearing after around 12-15 million records. These
errors show up both in tables and in indexes, and generally occur only in our
largest tables. This is a new server; some basic hardware tests were run when
it was set up, and they checked out okay. The data in these databases is
critical to our business, so having to rebuild a table and reinsert its data
every few days is not an acceptable solution.
Another error was just noted, reading as follows:

ERROR: Couldn't open segment 1 of relation: XXXX (target block 746874992): No such file or directory

(Assuming the default 8 kB page size, a target block of 746874992 would put
the relation at several terabytes, far larger than any of our tables, which
again suggests the block reference itself is corrupted rather than a file
genuinely being missing.)
Please describe a way to repeat the problem. Please try to provide a
concise reproducible example, if at all possible:
----------------------------------------------------------------------
Insert 15 million records into a table using the COPY command; we are running
COPY with files of 60,000 lines each to insert the data. Then run a VACUUM or
a similar operation that visits every page of the table. An invalid page
header error may occur, as in the sketch below.
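For illustration, the load-and-check cycle looks roughly like this (the table
and file names are placeholders, not our real schema):

    -- Bulk-load ~15 million rows, 60,000 lines per input file.
    COPY big_table FROM '/data/batch_0001.dat';
    COPY big_table FROM '/data/batch_0002.dat';
    -- ... repeated for roughly 250 batch files ...

    -- Visit every page of the table; this is where the corruption
    -- surfaces (error wording approximate):
    VACUUM big_table;
    -- ERROR:  invalid page header in block NNNN of relation "big_table"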
If you know how this problem might be fixed, list the solution below:
---------------------------------------------------------------------
Has anyone else had this problem? Would it be better for us to try a
different file system or kernel? Should PostgreSQL be recompiled in 32-bit
mode?