Thread: data format problem for upgrade from 7.1->7.3
Hi: I just upgraded from 7.1 to 7.3, but did not take a data backup. Now when I try to start the PostgreSQL server I get the message:

An old version of the database format was found.
You need to upgrade the data format before using PostgreSQL.
See (Your System's documentation directory)/postgresql-7.3/README.rpm-dist for more information.

There is not much info on the format change in the docs. Any idea what the resolution might be here?

Thanks,
Gautam
Gautam Saha <gsaha@imsa.edu> writes:
> Any idea what might be the resolution here?

Put back your 7.1 executables and make a pg_dumpall run before you upgrade. Then upgrade, initdb, reload the dump.

			regards, tom lane
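Tom's procedure, as a rough shell sketch. The paths, data directory location, and dump file name are assumptions (adjust for your distribution); the point is the order: dump with the *old* binaries, initdb with the *new* ones, then reload.

```shell
# 1. With the 7.1 binaries back in place and the 7.1 server running:
pg_dumpall > /tmp/pg71_dump.sql          # dumps all databases, schemas, and users

# 2. Stop the old server and move the old cluster aside (keep it as a fallback):
pg_ctl -D /var/lib/pgsql/data stop
mv /var/lib/pgsql/data /var/lib/pgsql/data.7.1

# 3. Install 7.3, create a fresh cluster with the new format, and start it:
initdb -D /var/lib/pgsql/data
pg_ctl -D /var/lib/pgsql/data -l /tmp/pg.log start

# 4. Reload the dump into the new server:
psql -f /tmp/pg71_dump.sql template1
```

Since the 7.1 data directory is only moved aside, not deleted, you can fall back to the old binaries and old cluster if the reload fails.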
Hello admins

Tom Lane once wrote, if I remember well, that the biggest PostgreSQL DB he knows of is about 4 TB! In Oracle we normally say it is "not possible" (= not useful) to exp/imp more than, let's say, 20 to 50 GB of data. This is more or less the same as what we do with pg_dumpall, is it not? Oracle has not changed the file format since version 8.0 (or does it do the conversion itself in the background?). Now I would like to know:

* Experience of upgrading PostgreSQL DBs bigger than 100 GB?
* Recommendations, resp. what do the pg-kernel hackers think, when they "always" change the file format?
* Is there an upgrade path instead of pg_dump?
* How can we be a "professional high-end" database if we have to pg_dump/load the data every year because of a new release?

Thanks for feedback and discussion
Oli

Tom Lane wrote:
> Gautam Saha <gsaha@imsa.edu> writes:
>> Any idea what might be the resolution here?
>
> Put back your 7.1 executables and make a pg_dumpall run before you
> upgrade. Then upgrade, initdb, reload the dump.
>
> 			regards, tom lane

--
-------------------------------------------------------
Oli Sennhauser
Database-Engineer (Oracle & PostgreSQL)
Rebenweg 6
CH - 8610 Uster / Switzerland

Phone (+41) 1 940 24 82 or Mobile (+41) 79 450 49 14
e-Mail oli.sennhauser@bluewin.ch
Website http://mypage.bluewin.ch/shinguz/PostgreSQL/

Secure (signed/encrypted) e-Mail with a Free Personal SwissSign ID: http://www.swisssign.ch
Import the SwissSign Root Certificate: http://swisssign.net/cgi-bin/trust/import
Thanks Jonathan and Kaolin Fire

Technically it is clear to me how it works. But on a Sun E10000 it took us hours to exp/imp 20-30 GB of data, and I do not think we would have been able to exp/imp 1 TB. If somebody tells me now that I have to do this several times a year (how can I sell this to a customer?)... It is not a problem handling a Mickey Mouse database, but dumping/loading 100 GB would, I guess, take me more than 12 h; doing that 2 or 4 times a year makes an availability of less than 99%. And if I have 1 TB...?

Regards
Oli

PS: It is not a technical question/problem, it is a strategic/marketing question/problem.

7.0.0  2000-05-08  A dump/restore using pg_dump is required
7.1.0  2001-04-13  A dump/restore using pg_dump is required  (11 months)
7.2.0  2002-02-04  A dump/restore using pg_dump is required  (10 months)
7.3.0  2002-11-27  A dump/restore using pg_dump is required  (9 months)
7.4.0  2003-11-xx  A dump/restore using pg_dump is required  (12 months)
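The availability claim above can be sanity-checked with quick arithmetic. The dump times below (12 h and 24 h) are assumptions derived from the "more than 12 h" estimate, not measured figures:

```shell
# Availability = (hours per year - dump/reload downtime) / hours per year.
# hours_per_dump values are assumptions based on the "more than 12 h" estimate.
hours_per_year=8760
for hours_per_dump in 12 24; do
  for dumps_per_year in 2 4; do
    awk -v h="$hours_per_dump" -v n="$dumps_per_year" -v y="$hours_per_year" \
      'BEGIN { printf "%d dumps/year x %d h: availability = %.2f%%\n", n, h, 100*(y-n*h)/y }'
  done
done
```

For example, 4 dumps a year at 12 h each gives (8760 - 48) / 8760 ≈ 99.45%, while 4 dumps at 24 h each gives (8760 - 96) / 8760 ≈ 98.90% — so "less than 99%" holds only at the pessimistic end of the estimate.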
On Thu, Nov 06, 2003 at 00:27:03 +0100, Oli Sennhauser <oli.sennhauser@bluewin.ch> wrote:
>
> If somebody is telling me now, I have to do this several times a year
> (how can I sell this to a customer???)... It is not a problem handling a
> micky-mouse database. dump/load 100 GB I would gess it takes me more
> than 12 h, 2 or 4 times a year makes an availability of less than 99%.
> If I have 1 TB...???

More like once a year. Only upgrades between major releases (e.g. 7.3.x to 7.4.x) require a dump and reload. People are looking at trying to do something about this, but it won't happen for 7.4. Maybe by the time 7.5 is released there will be a better way to upgrade.
On Thu, Nov 06, 2003 at 12:27:03AM +0100, Oli Sennhauser wrote:
>
> If somebody is telling me now, I have to do this several times a year
> (how can I sell this to a customer???)... It is not a problem handling a
> micky-mouse database. dump/load 100 GB I would gess it takes me more
> than 12 h, 2 or 4 times a year makes an availability of less than 99%.
> If I have 1 TB...???

You can use one of the replication systems, dump/restore to a second copy of the database (over many hours, yes), start replicating into that newly-restored database, and then take your outage just long enough to catch up any changes. Of course, you need twice the disk, which is a pain, but it's certainly not impossible.

A

--
----
Andrew Sullivan                         204-4141 Yonge Street
Afilias Canada                        Toronto, Ontario Canada
<andrew@libertyrms.info>                              M2P 2A8
                                        +1 416 646 3304 x110
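Andrew's scheme, sketched in shell under heavy assumptions: the host names and the replication tooling are placeholders (no specific replication system is named above; trigger-based solutions existed for 7.x), so treat this as an outline, not a tested recipe.

```shell
# 1. Long-running step; the old server stays in service while it runs.
#    "oldhost"/"newhost" are placeholder names, not real hosts.
pg_dumpall -h oldhost | psql -h newhost template1

# 2. Start replicating changes oldhost -> newhost until the copy catches up.
#    (Tool-specific: a trigger-based replication system captures the writes
#    that happened on oldhost during and after the dump.)

# 3. Short planned outage: stop writes on oldhost, let replication drain
#    the last changes, then repoint clients at newhost.
```

The outage in step 3 lasts only as long as it takes to apply the final backlog of changes, which is why the hours-long dump/restore in step 1 no longer counts as downtime.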