I have a table that is used to control documents:

 recno                | integer                 | not null default nextval('document_recno_seq'::regclass)
 foreigntablename     | character varying(255)  | not null
 foreignrecno         | integer                 | not null
 docforeigntablename  | character varying(255)  | not null
 docforeignrecno      | integer                 | not null
 docforeigntext       | character varying(255)  | not null
 documenttyperecno    | integer                 | not null
 version              | character varying(20)   | not null
 loadeddate           | date                    | not null
 loadedtime           | time without time zone  | not null
 usersrecno           | integer                 | not null
 title                | character varying(40)   | not null
 description          | character varying(255)  |
 doculookupcodesrecno | integer                 | not null
 document             | oid                     | not null
 suffix               | character varying(255)  |
Each document is loaded using the large object commands, and the OID returned by the load is then inserted as a reference number in the document table. This had been working fine, but I had a major crash on the server and it had to be rebuilt. The database had been exported with pg_dump, so I reimported it with pg_restore.

If I now look at the OIDs of the blobs (i.e. document.document), the numbers are in the range:

3159553408
3159553409
3159553410
3159553411
3159553412
3159553413
3159553414
3159553415

These numbers are above what the field I use for the OID in my code can hold (an integer, catering for about 2 billion), so the program no longer finds the documents to export.

Any ideas as to why these OIDs became so large after reimporting? (I am assuming they must have been under 2 billion before, or the process could not have worked.) And what limit is there on this OID?
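For reference, the load step is essentially equivalent to the following (a minimal sketch only: my real code goes through the client-side large object API rather than server-side lo_import(), and everything except the table and column names from the schema above -- the 'orders' values, the recnos, the file path -- is a placeholder). The second query is how I am checking the restored values:

    -- Server-side lo_import() reads a file on the database server,
    -- creates a large object, and returns its OID; that OID is what
    -- gets stored in document.document.
    INSERT INTO document
        (foreigntablename, foreignrecno,
         docforeigntablename, docforeignrecno, docforeigntext,
         documenttyperecno, version, loadeddate, loadedtime,
         usersrecno, title, doculookupcodesrecno, document)
    VALUES
        ('orders', 1,
         'orders', 1, 'Order 1 confirmation',
         1, '1.0', current_date, localtime,
         1, 'Order confirmation', 1,
         lo_import('/tmp/example.pdf'));

    -- Casting oid to bigint avoids the same signed 32-bit overflow on
    -- the client side, so this lists the documents my integer-based
    -- code can no longer reach (2147483647 = 2^31 - 1):
    SELECT recno, document::bigint AS document_oid
    FROM document
    WHERE document::bigint > 2147483647;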