Hello all
I have been watching with some interest the "problem" of not being able
to work with more than one database. My musings are aimed at creating an
incremental backup for a very low-usage database - say several hundred
transactions per day.
Back in my DOS days (around v3) I used the >> redirection operator,
which appends to an existing file.
If the pgsql command 'COPY TO' were extended, or a similar command
created, to allow appending, transactions could be written to delimited
text file(s) held on a dedicated drive.
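As a rough sketch of the idea using what already exists: COPY ... TO
STDOUT is real PostgreSQL/psql syntax, and the shell's >> does the
appending, much like under DOS. The database, table, and file names
below are purely illustrative:

```shell
# Hypothetical sketch: dump a table as delimited text and append it to a
# backup file on a dedicated drive. The appending happens in the shell,
# not inside PostgreSQL.
psql mydb -c "COPY txn_log TO STDOUT WITH DELIMITER '|'" >> /backup/txn_log.txt
```

This dumps the whole table each time rather than only new transactions,
so selecting just the unsaved rows would still need something extra.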
For my requirements, such a facility, coupled with regular backups,
would give me all the security I require. It would also provide a
partial solution for Stef Telford. Such a solution would require
regular maintenance, with the period depending on the rate of growth
of the files and the frequency of backups. It is definitely one for the
smaller system.
My knowledge of computer programming is very limited, so I cannot take
such an idea any further myself.
regards
Max Wood
From: "Stef Telford" <stef@Chronozon.dyndns.org>
> Hello everyone,
> I have hit on a limit in my knowledge and I am looking for
> some guidance. Currently I have two separate databases, one for
> live data, the other for historical data. The only difference really
> being that the historical data has a serial column in it so that the
> tables can keep more than one 'version history'.
>
> What I would like to do is, after my insert transaction to the
> live database, have the information also transferred to the
> historical one. Now, I can do this via Perl (and I have been doing it
> this way) using two database handles. This is rather clumsy, and
> I know there must be a 'better' or more 'elegant' solution.
Not really (AFAIK) - this crops up fairly regularly, but there's no way
to do a cross-database query.
You could use rules/triggers to set a "dirty" flag for each record that
needs copying - but it sounds like you're already doing that.
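A minimal sketch of that dirty-flag approach, assuming a live table
called live_orders with a boolean dirty column (all names here are
illustrative, not from the original posts):

```sql
-- Hypothetical sketch: flag every inserted/updated row as needing a
-- copy to the historical database.
CREATE FUNCTION mark_dirty() RETURNS trigger AS '
BEGIN
    NEW.dirty := true;
    RETURN NEW;
END;
' LANGUAGE plpgsql;

CREATE TRIGGER orders_mark_dirty
    BEFORE INSERT OR UPDATE ON live_orders
    FOR EACH ROW EXECUTE PROCEDURE mark_dirty();

-- A periodic external job (e.g. the existing Perl script) can then pull
-- the flagged rows over its second connection and clear the flag:
-- SELECT * FROM live_orders WHERE dirty;
-- UPDATE live_orders SET dirty = false WHERE dirty;
```

The cross-database copy itself still has to happen in the client with
two connections; the trigger only keeps track of what needs copying.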
If you wanted things to be more "real-time" you could look at
LISTEN/NOTIFY.
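For the "real-time" route, a rough sketch of how LISTEN/NOTIFY could
fit in (the channel name live_changed is illustrative):

```sql
-- On the live database, have the trigger or rule announce new data:
NOTIFY live_changed;

-- In the copying client's session (e.g. the Perl process, which keeps
-- a connection open to each database):
LISTEN live_changed;
-- When the notification arrives, the client copies the flagged rows to
-- the historical database and clears their dirty flags.
```

This avoids polling the live table on a timer, at the cost of the
client having to hold a connection open and watch for notifications.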
- Richard Huxton