I am working on a fairly small application to use for managing a company's
business.
I have a "production" instance hosted by one of the cloud providers, and 2
other instances. This is fairly new to me. In the past, I have created
applications by keeping a set of scripts that can be used to rebuild the
database, and used pg_dump to restore the data. Based on some recommendations
I am using pg_basebackup to back up the production instance nightly. My
background is primarily Oracle. Looking at the way pg_basebackup works, I
realize that multiple databases served by one PostgreSQL instance are
actually stored in the same physical OS files (the cluster's data directory).
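For what it's worth, the nightly backup is roughly the following (host,
user, and target directory are placeholders, not my real setup). Note that
it captures the entire cluster, every database in it, not just one:

```shell
# Nightly base backup of the whole cluster (all databases in the instance).
# Connection details and paths below are illustrative placeholders.
pg_basebackup \
    --host=prod.example.com \
    --username=replication_user \
    --pgdata=/backups/base/$(date +%F) \
    --format=tar \
    --gzip \
    --checkpoint=fast \
    --progress
```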
We have traditionally (in the Postgres world) had a sandbox that we used
for upgrades and for testing development methodologies, and this seems to
be supported pretty well by pg_dump.
Now that I know "too much" I am concerned about hosting the sandbox on the
same Postgres instance.
Recognizing that this is a fairly small application, what do wiser folks
than I recommend?
Should I run the sandbox on a separate Postgres server, possibly even on a
different machine? Is pg_dump still a good way to move the production
database to the sandbox, and perhaps even the other way around?
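Concretely, what I have in mind is something like the following, where the
hostnames and database name are placeholders; unlike pg_basebackup, this
copies just the one database, which seems like what a sandbox wants:

```shell
# Dump one database from production and load it into the sandbox instance.
# Hostnames and the database name are illustrative placeholders.
pg_dump --host=prod.example.com --format=custom --file=app.dump app_db

# --clean --if-exists drops existing objects in the sandbox copy first,
# so the restore can be re-run to refresh the sandbox.
pg_restore --host=sandbox.example.com --clean --if-exists \
    --dbname=app_db app.dump
```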
--
"They that would give up essential liberty for temporary safety deserve
neither liberty nor safety."
-- Benjamin Franklin