On 12/5/2017 2:09 PM, Martin Mueller wrote:
Time is not really a problem for me, if we talk about hours rather than days. On a roughly comparable machine I've made backups of databases under 10 GB, and it was a matter of minutes. But I know that there are scale problems: sometimes programs just hang once the data grow beyond some size. Is that likely in Postgres if you go from ~10 GB to ~100 GB? There isn't any interdependence among my tables beyond queries I construct on the fly, because I use the database in a single-user environment.
another factor is restore time. restores have to rebuild indexes, and creating indexes on multi-million-row tables can take a while. (hint: be sure to set maintenance_work_mem to 1GB before doing this!)
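a minimal sketch of that hint, using the libpq PGOPTIONS environment variable so the larger maintenance_work_mem applies only to the restore session (the dump file name and database name below are placeholders, not from the thread):

```shell
# Raise maintenance_work_mem for the restore connections only, so index
# builds during the restore get more sort memory without changing the
# server-wide default in postgresql.conf.
# "mydb" and "mydump.dump" are hypothetical names for illustration.
PGOPTIONS='-c maintenance_work_mem=1GB' pg_restore -d mydb -j 4 mydump.dump
```

the same effect can be had inside a psql session with `SET maintenance_work_mem = '1GB';` before replaying a plain-SQL dump; with `pg_restore -j`, the PGOPTIONS setting is applied to each parallel worker connection.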
--
john r pierce, recycling bits in santa cruz