What methods of backup do you recommend for medium to large databases? In our example, we have a 20GB database and it takes 2 hrs to load from a pg_dump file.
We just tar/gzip the entire data directory. It takes all of 20 sec. We've successfully restored from that also. The machine you are restoring to *must* be running the same version of PostgreSQL you backed up from.
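For illustration only, a file-system level backup along those lines might look roughly like this (the paths are assumptions, not our actual layout; adjust for your installation):

    # Assumed locations
    PGDATA=/var/lib/pgsql/data
    BACKUP=/backups/pgdata.tar.gz

    # Back up the contents of the cluster's data directory
    tar czf "$BACKUP" -C "$PGDATA" .

    # Restore later into an empty data directory, on a machine
    # running the *same* PostgreSQL version
    tar xzf "$BACKUP" -C "$PGDATA"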
If you successfully backed up in 20 seconds, then you have a tiny DB. Also, if you successfully restored from that style of backup, your DB must not get written to much, or you were extremely lucky to get a consistent state.
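For what it's worth, the only way a plain file copy like that is guaranteed to be consistent is if the postmaster is shut down for the duration of the copy. A rough sketch, with the same assumed paths as above:

    # Stop the server so nothing is writing to the data directory
    pg_ctl -D /var/lib/pgsql/data stop

    # Copy the now-quiescent cluster
    tar czf /backups/pgdata.tar.gz -C /var/lib/pgsql/data .

    # Bring the server back up
    pg_ctl -D /var/lib/pgsql/data start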