I'm doing some forensic work on a Postgres 9.5 database that is in testing with only 19 relations but over 100,000 sequences. Once this database goes live, it could see in excess of one million sequences created, due to the complexity of the application. There are obvious risks, such as pg_dump slowdowns and slow responses when scanning the catalog tables for information. But are there any serious issues that can show up in this situation? I know that, theoretically, Postgres can have an unlimited number of tables in a database, but I am looking for some realistic worst-case scenarios in an environment like the one described.
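For anyone who wants to measure the same thing, a sketch of the kind of catalog queries involved (standard pg_catalog views only; in 9.5 every sequence is its own relation with relkind 'S', so each one adds rows to pg_class, pg_attribute, and pg_depend plus a one-page relation file on disk):

```sql
-- Count sequences per schema (relkind 'S' marks a sequence in pg_class)
SELECT n.nspname AS schema, count(*) AS sequences
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind = 'S'
GROUP BY n.nspname
ORDER BY sequences DESC;

-- Size of the catalog tables that grow with every sequence created
SELECT pg_size_pretty(pg_total_relation_size('pg_class'))     AS pg_class_size,
       pg_size_pretty(pg_total_relation_size('pg_attribute')) AS pg_attribute_size,
       pg_size_pretty(pg_total_relation_size('pg_depend'))    AS pg_depend_size;
```

With a million sequences, the catalog bloat from those rows (and the million one-page files on disk) is where I'd expect the pg_dump and catalog-scan pain to come from.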
--
Thanks,
Jorge Torralba