Frank,
I had the same question a while ago, and another thing that made me think about it was the amount of data per user.
In the end, I decided on a single DB with a single schema, adding a column (customer_id) and a WHERE clause to split everything by customer.
I then added an index on that column, and my code became simpler and fast enough.
This also let me compute some other aggregates that provide very useful "global" statistics.
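As a rough sketch of that layout (table and column names here are made up for illustration, not from my actual schema):

```sql
-- One shared table for all tenants; customer_id discriminates the rows.
CREATE TABLE orders (
    id          serial PRIMARY KEY,
    customer_id integer NOT NULL,
    created_at  timestamptz NOT NULL DEFAULT now(),
    amount      numeric(10,2) NOT NULL
);

-- The index on the tenant column keeps per-customer queries fast.
CREATE INDEX orders_customer_id_idx ON orders (customer_id);

-- Every per-tenant query filters on customer_id...
SELECT * FROM orders WHERE customer_id = 42;

-- ...while the "global" statistics simply drop or group by the filter.
SELECT customer_id, sum(amount) AS total
  FROM orders
 GROUP BY customer_id;
```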
--
Jorge Godoy <jgodoy@gmail.com>
On Sun, Nov 15, 2009 at 04:28, undisclosed user <lovetodrinkpepsi@gmail.com> wrote:
Hello everyone,
I have hit a wall completing a solution I am working on. Originally, the app used a DB per user (on MyISAM)... that approach did not fare so well in reliability and performance. I have been increasingly interested in Postgres lately.
Currently, I have about 30-35k users/databases. The general table layout is the same; only the data is different. I don't need to share data across databases. Very similar to a multi-tenant design.
Here are a few questions I have:
1. Could Postgres support this many DBs? Are there any weird things that happen when Postgres is used this way?
2. Is the schema-per-tenant method better? Performance, maintainability, backups, vacuum? Any weird issues?
Any insight is greatly appreciated.
Thanks.
Frank