A basic question on handling a large number of concurrent requests against a DB.
I have a cloud service that can receive a large number of requests, which obviously triggers DB operations.
Every DB has some maximum connection limit, which can be exhausted under a large number of requests.
I know DB connection pooling can be used to reuse connections, but that alone does not help when the number of active concurrent connections exceeds the pool. My queries are already optimised and short-lived.
PgBouncer, as I understand it, is a proxy that has to be installed separately on the web or DB server.
I was wondering whether normal client-side DB connection pooling libraries like Apache DBCP can also provide similar connection queuing while running inside the application runtime.
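To make the "queuing" part concrete, here is a minimal, self-contained sketch of the behaviour I mean, using only `java.util.concurrent` (not DBCP itself): a `Semaphore` caps the "pool" at 2 connections, and the extra requests simply wait for a free slot instead of failing. My understanding is that DBCP's `BasicDataSource` offers something similar via `maxTotal` and `maxWaitMillis`, but I would like confirmation that it really queues callers this way.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.Semaphore;

// Toy sketch (not DBCP): a "pool" of 2 connections guarded by a Semaphore.
// Five concurrent requests do not fail when the pool is exhausted -- the
// extra ones block in acquire() until a permit (connection) is returned,
// which is the queuing behaviour I am asking about.
public class PoolQueueDemo {
    static final Semaphore connections = new Semaphore(2); // pool size = 2

    static String runQuery(int id) throws InterruptedException {
        connections.acquire();        // blocks (queues) if pool is exhausted
        try {
            Thread.sleep(50);         // stand-in for a short, optimised query
            return "done-" + id;
        } finally {
            connections.release();    // hand the connection back to the pool
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutorService ex = Executors.newFixedThreadPool(5);
        List<Future<String>> results = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            final int id = i;
            results.add(ex.submit(() -> runQuery(id)));
        }
        int completed = 0;
        for (Future<String> f : results)
            if (f.get().startsWith("done-")) completed++;
        ex.shutdown();
        System.out.println("completed=" + completed); // all 5 eventually finish
    }
}
```

So in effect: is a DBCP-style pool with a bounded size plus a wait timeout equivalent, from the application's point of view, to the queuing PgBouncer does in front of the DB?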