> > I need to give community access directly to the database, so I need
> > to impose some sane controls. As I can't hold any of that class of
> > users accountable, I need to impose the limits in the software.
>
> For comparison, I know that MySQL allows you to set the max queries
> per hour for a particular user...
Which is completely useless. A single table with a few thousand
records joined to itself 40 or so times as a Cartesian product will
completely wipe out most MySQL boxes, and that's only one query :)
PostgreSQL has the same problem.
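To see how fast that blows up: every unconstrained self-join multiplies the row count by the table size, so even a tiny table gets out of hand after a few copies. A minimal sketch using Python's sqlite3 as a stand-in for any SQL engine:

```python
import sqlite3

# In-memory table with just 100 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(100)])

# Each unconstrained self-join multiplies the result by 100:
# three copies already yield 100^3 = 1,000,000 rows.
(count,) = conn.execute("SELECT count(*) FROM t a, t b, t c").fetchone()
print(count)  # 1000000 -- with 40 copies it would be 100^40 rows
```

With a few thousand rows instead of 100, even the three-way version is painful, and a 40-way join is astronomically beyond any machine.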
What would be interesting is the ability to limit a query to a maximum
cost of X. This requires knowing the cost of a function call, which
the planner currently estimates as zero.
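One way such a limit could work is to take the planner's estimated total cost (the second number in an EXPLAIN `cost=..` pair) and refuse to execute above a per-user ceiling. A hypothetical sketch -- the `MAX_COST` ceiling and the `allowed` helper are assumptions for illustration, not an existing PostgreSQL feature:

```python
import re

MAX_COST = 100000.0  # hypothetical per-user cost ceiling

def cost_from_explain(plan_line: str) -> float:
    """Extract the estimated total cost from an EXPLAIN top line,
    e.g.  Seq Scan on t  (cost=0.00..155.00 rows=10000 width=4)
    """
    m = re.search(r"cost=[\d.]+\.\.([\d.]+)", plan_line)
    if m is None:
        raise ValueError("no cost estimate in plan")
    return float(m.group(1))

def allowed(plan_line: str, ceiling: float = MAX_COST) -> bool:
    # Refuse to run the query if the planner's estimate exceeds the ceiling.
    return cost_from_explain(plan_line) <= ceiling

print(allowed("Seq Scan on t  (cost=0.00..155.00 rows=10000 width=4)"))  # True
print(allowed("Nested Loop  (cost=0.00..12345678.00 rows=1 width=4)"))   # False
```

The weakness is exactly the one noted above: if function calls are costed at zero, an expensive user-defined function hides from the estimate entirely.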
Perhaps even 'nice' the processes running for certain users / groups.
Offline or background processes could be niced to 20 so that
interactive sessions run unnoticed. That's not so good if the
background process locks a bunch of stuff, though -- so it may be more
trouble than it's worth.
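The nicing idea above can be sketched with os.nice(), which a backend serving a low-priority user could call on itself; raising niceness needs no privileges (lowering it does):

```python
import os

def background_worker() -> int:
    # Raise our own niceness so interactive sessions win the CPU.
    # os.nice() adds to the current value and returns the new value;
    # an unprivileged process may only increase it, never decrease.
    new_nice = os.nice(10)
    # ... the batch work would run here, at lower CPU priority ...
    return new_nice

print(background_worker())
```

Note this only deprioritises CPU; as the caveat above says, it does nothing about locks the niced process is still holding, and a lock held by a starved process can stall everyone waiting on it.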
Either way, the user can simply attempt to make thousands of database
connections, which would effectively block anyone else from connecting
at all.
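Guarding against that would mean capping concurrent connections per user at admission time. A sketch of the bookkeeping such a cap needs -- the names and the limit are hypothetical, not an existing server knob:

```python
from collections import defaultdict
from threading import Lock

MAX_CONNS_PER_USER = 5  # hypothetical per-user ceiling

_lock = Lock()
_active = defaultdict(int)  # user -> number of open connections

def try_connect(user: str) -> bool:
    """Admit the connection only if the user is under the cap."""
    with _lock:
        if _active[user] >= MAX_CONNS_PER_USER:
            return False
        _active[user] += 1
        return True

def disconnect(user: str) -> None:
    with _lock:
        _active[user] = max(0, _active[user] - 1)

# The sixth simultaneous connection for one user is refused:
results = [try_connect("web") for _ in range(6)]
print(results)  # [True, True, True, True, True, False]
```

A per-user cap like this stops one community user from exhausting the connection slots, while leaving the global limit available for everyone else.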