On 2/4/15 12:13 PM, Stephen R. van den Berg wrote:
> If you know beforehand the query might generate more than one row (SELECT)
> yet you also know that you are not interested in those, then maxrows=1
> is best; then again, modifying the query to include a LIMIT 1 is even
> better, in which case maxrows can be zero again.
This seems to be a common pattern, and I think it's a *huge* mistake to
specify maxrows=1 and/or ignore rows after the first one in the driver
layer. If the user says "give me the only row returned by this query",
the interface should check that only one row is actually returned by
the query. If the query returns more than one row, the user made a
mistake in formulating the query, and she probably wants to know about
it. If she genuinely doesn't care about the rows after the first one,
she can always specify LIMIT 1.
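A driver-level check along these lines is cheap to implement: fetch up to two rows, and let the second row, if present, prove the "only row" assumption wrong. A minimal sketch, using Python's stdlib sqlite3 purely for illustration (the helper name `fetch_only_row` and the table are made up, not any real driver's API):

```python
import sqlite3

class TooManyRowsError(Exception):
    """Raised when a query expected to return at most one row returns more."""

def fetch_only_row(cur, query, params=()):
    # Fetch up to two rows: a second row, if it exists, is proof that
    # the caller's single-row expectation does not hold.
    cur.execute(query, params)
    rows = cur.fetchmany(2)
    if not rows:
        return None
    if len(rows) > 1:
        raise TooManyRowsError("query returned more than one row")
    return rows[0]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER)")
cur.executemany("INSERT INTO t VALUES (?)", [(1,), (2,)])

print(fetch_only_row(cur, "SELECT id FROM t WHERE id = ?", (1,)))  # one row: fine
try:
    fetch_only_row(cur, "SELECT id FROM t")  # two rows: error surfaces
except TooManyRowsError as e:
    print("caught:", e)
```

Note the cost is at most one extra row fetched, not a full result set; a user who really wants "first row, whatever it is" still writes LIMIT 1 and pays nothing.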
For a sad example, look at PL/pgSQL's SELECT .. INTO ..; it's not
terribly difficult to write a query which returns more than one row *by
mistake* and have something really bad happen later on, because it went
undetected during testing: you just happened to get the expected row
back first. And when you do want to enforce the single-row expectation,
e.g. in security-critical code, you have to resort to really ugly
hacks like window functions.
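The window-function hack alluded to above amounts to tagging every row with the total size of the result set, so that fetching a single row is enough to detect extras. A sketch, again using Python's stdlib sqlite3 for illustration (the table, column names, and helper are invented; this assumes a SQLite new enough for window functions, 3.25+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (name TEXT, role TEXT)")
cur.executemany("INSERT INTO users VALUES (?, ?)",
                [("alice", "admin"), ("bob", "admin"), ("carol", "user")])

def get_only(cur, sql, params):
    # count(*) OVER () attaches the total row count of the result set
    # to every row, so one fetched row reveals how many there were.
    cur.execute(sql, params)
    row = cur.fetchone()
    if row is not None and row[-1] != 1:
        raise RuntimeError("expected exactly one row, got %d" % row[-1])
    return row

print(get_only(cur, "SELECT name, count(*) OVER () AS n FROM users "
                    "WHERE role = ?", ("user",)))   # one row: fine
try:
    get_only(cur, "SELECT name, count(*) OVER () AS n FROM users "
                  "WHERE role = ?", ("admin",))     # two rows: caught
except RuntimeError as e:
    print("caught:", e)
```

It works, but every query has to be rewritten to carry the extra column, which is exactly the kind of ugliness a proper driver-level check would make unnecessary.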
.m