Chris Browne wrote:
> Further, the Right Thing is to group related data together, and come
> up with a policy that is driven primarily by the need for data
> consistency. If things work well enough, then don't go off trying to
> optimize something that doesn't really need optimization, and perhaps
> break the logic of the application.
Right. I think that quote -- "Premature optimization is the root of all
evil" -- is originally Donald Knuth's, though Jon Louis Bentley repeats
something to that effect in his book "Writing Efficient Programs." Knuth said
it because so much of that optimization broke the logic of the application
(and did not help anyway). (Gotta profile first, for one thing.)
I had a boss once who insisted we write everything in assembly language for
efficiency. We did not even know what algorithms we needed for the
application. And at the time (System/360 days), IBM did not even publish the
execution times for the instruction set of the machine we were using, because
so many instructions effectively executed in zero time -- overlapped with
other instructions, local caching in the processor, locality of memory
reference, and so on. To get efficiency, you must first get your algorithms
right, including choosing the best ones for the problem at hand.
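
To make that concrete, here is a minimal sketch (my own illustration, nothing
from any real application -- the function names and data sizes are made up)
of why the algorithm has to come first and why you measure before you tune: a
set-based duplicate check beats the quadratic one by orders of magnitude on
even a modest list, a gap that no amount of instruction-level tuning of the
quadratic version would ever close.

    import timeit

    def has_duplicates_quadratic(items):
        # Naive O(n^2) pairwise comparison.
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicates_linear(items):
        # O(n) check using a set -- same answer, different algorithm.
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

    if __name__ == "__main__":
        data = list(range(2000))   # worst case: no duplicates at all
        for fn in (has_duplicates_quadratic, has_duplicates_linear):
            elapsed = timeit.timeit(lambda: fn(data), number=5)
            print("%s: %.4f s for 5 runs" % (fn.__name__, elapsed))

Running something like that (or a real profiler on real code) tells you where
the time actually goes before you spend any effort rewriting it.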
--
.~. Jean-David Beyer Registered Linux User 85642.
/V\ PGP-Key: 9A2FC99A Registered Machine 241939.
/( )\ Shrewsbury, New Jersey http://counter.li.org
^^-^^ 10:05:01 up 3 days, 2:23, 1 user, load average: 4.10, 4.24, 4.18