Posted by Lew on 11/19/07 06:10
NC wrote:
> But that's the point. No coding was required. Performance
> improvement was achieved purely by means of system administration
> (installing an opcode cache and changing MySQL configuration
> settings). So every time I see a performance test, I wonder which
> optimizations testers chose not to use or didn't know of...
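(For concreteness, the administrative changes NC describes might look something like the sketch below. The values are illustrative assumptions only, not tuned recommendations, and it assumes APC as the opcode cache and a MyISAM/InnoDB mix typical of the era.)

```ini
; php.ini -- enable the APC opcode cache (illustrative values)
extension=apc.so
apc.enabled=1
apc.shm_size=64M              ; shared memory reserved for cached opcodes

; my.cnf, [mysqld] section -- example tuning knobs
query_cache_size=32M          ; cache result sets of identical SELECTs
key_buffer_size=128M          ; MyISAM index buffer
innodb_buffer_pool_size=256M  ; InnoDB data and index cache
```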
Optimization is always both situational and non-transferable. The best one
can do is create a benchmark that is reasonably representative and reproducible.
Execution speed is also not the only consideration, nor usually the most
important one. By far the largest cost of most software is incurred not while
it's running, but while it isn't running (or isn't running correctly).
Once performance is in the ballpark, in other words, once with appropriate
effort you can get Java, PHP, C++, C#, or whatever to run with approximately
equal efficiency, the ability to manage team projects, deliver all desired
features, and minimize risk becomes the dominant factor in platform selection.
Development and maintenance costs will trump execution costs.
Benchmarks have shown that Java, C# and C++ are all in about the same ballpark
with respect to performance for most applications where these platforms are
considered. Strongly-typed languages like those three tend to win the platform
choice for managed projects because of their robust API libraries,
language-level support for risk mitigation, and features for concurrency and
other common requirements. The availability of trained programmers factors in
as well.
The best benchmarks will simulate typical work loads for a host under explicit
conditions. None will ever be perfect, because the real world is not
committed to working the way benchmarks do. At some point one has to assert
that the platform is fast enough, and look for other criteria to choose the
right one.
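(A reproducible benchmark also has to respect the platform's runtime behavior. As a minimal sketch, here is a hand-rolled Java microbenchmark, with illustrative names and iteration counts I've chosen, that warms up the JIT before timing and keeps a checksum so the measured work can't be eliminated as dead code.)

```java
// Minimal microbenchmark sketch: warm up, then time, then defeat
// dead-code elimination by using the result.
public class BenchSketch {
    // Stand-in workload: sums 0..n-1.
    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    public static void main(String[] args) {
        long checksum = 0;

        // Warm-up: let the JIT compile the hot path before measuring.
        for (int i = 0; i < 10_000; i++) checksum += work(1000);

        // Timed run under the same conditions as the warm-up.
        long start = System.nanoTime();
        for (int i = 0; i < 10_000; i++) checksum += work(1000);
        long elapsedNs = System.nanoTime() - start;

        // work(1000) is 499500, so the accumulated checksum must divide it.
        System.out.println("checksum-ok=" + (checksum % 499500 == 0));
        System.out.println("elapsed-ns=" + elapsedNs);
    }
}
```

Even this toy version shows why results differ between runs and hosts: skip the warm-up and you largely measure the interpreter and compiler, not the code.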
--
Lew