[H-GEN] Which is better?
ben.carlyle at invensys.com
Fri Apr 30 01:36:34 EDT 2004
Russell,
Russell Stuart <russell at stuart.id.au>
Sent by: general-bounces at lists.humbug.org.au
30/04/2004 01:41 PM
To: general at lists.humbug.org.au
cc:
Subject: Re: [H-GEN] Which is better?
> So now I don't profile. The new languages I use - Java / C#, don't even
> come with a profiler. Odd, isn't it? 20 years ago the C compiler I
> used came with "cc -p". The newer languages don't. This is sad. I
> obviously don't understand the languages I am using now near as well as
> I understood C+Unix, and I can't profile either.
In my professional C++ development I have found the quantify tool
invaluable for finding performance bottlenecks in my own code. It has also
been useful in learning some of the performance rules associated with the
STL[1]. I don't think profiling is dead in either C or C++. The other
language[2] I've been dipping my toes in is xslt. xsltproc has a profiler
built in. I haven't found that quite as useful yet, but I've only been
developing fairly simple transforms.
As for C# and Java, it's obviously even more difficult to profile
accurately. A slow method that is executed often will (theoretically) be
optimised to greater and greater degrees, just in time. It's an
interesting and bold approach when you think about it. It's essentially
saying "Don't ever try to optimise anything; the machine is just smarter
than you are." While that may not always be true, it's an interesting
trend. In the olden-days people used to revise processor instruction sets
or hand-code assembler to improve basic performance. Now tweaking at that
level is rare, and basic compiler optimisation is enough for just about
everyone. Will the JIT compiler take this another step forward, making
profiling itself a job for the compiler?
I guess the higher-level the language, the better compilers can reason
about what you're saying and therefore optimise it. I'm a little
dubious as to whether Java and C# are there, yet. I wouldn't be surprised
if we hit that point in the next fifteen years, though. Even algorithm
choice could be just-in-time determined for some kinds of operations. To
some extent this already happens in database applications.
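(A small sketch of what run-time algorithm choice can look like today — the cutoff value and the analogy to a query planner picking a plan from statistics are my own illustration:)

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Pick the algorithm at run time based on a property of the input,
// the way a database planner picks a join strategy from table stats.
void adaptive_sort(std::vector<int>& v) {
    const std::size_t kSmall = 16;  // assumed cutoff; would be tuned by measurement
    if (v.size() <= kSmall) {
        // Insertion sort: O(n^2) but a very low constant for tiny inputs.
        for (std::size_t i = 1; i < v.size(); ++i)
            for (std::size_t j = i; j > 0 && v[j - 1] > v[j]; --j)
                std::swap(v[j - 1], v[j]);
    } else {
        // General case: the library's O(n log n) sort.
        std::sort(v.begin(), v.end());
    }
}
```

A JIT could in principle make the same decision from observed data rather than a hard-coded constant.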
Now I'm just rambling, but I'm waiting for a compile to finish so it's
warranted:
I think that databases are the first really successful example of a trend
towards higher-level languages throughout software. Most user-level
applications are really about getting and displaying data according to
relatively well-established patterns, and could be described in terms of
fairly high-level concepts. As we move towards concepts that are more and
more universal while also being more and more well-defined software will
become much more powerful. Traditional third-generation languages attempt
to leverage this trend by defining libraries and standard APIs that
attempt to encapsulate these concepts while permitting general-purpose
computation to be done above the API. I think we'll continue to see a
trend of dumbing down that comes with standardisation of concepts that
will see languages become simpler. In the future, an average farm boy[3]
with an elementary education will be able to develop useful data-processing
applications using standard APIs and easy-to-use languages[4].
I see this as a good thing, and personally look forward to the day when
most professional software engineers are redundant ;) Hmm... I'll have to
reskill. Oh, well.
Benjamin.
[1] Never use a hash map unless you're sure you're going to put a lot of
entries in it or use it for a long time. Sure it has constant time
insertion and lookup, but it has a large fixed constant-time
initialisation :)
[2] Well... it's kind-of-a-functional-language
[3] Uhh, no offence to any farm-boys here. I use this term to refer to
someone who is as far as possible from the traditional technical geek
persona.
[4] Text-based or otherwise.
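(Footnote [1] in code form. The names are modern: today's standard calls it std::unordered_map, where 2004-era STLs shipped a non-standard hash_map; the point about the up-front bucket-array cost versus a tree map is the same either way. The function and values here are just an illustration:)

```cpp
#include <map>
#include <unordered_map>

// A hash map pays a fixed setup cost (allocating and initialising its
// bucket array) that a red-black-tree map does not, so for a handful
// of short-lived entries the O(log n) tree can win in practice.
int lookup_demo() {
    std::map<int, int> tree;            // no big up-front allocation
    std::unordered_map<int, int> hash;  // buckets allocated up front
    for (int i = 0; i < 8; ++i) {
        tree[i] = i * i;
        hash[i] = i * i;
    }
    return tree.at(5) + hash.at(5);  // both maps find 25
}
```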