But that would be your time that was spent efficiently, not the code itself (i.e. you used your time more efficiently by choosing a language that took less time to develop in). The code itself would still be inefficient; it just allowed you to use your development time more efficiently.
I know I'm arguing semantics here (and please bear with me as I launch into a rant hehehe), but I think many people are wrongly taught in college / uni that code efficiency doesn't matter given the power of modern hardware. While that's largely true for desktop applications, it makes a significant difference the moment you start looking at server appliances and/or real-time applications. And given that efficient coding is easier to teach to someone who's just learning to develop than lazy habits are to unteach later, I do see the benefit in being specific about efficiency.
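To make that point a bit more concrete, here's a minimal sketch (in Java, purely as a hypothetical illustration I'm making up for this post) of the kind of lazy habit I mean: building a string with += in a loop versus using a StringBuilder. In a one-off desktop script you'd never notice the difference, but in a hot server loop the repeated copying adds up fast.

// Hypothetical illustration: naive string concatenation vs. StringBuilder.
// Each += allocates a new String and copies the old contents, so the naive
// loop does O(n^2) character copying overall; StringBuilder appends into a
// growable buffer instead.
public class ConcatDemo {
    public static void main(String[] args) {
        int n = 50_000;

        // Lazy habit: += in a loop.
        long start = System.nanoTime();
        String s = "";
        for (int i = 0; i < n; i++) {
            s += "x";
        }
        long slow = System.nanoTime() - start;

        // The efficient equivalent.
        start = System.nanoTime();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("x");
        }
        String t = sb.toString();
        long fast = System.nanoTime() - start;

        System.out.printf("concat: %d ms (%d chars), builder: %d ms (%d chars)%n",
                slow / 1_000_000, s.length(), fast / 1_000_000, t.length());
    }
}

Both versions produce the same output, which is exactly why the cost stays hidden until the code meets real load: the quadratic copying in the first loop only bites once n gets large.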
I know these challenges are just for fun, and I know I probably come across as a touch elitist (despite the fact that the majority of my examples have been in a JIT byte-code compiled language), but I've seen so many examples of wonky code over the years that it's not even funny. Ironically, the worst source code I've ever seen was Microsoft's own example of DDE calls to Windows applications (going back a few years now lol). The code was so bad it literally fork bombed itself!!!
Anyhow, rant over now.