
How can you tell if a programmer is good or not?

3873 Views · 9 Replies · 7 Participants · Last post by Mrzev
I am a software dev with 10+ years of experience, and I really don't know how to answer this. I look at other people's code and feel envious, but they feel the same about my code too. So when I was looking for a job and was asked how good a developer I was, I really didn't know how to answer. Being good or bad at something is relative to others. My vision can be 20/300, but when everyone around me is blind, my vision is good. So how do you compare yourself to average? I also have high expectations, but once again that's a relative term based on a relative value... Ughhhhh. Then there are skills in different areas... I suck at Java, but I can get by. I have never written Go, but I can learn fast... so? Is it a combination of experience in areas that match the other person's? In a job interview, is it whatever aligns with their current projects?

1. How do you know if you're a good or bad programmer?
2. How do you know whether another person is a good programmer?

Overclocker · 11,665 Posts
You can't, unless you give them work and review that work. I don't mean writing code on paper for 10 minutes like most interviewers have you do; give them real work for a month, some project, and then review it.
The difference between people is very easy to see at university, where everyone has to do the same thing but the results vary wildly in performance, readability, bugs, and reusability. Most people just do the bare minimum to pass, and that for sure carries over into their work ethic too.

There are people who are good at memorizing rubbish and writing it back out on paper, and then there are others who are bad at that but excel at problem solving, designing, and writing fast, readable code on an actual computer.

In the end, it seems most companies are looking for coding monkeys with knowledge of some language to implement and code, not for programmers or developers who solve anything or think. Nope, just code.

Error handling, for sure. The lazy ones don't bother, or don't even realize that something could go wrong.
I once had a test image to feed into a program that many people had implemented; this test image was larger and more random than what most people would bother to think of. As a result, two of us could process the image in around 1 second, while others took days, weeks, or months, or outright crashed because they ran out of RAM.
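The post doesn't say what the actual assignment was, so here is a hypothetical sketch of how that kind of blow-up happens: the same task, one version with a careless O(n²) inner lookup and one with a single O(n) pass. On a small friendly input both look fine; on a large random input the careless one falls over, exactly as described above. The task shown (counting distinct pixel values) is an invented stand-in, not the real test program.

```python
# Hypothetical stand-in for the "large random test image" story:
# count distinct pixel values in a random image.
import random

def distinct_slow(pixels):
    seen = []                      # 'p not in seen' scans the whole list: O(n) per pixel
    for p in pixels:
        if p not in seen:
            seen.append(p)         # total work grows quadratically with image size
    return len(seen)

def distinct_fast(pixels):
    return len(set(pixels))        # hash set: O(1) average per pixel, one pass overall

random.seed(42)
pixels = [random.randrange(1 << 24) for _ in range(5_000)]  # random 24-bit "RGB" values
assert distinct_slow(pixels) == distinct_fast(pixels)
```

On a tiny hand-made test both versions return instantly, which is why the problem never shows up until someone feeds the program an input that is big and random enough.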

1) You can only know where you lie on the spectrum by comparing yourself to others in different areas.
2) By comparing the other person's results.

Most of the optimizations mentioned are pointless unless you actually measure performance in a specific language and implementation. Compilers nowadays are not what they were in the '80s and '90s; they do a lot of optimization on their own, and your "clever" optimizations may actually work against them while making the code less readable and understandable. Divide by 2 using a bit shift? Why? What compiler doesn't already optimize that into the same fastest code for you?
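To make the divide-by-shift point concrete, here is a minimal sketch in Python: for integers, `x >> 1` and `x // 2` compute the same thing (both floor), so the "clever" version buys nothing while saying less about intent. The function names are mine, just for illustration.

```python
# The "divide by two with a bit shift" trick, sketched in Python.
def half_shift(x: int) -> int:
    return x >> 1      # arithmetic right shift by one bit

def half_div(x: int) -> int:
    return x // 2      # says what you mean; the interpreter/compiler picks the fast form

# In Python the two agree for every integer, negative or not (both floor).
for x in (0, 1, 7, 1024, 123_456_789, -8):
    assert half_shift(x) == half_div(x)
```

Note that in C the hand optimization can even change behavior: for negative operands, `x / 2` truncates toward zero while `x >> 1` floors, so `-7 / 2` is `-3` but `-7 >> 1` is `-4`. Mainstream optimizing compilers already emit shift-based code for division by a power of two where it is safe to do so, which is the point made above.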
 