Need help devising interview questions for a junior
-
RavensCry wrote:
Ask them about database interaction
Aw, why? :( That's not even part of .NET. Is that even a basic thing? (I would be unable to answer those two questions.)
-
Just ask "Spoons or Chopsticks?". Watch their faces, it befuddles them. Then ask a technical question, see if their brains can swiftly recover, it is a good technique for whittling out the permanently bewildered.
------------------------------------ I will never again mention that I was the poster of the One Millionth Lounge Post, nor that it was complete drivel. Dalek Dave
Definitely chopsticks for foods which are indigenous to Asia... else spoons. :laugh:
I'd blame it on the brain farts... but let's be honest, it really is more like a methane factory between my ears some days than it is anything else...
-----
"The conversations he was having with himself were becoming ominous."-.. On the radio... -
Iain Clarke, Warrior Programmer wrote:
OK, I'll bite! That *is* the point, isn't it?
No. The point is to prove it doesn't work (i.e. find a defect). For any piece of software, one of these three must be true:
1) The software contains no defects.
2) The software contains at least one defect, and I know what it is.
3) The software contains at least one defect, and I do not know what it is.
The role of the testing process is to determine which one is true. If 1) is true, you are finished: release the product. If 2) is true, you fix the defect and re-evaluate. If 3) is true, you continue testing until either 1) or 2) is true. So the point of testing is to move from 3) to either 1) or 2), and that involves finding defects. Of course, at some point you have to take the view that "absence of proof is proof of absence", in that you can't find any more defects, so you assume there are none (i.e. you assume 1) is true, when it may not be). When and how you make that decision is what puts the 'engineering' into 'software engineering', and is one of the things that makes the question a jumping-off point for a broader discussion.
Iain Clarke, Warrior Programmer wrote:
but what about utf8?
Indeed. One of the points of the question is to see if the candidate thinks in terms of
char*
only, or can think about other aspects of a (deliberately) vague spec. For example, I didn't even say if the buffer was null-terminated or not.
You're mostly playing a word game with this. How do you make sure something works? Make sure it does what it says it will do. A bug will keep the software from doing what it is said to do. I will say that you have been out of the engineering business far too long if you actually think that your 1) is possible. Bug-free code does not exist except in small examples where no user interaction is needed and only very basic logic is used. Even then, you could find a mistake or vulnerability somewhere.
-
Think of a small program that you could write in, say, four hours. Give them VS2008 / 10 / whatever you use, four hours and a spec. See how they get on.
Other ideas: we've previously interviewed for a C++ / Windows position, and asked questions like the following.
C / C++ specific, testing basic knowledge of bit operators and pointers:
1. Write code to determine how many bits are "on" in a byte.
2. Write code to reverse a string in place in a buffer.
3. Write the standard atoi function from scratch.
Windows specific:
1. When do you put an ellipsis on a menu or button?
2. Why would you use threads in an application? (We look for three different scenarios.)
General software engineering:
What's the point of testing software? (You'd be amazed how many people say "to make sure it works".)
Although I take issue with the last question as you can see below, I do like the way you think for the other questions. You seem to be more interested in the person's reasoning ability than their extensive knowledge of library 'X' that could be looked up in any real life situation. I run into too many "ability tests" from headhunting companies that focus too much on whether or not you've used a specific facet of the language instead of whether you can use tools to solve a problem.
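For illustration, here is a rough C# rendering of questions 2 and 3 above, since the thread is about a .NET junior (the originals are C-flavoured, so a char[] buffer and a hand-rolled parse stand in for char* and atoi). This is a minimal sketch, not a model answer, and the names are made up:

using System;

static class InterviewSketches
{
    // Question 2: reverse a string "in place". C# strings are immutable,
    // so a char[] buffer stands in for the C char* here.
    public static void ReverseInPlace(char[] buffer)
    {
        for (int left = 0, right = buffer.Length - 1; left < right; left++, right--)
        {
            char tmp = buffer[left];
            buffer[left] = buffer[right];
            buffer[right] = tmp;
        }
    }

    // Question 3: a bare-bones atoi-style parse; overflow and leading
    // whitespace are ignored to keep the sketch short.
    public static int ParseInt(string s)
    {
        int i = 0, sign = 1, value = 0;
        if (i < s.Length && (s[i] == '+' || s[i] == '-'))
        {
            if (s[i] == '-') sign = -1;
            i++;
        }
        for (; i < s.Length && s[i] >= '0' && s[i] <= '9'; i++)
            value = value * 10 + (s[i] - '0');
        return sign * value;
    }

    static void Main()
    {
        char[] buf = "the fuzzy red fox".ToCharArray();
        ReverseInPlace(buf);
        Console.WriteLine(new string(buf));   // "xof der yzzuf eht"
        Console.WriteLine(ParseInt("-42"));   // -42
    }
}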
-
Although I take issue with the last question as you can see below, I do like the way you think for the other questions. You seem to be more interested in the person's reasoning ability than their extensive knowledge of library 'X' that could be looked up in any real life situation. I run into too many "ability tests" from headhunting companies that focus too much on whether or not you've used a specific facet of the language instead of whether you can use tools to solve a problem.
Someone who knows everything about a language, but took 10 years to learn it, is, I think, much less valuable than someone who knows nothing about a language, but can learn it in one year. To summarise in a cute phrase, I'm trying to assess capability, not ability.
-
You're mostly playing a word game with this. How do you make sure something works? Make sure it does what it says it will do. A bug will keep the software from doing what it is said to do. I will say that you have been out of the engineering business far too long if you actually think that your 1) is possible. Bug-free code does not exist except in small examples where no user interaction is needed and only very basic logic is used. Even then, you could find a mistake or vulnerability somewhere.
Billy Howell wrote:
You're mostly playing a word game with this.
NO, I'm looking for an approach and an attitude.
Billy Howell wrote:
Make sure it does what it says it will do.
I write a program, for which the spec is "accept three numbers, add them up, and print the total". I enter the numbers 2 and 2, and it prints 4. I enter the numbers 1, 2 and 3, and it prints 6. Clearly the program does what it says it will do, so it's tested. In fact, it multiplies them, not adds them, but still prints the correct answers. So my tests don't find my bug, but instead seem to show the program works. That is a contrived example, and yes, almost any other combination of numbers would show the bug. But I've seen too many developers who think their program is wonderful, and tiptoe round the testing, only trying to prove it works, and confirm their own programming abilities. We're getting into psychology rather than pure software engineering, but one of the things I'm looking for is the ability to want to try to knock down the thing you've just built, in an effort to prove how strong it is.
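A hedged illustration of the point above: with only the two "confirmation" inputs from the example, a test run passes even though the routine multiplies instead of adds, while an input chosen to break the program exposes the defect at once. The method name is made up:

using System;
using System.Linq;

class SumDemo
{
    // The buggy "add them up" routine from the example: it multiplies instead.
    static int AddNumbers(params int[] numbers) => numbers.Aggregate(1, (total, n) => total * n);

    static void Main()
    {
        // The "confirmation" inputs happen to pass despite the defect:
        // 2 * 2 == 2 + 2 and 1 * 2 * 3 == 1 + 2 + 3.
        Console.WriteLine(AddNumbers(2, 2));     // prints 4 -- looks correct
        Console.WriteLine(AddNumbers(1, 2, 3));  // prints 6 -- looks correct

        // An input chosen to break the program finds the defect immediately.
        Console.WriteLine(AddNumbers(1, 1, 5));  // prints 5, but the sum is 7
    }
}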
-
And I was holding back, maybe I'm just evil when it comes to potential employees :wtf: I did once write some interview questions for a junior and tried them out on my own team, they all failed!
You evil person :) I'd probably ask something hard about algorithms and data structures, and something "easy for me" about bit hacks. That's probably evil too :) Still, if someone doesn't even understand a Min-Heap, or tests for "is a power of 2" with a for loop and Math.Pow, there is something wrong with their education.
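For what it's worth, a minimal sketch of the bit hack being hinted at, assuming the usual n & (n - 1) trick (with zero treated as not a power of two):

using System;

class PowerOfTwo
{
    // A power of two has exactly one bit set, so clearing the lowest set bit
    // with n & (n - 1) leaves zero. Zero itself is excluded.
    static bool IsPowerOfTwo(int n) => n > 0 && (n & (n - 1)) == 0;

    static void Main()
    {
        foreach (int n in new[] { 0, 1, 2, 3, 16, 31, 1024 })
            Console.WriteLine(n + ": " + IsPowerOfTwo(n));
    }
}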
-
Need help from people with experience of interviewing, rather than being interviewed. I have been tasked with creating some .NET questions for a junior interview and I have no idea where to start, and would love to have some sensible as well as wacky suggestions (because I know here I'll get some!) from people who have had such experience.
As barmey as a sack of badgers Dude, if I knew what I was doing in life, I'd be rich, retired, dating a supermodel and laughing at the rest of you from the sidelines.
You can work them out pretty quickly by asking not only the differences between interfaces and base classes but also how and when it is best to apply each. I wouldn't bother asking silly questions like the number of piano tuners in London. Remember, you're hiring a junior developer. They're likely to be a bit shy, so don't terrify them within 2 minutes of the interview starting.
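As a rough C# sketch of the kind of distinction being probed (the names are illustrative, not from the thread): an interface promises a capability that any class can opt into, while an abstract base class is the better fit when the types genuinely share state and behaviour:

using System;

// An interface describes a capability with no shared implementation;
// any class, whatever its ancestry, can opt in.
interface IAuditable
{
    void WriteAuditEntry(string message);
}

// A base class (abstract here) is the better choice when the types genuinely
// share state and behaviour that you want to implement exactly once.
abstract class Report
{
    private readonly string title;

    protected Report(string title)
    {
        this.title = title;
    }

    public void Render()                  // shared logic lives in the base class
    {
        Console.WriteLine("== " + title + " ==");
        RenderBody();
    }

    protected abstract void RenderBody(); // each subtype fills in the details
}

class SalesReport : Report, IAuditable
{
    public SalesReport() : base("Sales") { }

    protected override void RenderBody()
    {
        Console.WriteLine("...sales figures...");
    }

    public void WriteAuditEntry(string message)
    {
        Console.WriteLine("AUDIT: " + message);
    }
}

class Demo
{
    static void Main()
    {
        var report = new SalesReport();
        report.Render();
        report.WriteAuditEntry("Sales report rendered");
    }
}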
-
Billy Howell wrote:
You're mostly playing a word game with this.
NO, I'm looking for an approach and an attitude.
Billy Howell wrote:
Make sure it does what it says it will do.
I write a program, for which the spec is "accept three numbers, add them up, and print the total". I enter the numbers 2 and 2, and it prints 4. I enter the numbers 1, 2 and 3, and it prints 6. Clearly the program does what it says it will do, so it's tested. In fact, it multiplies them, not adds them, but still prints the correct answers. So my tests don't find my bug, but instead seem to show the program works. That is a contrived example, and yes, almost any other combination of numbers would show the bug. But I've seen too many developers who think their program is wonderful, and tiptoe round the testing, only trying to prove it works, and confirm their own programming abilities. We're getting into psychology rather than pure software engineering, but one of the things I'm looking for is the ability to want to try to knock down the thing you've just built, in an effort to prove how strong it is.
OK. Now I understand what you are getting at. I wonder if there is a better way to ask the question, though. The question doesn't ask anything about the exhaustive nature of the testing. It only asks the reason for doing it in the first place.
-
Iain Clarke, Warrior Programmer wrote:
OK, I'll bite! That *is* the point, isn't it?
No. The point is to prove it doesn't work (i.e. find a defect). For any piece of software, one of these three must be true:
1) The software contains no defects.
2) The software contains at least one defect, and I know what it is.
3) The software contains at least one defect, and I do not know what it is.
The role of the testing process is to determine which one is true. If 1) is true, you are finished: release the product. If 2) is true, you fix the defect and re-evaluate. If 3) is true, you continue testing until either 1) or 2) is true. So the point of testing is to move from 3) to either 1) or 2), and that involves finding defects. Of course, at some point you have to take the view that "absence of proof is proof of absence", in that you can't find any more defects, so you assume there are none (i.e. you assume 1) is true, when it may not be). When and how you make that decision is what puts the 'engineering' into 'software engineering', and is one of the things that makes the question a jumping-off point for a broader discussion.
Iain Clarke, Warrior Programmer wrote:
but what about utf8?
Indeed. One of the points of the question is to see if the candidate thinks in terms of
char*
only, or can think about other aspects of a (deliberately) vague spec. For example, I didn't even say if the buffer was null-terminated or not.
A junior programmer will not know this. They are applying for a JUNIOR position - they have no (or little) experience in these things. It's OK to know the basics, and you don't want to hire someone who doesn't have a clue, but to ask a junior programmer to elaborate on testing? I think you're asking too much. :)
-
First of all, you haven't been "tasked with" anything; you've been asked. "Tasked with" is vapid management bull serving to make something sound much more dynamic and go-to than it really is.
Second - the silly question: ask how many piano tuners there are in London. This should help you to see how they respond to odd requests and pressure situations, and what their thought processes are like. Do they start by assuming that the population of London is x, and out of that population y% have pianos, and it takes z tuners to service that many pianos? Do they tell you that they'd Google it? (Ironically, Chrome's spellchecker doesn't recognise Google; it offers Goggle, Googly, Goodly and the rather fun Go ogle as choices.)
Third - test them on the basics (here I'm assuming the position is for a .NET developer with some experience). Do they know what an interface is? Do they know what an abstract class is? When would you use one over the other? Do they know how to get data out of the database using something other than a DataSet/DataTable?
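On that last point, a minimal sketch of pulling rows with a forward-only SqlDataReader instead of filling a DataSet/DataTable; the connection string and query are placeholders:

using System;
using System.Data.SqlClient;

class ReaderDemo
{
    static void Main()
    {
        // Placeholder connection string and query; the point is the forward-only
        // SqlDataReader rather than loading everything into a DataSet/DataTable.
        const string connectionString = "Server=.;Database=Northwind;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT CustomerID, CompanyName FROM Customers", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader["CustomerID"], reader["CompanyName"]);
                }
            }
        }
    }
}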
I have CDO, it's OCD with the letters in the right order; just as they ruddy well should be
Forgive your enemies - it messes with their heads
"He tasks me. He tasks me, and I shall have him...." Khan about a certain James T. Kirk
A while ago he asked me what he should have printed on my business cards. I said 'Wizard'. I read books which nobody else understands. Then I do something which nobody understands. After that the computer does something which nobody understands. When asked, I say things about the results which nobody understands. But everybody expects miracles from me on a regular basis. Looks to me like the classical definition of a wizard.
-
A junior programmer will not know this. They are applying for a JUNIOR position - they have no (or little) experience in these things. It's OK to know the basics, and you don't want to hire someone who doesn't have a clue, but to ask a junior programmer to elaborate on testing? I think you're asking too much. :)
A junior programmer with no experience of testing? That's not a junior programmer, that's a complete beginner!
-
Iain Clarke, Warrior Programmer wrote:
OK, I'll bite! That *is* the point, isn't it?
No. The point is to prove it doesn't work (i.e. find a defect). For any piece of software, one of these three must be true:
1) The software contains no defects.
2) The software contains at least one defect, and I know what it is.
3) The software contains at least one defect, and I do not know what it is.
The role of the testing process is to determine which one is true. If 1) is true, you are finished: release the product. If 2) is true, you fix the defect and re-evaluate. If 3) is true, you continue testing until either 1) or 2) is true. So the point of testing is to move from 3) to either 1) or 2), and that involves finding defects. Of course, at some point you have to take the view that "absence of proof is proof of absence", in that you can't find any more defects, so you assume there are none (i.e. you assume 1) is true, when it may not be). When and how you make that decision is what puts the 'engineering' into 'software engineering', and is one of the things that makes the question a jumping-off point for a broader discussion.
Iain Clarke, Warrior Programmer wrote:
but what about utf8?
Indeed. One of the points of the question is to see if the candidate thinks in terms of
char*
only, or can think about other aspects of a (deliberately) vague spec. For example, I didn't even say if the buffer was null-terminated or not.
-
A junior programmer with no experience of testing? That's not a junior programmer, that's a complete beginner!
Well, when I applied for my first job, "junior programmer" meant that work experience was not required.
-
OK. Now I understand what you are getting at. I wonder if there is a better way to ask the question, though. The question doesn't ask anything about the exhaustive nature of the testing. It only asks the reason for doing it in the first place.
These are starting points for broader discussions. They aren't questions that someone can get right or wrong. For example, once they've written an answer to:
1. Write code to determine how many bits are "on" in a byte
I'd probably introduce some more constraints. Some candidates use shift operators and shift the byte left or right one place eight times, and test the most/least significant bit as appropriate. In that case, I'd ask how they would improve the speed. You could, for example, create a 256-entry lookup table, and use the byte as an index into that. Very quick result. You can then use that as a starting point for a discussion on whether the lookup table should be static, or built at runtime, and see what criteria they would apply to that decision. If they started with the lookup table approach, you can still have the dynamic vs static discussion, and then ask what they would do in a memory-constrained situation, where building a table was impossible. Essentially, I'm trying to engage them, to see a) how they interact with other people and b) what their problem-solving approach is. I'm most definitely not trying to get them to sit an exam, which gets marked.
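A hedged C# sketch of the two approaches described above: the bit-by-bit shift loop and a 256-entry lookup table (built at runtime here; a hard-coded static table is the alternative under discussion):

using System;

static class BitCount
{
    // Approach 1: shift the byte right one place at a time and test the low bit.
    public static int CountByShifting(byte value)
    {
        int count = 0;
        for (int i = 0; i < 8; i++)
        {
            count += value & 1;
            value >>= 1;
        }
        return count;
    }

    // Approach 2: a 256-entry lookup table indexed by the byte itself.
    // It is built at runtime here; a hard-coded static table is the
    // alternative discussed above.
    private static readonly int[] Table = BuildTable();

    private static int[] BuildTable()
    {
        var table = new int[256];
        for (int i = 1; i < 256; i++)
            table[i] = table[i >> 1] + (i & 1);
        return table;
    }

    public static int CountByLookup(byte value)
    {
        return Table[value];
    }

    static void Main()
    {
        byte b = 0xB6;                             // 1011 0110
        Console.WriteLine(CountByShifting(b));     // 5
        Console.WriteLine(CountByLookup(b));       // 5
    }
}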
-
Need help from people with experience of interviewing, rather than being interviewed. I have been tasked with creating some .NET questions for a junior interview and I have no idea where to start, and would love to have some sensible as well as wacky suggestions (because I know here I'll get some!) from people who have had such experience.
As barmey as a sack of badgers Dude, if I knew what I was doing in life, I'd be rich, retired, dating a supermodel and laughing at the rest of you from the sidelines.
I do some of the interviewing for my company; here are some of the questions I like (points are made up for emphasis):
1. For the whiteboard or paper test: devise a recursively searchable tree node structure, and show me an implementation. Wait for the inevitable questions (a good thing). "What am I to search for?" (and other class specs) A: a string named Name. (+1 pt) "What does the recursion return?" A: the node of interest. (+5 pts, not -10 pts) Forward recursive or reverse recursive? (+10 pts if they know the difference) ... let them work it out... If they are interviewing without a threading background I like to see that they use an iterator; if threading, NO ITERATOR. (5 pts) If they can reason out that they need a temp node for the return value of the recursion before they get that far... (7 pts) If it works. (10 pts) If they show me that it can be done in less than, say, 5 minutes. (10 pts) If they use encapsulation, as per the specs that they ask for. (10 pts) Yadda-yadda.
2. Usually I try to find one question which is very trivial for one or two of the random things which they claim they know. E.g. PHP: show me how to print the third member of the array bob. JavaScript: what is the difference between '==' and '==='?
3. Ask at least one question with well-defined specs up front but many "suitable" ways to answer it. Reverse the string "the fuzzy red fox". Check to see if they understand the conventions of whichever language you are using for strings, characters, and arrays. You also get to see if they are elegant in their choice of variable names, loop structure, and overall basic programming capabilities.
4. {Are you an active learner... etc.} Do you have any personal projects, and if so, what? {Are you a member of a community?} Do you frequent any forums, and if so, which one is your favorite and why? (Plus 10000 pts if they say CodeProject :cool: ) If they have a break in their employment history: what have you been doing to stay current with technologies?
Summary: we like to stress the interviewee out to the point where they might be shaking a little (pressure stress) to see how clear-headed they react and work under pressure. Check to see that th
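A rough C# rendering of the whiteboard exercise in part 1, as it might look from a candidate without a threading background (so an iterator is fine): a node with a string Name, searched recursively, with a temporary node holding each recursive result. Class and method names are made up:

using System;
using System.Collections.Generic;

class TreeNode
{
    public string Name { get; private set; }
    public List<TreeNode> Children { get; private set; }

    public TreeNode(string name)
    {
        Name = name;
        Children = new List<TreeNode>();
    }

    public TreeNode AddChild(string name)
    {
        var child = new TreeNode(name);
        Children.Add(child);
        return child;
    }

    // Depth-first ("forward") recursive search for the node with the given name.
    // A temporary node holds the result of each recursive call, as the post suggests.
    public TreeNode Find(string name)
    {
        if (Name == name)
            return this;

        foreach (TreeNode child in Children)
        {
            TreeNode found = child.Find(name);
            if (found != null)
                return found;
        }
        return null;   // not in this subtree
    }
}

class Demo
{
    static void Main()
    {
        var root = new TreeNode("root");
        root.AddChild("docs").AddChild("readme");
        root.AddChild("src");

        TreeNode hit = root.Find("readme");
        Console.WriteLine(hit != null ? "Found " + hit.Name : "Not found");
    }
}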
-
Iain Clarke, Warrior Programmer wrote:
OK, I'll bite! That *is* the point, isn't it?
No. The point is to prove it doesn't work (i.e. find a defect). For any piece of software, one of these three must be true:
1) The software contains no defects.
2) The software contains at least one defect, and I know what it is.
3) The software contains at least one defect, and I do not know what it is.
The role of the testing process is to determine which one is true. If 1) is true, you are finished: release the product. If 2) is true, you fix the defect and re-evaluate. If 3) is true, you continue testing until either 1) or 2) is true. So the point of testing is to move from 3) to either 1) or 2), and that involves finding defects. Of course, at some point you have to take the view that "absence of proof is proof of absence", in that you can't find any more defects, so you assume there are none (i.e. you assume 1) is true, when it may not be). When and how you make that decision is what puts the 'engineering' into 'software engineering', and is one of the things that makes the question a jumping-off point for a broader discussion.
Iain Clarke, Warrior Programmer wrote:
but what about utf8?
Indeed. One of the points of the question is to see if the candidate thinks in terms of
char*
only, or can think about other aspects of a (deliberately) vague spec. For example, I didn't even say if the buffer was null-terminated or not.
Electron Shepherd wrote:
The point is to prove it doesn't work (i.e. find a defect)
I'd back that up completely. Glenford Myers in his book 'The Art of Software Testing (2nd edn.)' argues that testing to demonstrate that errors are not present is impossible, and that proving that a program does what it is supposed to do does not prove that the program is error free. Therefore, Myers defines black box testing as “The destructive process of trying to find the errors (whose presence is assumed) in a program.” Myers, G. (2004) The Art of Software Testing (2nd edn.) Hoboken: John Wiley & Sons
Nobody can get the truth out of me because even I don't know what it is. I keep myself in a constant state of utter confusion. - Col. Flagg
-
And I was holding back, maybe I'm just evil when it comes to potential employees :wtf: I did once write some interview questions for a junior and tried them out on my own team, they all failed!
RavensCry wrote:
I did once write some interview questions for a junior and tried them out on my own team, they all failed!
Doesn't that tell you something :laugh:
Nobody can get the truth out of me because even I don't know what it is. I keep myself in a constant state of utter confusion. - Col. Flagg
-
These are starting points for broader discussions. They aren't questions that someone can get right or wrong. For example, once they've written an answer to:
1. Write code to determine how many bits are "on" in a byte
I'd probably introduce some more constraints. Some candidates use shift operators and shift the byte left or right one place eight times, and test the most/least significant bit as appropriate. In that case, I'd ask how they would improve the speed. You could, for example, create a 256-entry lookup table, and use the byte as an index into that. Very quick result. You can then use that as a starting point for a discussion on whether the lookup table should be static, or built at runtime, and see what criteria they would apply to that decision. If they started with the lookup table approach, you can still have the dynamic vs static discussion, and then ask what they would do in a memory-constrained situation, where building a table was impossible. Essentially, I'm trying to engage them, to see a) how they interact with other people and b) what their problem-solving approach is. I'm most definitely not trying to get them to sit an exam, which gets marked.
That is what you should be doing. I guess I am so used to people running an interview like an exam when they ask these sorts of questions that I am not expecting a logical approach from anyone anymore.
-
Need help from people with experience of interviewing, rather than being interviewed. I have been tasked with creating some .NET questions for a junior interview and I have no idea where to start, and would love to have some sensible as well as wacky suggestions (because I know here I'll get some!) from people who have had such experience.
As barmey as a sack of badgers Dude, if I knew what I was doing in life, I'd be rich, retired, dating a supermodel and laughing at the rest of you from the sidelines.
Besides asking about how he/she individually contributed to projects, ask them questions regarding the technologies and roles listed on their resumes. For example, a prospective hire labeled himself as a database admin and programmer. When probing further into the database admin aspect, we came to find he had no practical work experience as a database admin other than a college class. Yet he considered himself an admin. This has happened time & time again.