Need help devising interview questions for a junior
-
Try to cover the basics, so things like:
"Give an example of when you might use a static field on a class."
"What's the difference between an interface and an abstract class?"
"Describe the purpose of the 'ref' keyword in a method's signature."
Ask them about database interaction:
"How would you execute a SELECT statement against a database?"
"Which method would you use to retrieve the value from the first column of the first row?"
Perhaps something on using things like app.config files? Also maybe a couple of questions to demonstrate that they know the difference between passing by value/reference. It's outside of .NET, but maybe something about version control as well? Hope that helps.
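On the pass-by-value/reference point, a minimal sketch of the distinction - shown in C, with a pointer standing in for C#'s `ref` keyword, since the mechanics are the same (function and variable names are invented for illustration):

```c
/* Pass by value: the function receives a copy, so the caller's
   variable is unchanged. */
int increment_copy(int n) {
    n += 1;              /* changes only the local copy */
    return n;
}

/* Pass by reference (a pointer in C; the 'ref' keyword plays the
   same role in C#): the function changes the caller's variable. */
void increment_in_place(int *n) {
    *n += 1;
}
```

A candidate who can explain why the first function leaves the caller's variable untouched, while the second changes it, has the concept.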
-
Need help from people with experience of interviewing, rather than being interviewed. I have been tasked with creating some .NET questions for the interview of a junior, and I have no idea where to start. I would love to have some sensible as well as wacky suggestions (because I know I'll get some here!) from people who have had such experience.
As barmy as a sack of badgers. Dude, if I knew what I was doing in life, I'd be rich, retired, dating a supermodel and laughing at the rest of you from the sidelines.
The easiest way for me to get an "idea" of a candidate's knowledge was to develop a written test with general concepts of the technology and a few logic and OO questions. Then just read the test, see how he performs, and then talk to him about general stuff: the way he works/worked in the past, his perspective on the future, etc.
-
First, what not to do. Normally the interviewer's objective is to find out how much the interviewee knows and what experience they've had. Asking trick questions and posing zany challenges will not help you find this out - they are just traps that bad interviewers fall into. Also, if you go in (or send the interviewer in) with a set of questions about .NET, it turns from an interview into an exam. You could do that by email. It's a waste of interviewing time, so that is something else to avoid.

Instead, just ask them what you want to know. A good technique is to start by asking them what they've done (I'm assuming that someone else is dealing with questions designed to put them at ease and to check their applications - those are pretty essential ways of starting the interview). So your questions can start by asking the junior just how much .NET experience they have had. Stop. Let them speak, and listen. If they say some interesting stuff, then pick up on it.

The purpose of your questioning now has to be to find out how much of what they are claiming they did themselves. If they were part of a team, what was their role? What did they contribute specifically? E.g. answer: "we designed a piece of software that did wonderful things". Follow-up question: "yes, but which part of it did you personally work on - what was your exact contribution?" (to be asked every time they say "we did" instead of "I did"). Then shut up and listen. Then probe specifics, to see whether they can support their claims by demonstrating a genuine understanding of the task they are describing.

What else have they done? Same pattern: find out how much was their work and how much they understood. Repeat until you feel you have got a measure of the candidate's abilities. They might be nervous to the extent that they dry up, in which case it is OK to lead them gently, using their application as a way in. But stop at regular intervals and listen. Try to get them doing 95% of the talking.
Above all, if you are the interviewer resist the temptation to pontificate or otherwise show off. This isn't the time or place for that. HTH
-
RavensCry wrote:
Ask them about database interaction
Aw, why? :( That's not even part of .NET... Is that even a basic thing? (I would be unable to answer those two questions.)
-
Just ask "Spoons or Chopsticks?". Watch their faces, it befuddles them. Then ask a technical question, see if their brains can swiftly recover, it is a good technique for whittling out the permanently bewildered.
------------------------------------ I will never again mention that I was the poster of the One Millionth Lounge Post, nor that it was complete drivel. Dalek Dave
Definitely chopsticks for foods which are indigenous to Asia; else spoons. :laugh:
I'd blame it on the brain farts, but let's be honest, it really is more like a methane factory between my ears some days than it is anything else...
-----
"The conversations he was having with himself were becoming ominous."-.. On the radio... -
Iain Clarke, Warrior Programmer wrote:
OK, I'll bite! That *is* the point, isn't
No. The point is to prove it doesn't work (i.e. find a defect). For any piece of software, one of these three must be true:
1) The software contains no defects.
2) The software contains at least one defect, and I know what it is.
3) The software contains at least one defect, and I do not know what it is.
The role of the testing process is to determine which one is true. If 1) is true, you are finished. Release the product. If 2) is true, you fix the defect, and re-evaluate. If 3) is true, you continue testing until either 1) or 2) is true. So, the point of testing is to move from 3) to either 1) or 2), and that involves finding defects.
Of course, at some point, you have to take the view that "absence of proof is proof of absence", in that you can't find any more defects, so you assume there are none (i.e. you assume 1) is true, when it may not be). When and how you make that decision is what puts the 'engineering' into 'software engineering', and is one of the things that makes the question a jumping-off point for a broader discussion.
Iain Clarke, Warrior Programmer wrote:
but what about utf8?
Indeed. One of the points of the question is to see if the candidate thinks in terms of
char*
only, or can think about other aspects of a (deliberately) vague spec. For example, I didn't even say if the buffer was null-terminated or not.

You're mostly playing a word game with this. How do you make sure something works? Make sure it does what it says it will do. A bug will keep the software from doing what it is said to do. I will say that you have been out of the engineering business far too long if you actually think that your 1) is possible. Bug-free code does not exist except in small examples where no user interaction is needed and only very basic logic is used. Even then, you could find a mistake or vulnerability somewhere.
-
Think of a small program that you could write in, say, four hours. Give them VS2008 / 10 / whatever you use, four hours and a spec. See how they get on. Other ideas: we've previously interviewed for a C++ / Windows position, and asked questions like:
C / C++ specific, testing basic knowledge of bit operators and pointers:
1. Write code to determine how many bits are "on" in a byte.
2. Write code to reverse a string in place in a buffer.
3. Write the standard
atoi
function from scratch.
Windows specific:
1. When do you put an ellipsis on a menu or button?
2. Why would you use threads in an application? (we look for three different scenarios)
General software engineering:
What's the point of testing software? (you'd be amazed how many people say "to make sure it works")

Although I take issue with the last question as you can see below, I do like the way you think for the other questions. You seem to be more interested in the person's reasoning ability than their extensive knowledge of library 'X' that could be looked up in any real-life situation. I run into too many "ability tests" from headhunting companies that focus too much on whether or not you've used a specific facet of the language, instead of whether you can use tools to solve a problem.
-
Although I take issue with the last question as you can see below, I do like the way you think for the other questions. You seem to be more interested in the person's reasoning ability than their extensive knowledge of library 'X' that could be looked up in any real-life situation. I run into too many "ability tests" from headhunting companies that focus too much on whether or not you've used a specific facet of the language, instead of whether you can use tools to solve a problem.
Someone who knows everything about a language, but took 10 years to learn it, is, I think, much less valuable than someone who knows nothing about a language, but can learn it in one year. To summarise in a cute phrase: I'm trying to assess capability, not ability.
-
Billy Howell wrote:
You're mostly playing a word game with this.
NO, I'm looking for an approach and an attitude.
Billy Howell wrote:
Make sure it does what it says it will do.
I write a program for which the spec is "accept three numbers, add them up, and print the total". I enter the numbers 2 and 2, and it prints 4. I enter the numbers 1, 2 and 3, and it prints 6. Clearly the program does what it says it will do, so it's tested. In fact, it multiplies them, not adds them, but still prints the correct answers. So my tests don't find my bug, but instead seem to show the program works.

That is a contrived example, and yes, almost any other combination of numbers would show the bug. But I've seen too many developers who think their program is wonderful, and tiptoe round the testing, only trying to prove it works and confirm their own programming abilities. We're getting into psychology rather than pure software engineering, but one of the things I'm looking for is the willingness to try to knock down the thing you've just built, in an effort to prove how strong it is.
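The trap described above is easy to make concrete. These hypothetical functions (invented for illustration) contain the multiply-instead-of-add bug, yet both of the quoted test inputs give the "right" answer; only a third case exposes the defect:

```c
/* Hypothetical "sum" functions containing the bug from the post:
   they multiply instead of adding. */
int buggy_sum2(int a, int b) {
    return a * b;        /* bug: should be a + b */
}

int buggy_sum3(int a, int b, int c) {
    return a * b * c;    /* bug: should be a + b + c */
}
```

Since 2*2 == 2+2 and 1*2*3 == 1+2+3, a test suite built only from those inputs "proves" the program works - which is exactly why testing has to try to break things, not confirm them.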
-
And I was holding back - maybe I'm just evil when it comes to potential employees :wtf: I did once write some interview questions for a junior and tried them out on my own team; they all failed!
You evil person :) I'd probably ask something hard about algorithms and data structures, and something "easy for me" about bithacks. That's probably evil too :) Still, if someone doesn't even understand a min-heap, or tests for "is a power of 2" with a for loop and Math.Pow, there is something wrong with their education.
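For the record, the bithack alluded to above (shown here in C for brevity): a power of two has exactly one bit set, and `x & (x - 1)` clears the lowest set bit, so the result is zero exactly for powers of two.

```c
/* A power of two has exactly one bit set; x & (x - 1) clears the
   lowest set bit, so the result is zero only for powers of two.
   The x != 0 guard excludes zero, which has no bits set at all. */
int is_power_of_two(unsigned int x) {
    return x != 0 && (x & (x - 1)) == 0;
}
```

No loop, no floating point, and constant time - the contrast with the for-loop-and-Math.Pow answer is the whole point of asking.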
-
You can work them out pretty quickly by asking not only the differences between interfaces and base classes, but how and when it is best to apply each. I wouldn't bother asking silly questions like the number of piano tuners in London. Remember, you're hiring a junior developer. They're likely to be a bit shy, so don't terrify them within 2 minutes of the interview starting.
-
OK, now I understand what you are getting at. I wonder if there is a better way to ask the question, though. The question doesn't ask anything about the exhaustive nature of the testing; it only asks the reason for doing it in the first place.
-
Electron Shepherd wrote:
The point is to prove it doesn't work (ie find a defect)
A junior programmer will not know this. They are applying for a JUNIOR position - they have no (or little) experience of these things. It's OK to expect the basics - you don't want to hire someone who doesn't have a clue - but to ask a junior programmer to elaborate on testing? I think you're asking too much. :)
-
First of all, you haven't been "tasked with". You've been asked. "Tasked with" is vapid management bull, serving to make something sound much more dynamic and go-to than it really is.

Second, the silly question: ask how many piano tuners there are in London. This should help you to see how they respond to odd requests and pressure situations, and what their thought processes are like. Do they start by assuming that the population of London is x, and out of that population, y% have pianos, and it takes z tuners to service that many pianos? Or do they tell you that they'd Google it? (Ironically, Chrome's spellchecker doesn't recognise Google; it offers Goggle, Googly, Goodly and the rather fun Go ogle as choices.)

Third, test them on the basics (here I'm assuming the position is for a .NET developer with some experience). Do they know what an interface is? Do they know what an abstract class is? When would you use one over the other? Do they know how to get data out of the database using something other than a DataSet/DataTable?
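The piano-tuner question is a classic Fermi estimate. One possible chain of reasoning, sketched as code - every figure below is an assumption invented for illustration, which is precisely the point of the question (the interviewer is grading the reasoning, not the number):

```c
/* A Fermi estimate for "how many piano tuners in London?".
   Every number is an assumption made up for illustration. */
long estimate_piano_tuners(void) {
    long people_in_london    = 8000000; /* assumed population      */
    long people_per_home     = 2;       /* assumed household size  */
    double homes_with_piano  = 0.02;    /* assumed ownership rate  */
    long tunings_per_year    = 1;       /* per piano, assumed      */
    long tunings_per_tuner   = 1000;    /* one tuner's annual load */

    long pianos = (long)(people_in_london / people_per_home
                         * homes_with_piano);
    return pianos * tunings_per_year / tunings_per_tuner;
}
```

With these made-up inputs the chain gives a few dozen tuners; a candidate who can walk through a chain like this, and defend each assumption, is demonstrating exactly the thought process the question probes.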
I have CDO, it's OCD with the letters in the right order; just as they ruddy well should be
Forgive your enemies - it messes with their heads
"He tasks me. He tasks me, and I shall have him...." Khan about a certain James T. Kirk
A while ago he asked me what he should have printed on my business cards. I said 'Wizard'. I read books which nobody else understand. Then I do something which nobody understands. After that the computer does something which nobody understands. When asked, I say things about the results which nobody understand. But everybody expects miracles from me on a regular basis. Looks to me like the classical definition of a wizard.
-
A junior programmer will not know this. They are applying for a JUNIOR position - they have no (or little) experience of these things. It's OK to expect the basics - you don't want to hire someone who doesn't have a clue - but to ask a junior programmer to elaborate on testing? I think you're asking too much. :)
A junior programmer with no experience of testing? That's not a junior programmer, that's a complete beginner!
-
A junior programmer with no experience of testing? That's not a junior programmer, that's a complete beginner!
Well, when I applied for my first job, "junior programmer" meant that work experience was not required.
-
Ok. Now, I understand what you are getting at. I wonder if there is a better way to ask the question though. The question doesn't ask anything about the exhaustive nature of the testing. It only asks the reason for doing it in the first place.
These are starting points for broader discussions. They aren't questions that someone can get right or wrong. For example, once they've written an answer to:
1. Write code to determine how many bits are "on" in a byte
I'd probably introduce some more constraints. Some candidates use shift operators, shifting the byte left or right one place eight times, and testing the most/least significant bit as appropriate. In that case, I'd ask how they would improve the speed. You could, for example, create a 256-entry lookup table, and use the byte as an index into that. Very quick result. You can then use that as a starting point for a discussion on whether the lookup table should be static or built at runtime, and see what criteria they would apply to that decision. If they started with the lookup table approach, you can still have the dynamic vs static discussion, and then ask what they would do in a memory-constrained situation, where building a table was impossible.

Essentially, I'm trying to engage them, to see a) how they interact with other people and b) what their problem-solving approach is. I'm most definitely not trying to get them to sit an exam which gets marked.
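The two approaches discussed above, sketched in C. Here the table is built lazily at runtime; a static, precomputed table is the alternative the discussion weighs up (function and variable names are my own):

```c
/* Approach 1: shift-and-test, eight iterations per byte. */
int popcount8_shift(unsigned char b) {
    int count = 0;
    for (int i = 0; i < 8; ++i) {
        count += b & 1;      /* test the least significant bit */
        b >>= 1;
    }
    return count;
}

/* Approach 2: a 256-entry lookup table, built lazily at runtime
   here; a statically precomputed table is the alternative, at the
   cost of 256 bytes of memory either way. */
static int bits_set[256];
static int table_ready = 0;

int popcount8_table(unsigned char b) {
    if (!table_ready) {
        for (int i = 0; i < 256; ++i)
            bits_set[i] = popcount8_shift((unsigned char)i);
        table_ready = 1;
    }
    return bits_set[b];      /* the byte indexes straight in */
}
```

The follow-up questions write themselves: when does the table pay for its 256 bytes, and what do you fall back to when even that is too much memory?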
-
I do some of the interviewing for my company; here are some of the questions I like (points are made up for emphasis):

1. For the whiteboard or paper test: devise a recursively searchable tree node structure, and show me an implementation. Then wait for the inevitable questions (a good thing):
"What am I to search for?" (and other class specs) A: a string named Name. (+1 pt)
"What does the recursion return?" A: the node of interest. (+5 pts, not -10 pts)
Forward recursive or reverse recursive? (+10 pts if they know the difference)
... let them work it out...
If they are interviewing without a threading background, I like to see that they use an iterator; if threading, NO ITERATOR. (5 pts)
If they can reason out that they need a temp node for the return value of the recursion before they get that far. (7 pts)
If it works. (10 pts)
If they show me that it can be done in less than, say, 5 minutes. (10 pts)
If they use encapsulation, as per the specs that they ask for. (10 pts)
Yadda-yadda.

2. Usually I try to find one question which is very trivial for one or two of the random things which they claim they know. E.g. PHP: show me how to print the third member of the array bob. JavaScript: what is the difference between '==' and '==='?

3. Ask at least one question with well-defined specs up front but many "suitable" ways to answer it. Reverse the string "the fuzzy red fox". Check to see if they understand the conventions of whichever language you are using for strings, characters, and arrays. You also get to see if they are elegant in their choice of variable names, loop structure, and overall basic programming capabilities.

4. {Are you an active learner?} Do you have any personal projects, and if so, what?
{Are you a member of a community?} Do you frequent any forums, and if so, which one is your favorite and why? (plus 10,000 pts if they say CodeProject :cool: ) If they have a break in their employment history: what have you been doing to stay current with technologies?

Summary: we like to stress the interviewee out to the point where they might be shaking a little (pressure stress), to see how clear-headedly they react and work under pressure. Check to see that th
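A minimal C sketch of the kind of recursively searchable node structure the first question asks for. The first-child / next-sibling layout and the names are my own invention, not something from the post; this is the forward (pre-order) variant, returning the node of interest via a temporary, as the scoring notes suggest:

```c
#include <stddef.h>
#include <string.h>

/* A tree node with a Name, searchable recursively. First-child /
   next-sibling layout allows any number of children per node. */
typedef struct Node {
    const char  *name;
    struct Node *first_child;
    struct Node *next_sibling;
} Node;

/* Forward (pre-order) recursion: check this node, then recurse
   into children, then siblings; return the node of interest,
   or NULL if the name is not in the tree. */
Node *find_by_name(Node *node, const char *name) {
    if (node == NULL)
        return NULL;
    if (strcmp(node->name, name) == 0)
        return node;                      /* the node of interest */
    Node *hit = find_by_name(node->first_child, name);
    if (hit != NULL)                      /* temp node holds the   */
        return hit;                       /* recursion's result    */
    return find_by_name(node->next_sibling, name);
}
```

Whether a candidate reaches for the temporary `hit` unprompted, or tries to return from inside a loop over children, tells you a lot about how comfortable they are with recursion.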
-
Electron Shepherd wrote:
The point is to prove it doesn't work (ie find a defect)
I'd back that up completely. Glenford Myers in his book 'The Art of Software Testing (2nd edn.)' argues that testing to demonstrate that errors are not present is impossible, and that proving that a program does what it is supposed to do does not prove that the program is error free. Therefore, Myers defines black box testing as “The destructive process of trying to find the errors (whose presence is assumed) in a program.” Myers, G. (2004) The Art of Software Testing (2nd edn.) Hoboken: John Wiley & Sons
Nobody can get the truth out of me because even I don't know what it is. I keep myself in a constant state of utter confusion. - Col. Flagg