Interview questions - best way to learn the answers
-
Through inheritance, a class can be used as more than one type; it can be used as its own type, any base types, or any interface type if it implements interfaces. This is called polymorphism. In C#, every type is polymorphic... This has nothing to do with run time, right? Code changing during run time (a DLL getting dynamically created) is known as "reflection", correct?
>Through inheritance, a class can be used as more than one type
The key phrase there is "can be used". You have no "usage" in your example. All you have is inheritance. You don't have polymorphism until the types get used as different types. I know where you got that definition, and typical of Microsoft it is somewhat poorly worded. But technically, because it says "can be used", I wouldn't call it incorrect. But your original "nutshell" definition is incorrect. I'm not giving you a hard time, I just wouldn't want you to answer that way in an interview.
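To make that "usage" point concrete, here is a minimal sketch (not part of the post above; Shape, Square and Circle are invented names for illustration). The class declarations on their own are just inheritance; the polymorphism only appears where the derived objects are handled through the base type.

using System;
using System.Collections.Generic;

class Shape
{
    public virtual double Area()
    {
        return 0;
    }
}

class Square : Shape
{
    public double Side = 2;
    public override double Area()
    {
        return Side * Side;
    }
}

class Circle : Shape
{
    public double Radius = 1;
    public override double Area()
    {
        return Math.PI * Radius * Radius;
    }
}

class UsageDemo
{
    static void Main()
    {
        // Declaring Square and Circle is only inheritance.
        // Treating them both as Shape here is the "usage" -
        // this is where the polymorphism actually happens.
        var shapes = new List<Shape> { new Square(), new Circle() };
        foreach (Shape s in shapes)
        {
            Console.WriteLine(s.Area()); // dispatched on the runtime type
        }
    }
}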
-
I consider myself a good developer; fellow developers and managers, as well as clients, have told me the same. I code to standards and make sure it is done correctly. So why is it that in an interview, when asked a question about code, I get stumped and am not able to answer it correctly? Am I the only one that does this? Can you BE a great developer without being able to recite the definition of polymorphism or the like? I know I can do the work, very well. So what can I do to learn the definitions of things? I am thinking of making cue cards and going from there. They have helped me in the past. What do you think? What is the best way for you to learn? Also, do you know definitions and meanings of everything you do? Thanks
Sorry to say, but I can remember all those definitions. Furthermore, teams I have worked with know what patterns are, and do talk about them occasionally. Knowing what things are called is the first step in being able to communicate about them. It means you understand the nugget of information and can abstract it mentally and look at it and think about it. There is an actual difference between being able to write a virtual function and knowing what polymorphism is. When I am a hiring manager, I look for this difference in a senior person, because not everybody gets it. My advice to you is, if you can't remember what (for instance) polymorphism is, then your training is not complete. You need to study OO design, not just memorize definitions. It's not just for interviewing, but for the good of your whole career.
-
>Code changing during run time (a DLL getting dynamically created) is known as "reflection", correct?
Well, I think you're jumbling several different issues together. Reflection is a different issue. Often, if you're using reflection in .NET, you're doing "anti-OO": you're finding out stuff about the object type when polymorphism would not require you to do that. (DLL creation is a third topic altogether.) Here are two code snippets in C#. The first is an example of polymorphism. The second is an example of using reflection to do a similar thing.

abstract class Animal
{
    public virtual void Move()
    {
        Console.WriteLine("An animal is moving");
    }
}

class Bird : Animal
{
    public override void Move()
    {
        Console.WriteLine("A bird is flying");
    }
}

class Fish : Animal
{
    public override void Move()
    {
        Console.WriteLine("A fish is swimming");
    }
}

// A variable of type Animal can hold an Animal, a Bird or any
// other subtype. Classes are static, objects are dynamic,
// so using bird1 in this way can be considered polymorphism.
Animal bird1 = new Bird();
Fish fish1 = new Fish(); // this by itself is not polymorphism

foo(bird1);
foo(fish1);

void foo(Animal animal)
{
    animal.Move(); // Polymorphism.
    // Where this goes is dynamically determined at runtime,
    // and if you were shown only the foo function, you
    // could not tell where it goes.
}

Or...

// Reflection
// You are asking the object to "look at itself" (reflect) to determine
// what type it is from its metadata.
if (animal.GetType() == typeof(Bird))
{
    Console.WriteLine("A bird is flying");
}
else if (animal.GetType() == typeof(Fish))
{
    Console.WriteLine("A fish is swimming");
}
-
We mentioned earlier interviewers that were fishing for specific answers. Here is one you might run into. For C++ and C#, sometimes they want to hear the word "virtual" when speaking about OO or polymorphism, because this is the keyword that enables polymorphism. If you do not have the keyword virtual in your code, then you do not have polymorphism. (Some would argue that you could, though, simply by having derived class objects being assigned to base class variables, even if virtual methods are not overridden anywhere. Others would argue that you would have to have polymorphic *behavior*, and that requires overridden methods.) If you see the word virtual, then you know there is at least the potential for polymorphism. You would have to inspect the code to see if that method is ever overridden. Polymorphism doesn't necessarily exist simply because of the virtual keyword. In math that's called a "necessary but not sufficient condition". Without the virtual keyword, there's no polymorphism. With it, there might be polymorphism.
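To illustrate the "necessary but not sufficient" point, here is a minimal sketch (not part of the post above; the Logger classes are invented names). The virtual keyword only creates the possibility; an override reached through a base-typed reference is what produces polymorphic behavior at runtime.

using System;

class Logger
{
    // virtual: polymorphism is now *possible* through a Logger reference
    public virtual void Write(string message)
    {
        Console.WriteLine("base: " + message);
    }
}

class PlainLogger : Logger
{
    // no override: calls through a Logger reference still run the base method
}

class PrefixLogger : Logger
{
    // override: calls through a Logger reference are dispatched at runtime
    public override void Write(string message)
    {
        Console.WriteLine("prefix: " + message);
    }
}

class VirtualDemo
{
    static void Main()
    {
        Logger a = new PlainLogger();
        Logger b = new PrefixLogger();
        a.Write("hello"); // "base: hello"   - virtual alone, no polymorphic behavior
        b.Write("hello"); // "prefix: hello" - virtual plus override = polymorphism
    }
}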
-
Chris Losinger wrote:
in my 17 years of programming, i have never had a discussion with a co-worker about a "pattern".
Switch to Java (or SmallTalk if it still exists) and you'll have plenty of such discussions.
Nemanja Trifunovic wrote:
Switch to Java (or SmallTalk if it still exists) and you'll have plenty of such discussions.
Not in my experience. Even on the java forums (the Sun ones) discussion of patterns is low. And even lower still when one removes the questions from inexperienced users.
-
OK, this'll be a little long because this is a huge bugbear of mine. The only reason you can't explain something is that you don't understand it. I have never met anyone who was unable to explain something that they understood. I meet lots of people who think they understand things but can't explain them, and on pressing they discover they don't. I fall into this category myself quite often.
I saw this so many times when lecturing. People could give the textbook definition of something but couldn't explain it. It's the difference between knowing and understanding. I test knowledge by whether or not someone can "teach" it. So in an interview situation I would ask the person to "teach" me how something works, or why some technique can be handy. Textbook definitions are meaningless in that situation. If you find you are stumped when it comes to explaining polymorphism it's because a) you don't actually understand it and b) while compensating for not understanding it you are getting hung up on the textbook definition. For the record, my significant other has exactly the same problem as you, also with polymorphism, funnily enough.
Now, you can use techniques in your code without fully understanding them. The hundreds of libraries, tools, frameworks and principles etc. that we have to use mean that we spend big chunks of our time using things that we only superficially know (but don't understand). I still need my cheat sheets for a significant number of technologies. And that's fine; it's fine with technologies and tools to not get them on a deep level. If I had someone sitting in front of me who had trouble remembering the exact syntax of how to mock an object using Rhino Mocks, it would be interesting, but I wouldn't care too much. If they couldn't explain in their own words why you would want to mock an object I'd be more concerned. If I had someone sitting in front of me who claimed to be good at OO but who couldn't teach me about polymorphism I'd be very concerned.
The best way to test whether you understand something is to find someone who doesn't understand it and try to teach them. If you feel you get stuck on polymorphism then use it. Create a demo and show it to a work colleague. You will feel the click in your head when you move from knowledge to understanding (for me it's "duhn duhn", the sound from Law And Order; it might be different for you). Stick at it, but forget about memorising textbooks. -Richard
Hit any user to continue.
Richard A. Dalton wrote:
The only reason you can't explain something is that you don't understand it. I have never met anyone who was unable to explain something that they understood. I meet lots of people who think they understand things but can't explain them, and on pressing they discover they don't. ....
However, that applies to the interviewer as well. And given that the most likely reason the interviewer is in that position has absolutely nothing to do with their people skills, it is probably just as possible that one will end up in a guessing game where one succeeds only by spewing out exactly the right technical term.
-
Swelborn wrote:
So why is it that in an interview, when asked a question about code, I get stumped and am not able to answer it correctly?
That's the way it's set up :)
Swelborn wrote:
Can you BE a great developer without being able to recite the definition of polymorphism or the like?
You don't have to answer everything correctly, you're not Google and no-one will be expecting that you can rehash all your studybooks. You will be judged on how you react when confronted with something that's not in your short-term memory. Will you propose to further investigate, or would you become angry?
Swelborn wrote:
I know I can do the work, very well. So what can I do to learn the definitions of things?
Being able to sum up (correct) definitions doesn't impress me - too many developers who can vaguely tell what's on the stack and what isn't, while not being able to implement basic error-handling. Show me that you understand the definition, I want to make sure that you know what you're doing. And it's a bonus if you have a strategy for the moments that you're confronted with a question/situation that you don't know the answer to.
Swelborn wrote:
What is the best way for you to learn?
To teach :)
Swelborn wrote:
Also, do you know definitions and meanings of everything you do?
Yes/no. Once there's a need to explain something, you'll need to define some things. The most concise explanation is often equal to the definition of a subject.
I are Troll :suss:
Eddy Vluggen wrote:
You don't have to answer everything correctly, you're not Google and no-one will be expecting that you can rehash all your studybooks. You will be judged on how you react when confronted with something that's not in your short-term memory. Will you propose to further investigate, or would you become angry?
That is idealistic. There is no reason to believe that programmers as a group have anything but average interview skills. At best.
-
Member 2941392 wrote:
Sorry to say, but I can remember all those definitions. Furthermore, teams I have worked with know what patterns are, and do talk about them occasionally.
Then presumably you are capable of teaching someone, on those rare occasions when it seems relevant because it did come up in a conversation. Which would certainly be more than enough in almost all situations where it is actually important.
Member 2941392 wrote:
Knowing what things are called is the first step in being able to communicate about them. It means you understand the nugget of information and can abstract it mentally and look at it and think about it.
One can only wonder, then, how people discussed pattern implementation before the word 'pattern' was used to encapsulate it. Since all of the patterns in the GoF book, by definition, existed before the book, how exactly are you proposing that they were used and understood?
Member 2941392 wrote:
There is an actual difference between being able to write a virtual function and knowing what polymorphism is.
And yet I have been using virtual functions for more than 25 years and have very, very seldom felt a need, much less a requirement, to classify a difference specific to "polymorphism." I have also had many conversations about virtual functions and almost none that mentioned polymorphism.
Member 2941392 wrote:
When I am a hiring manager, I look for this difference in a senior person, because not everybody gets it.
Myself I understand that my skill in programming is unlikely to translate into communication skills much less people skills. And that is very likely true for other programmers, regardless of their skill level.
Member 2941392 wrote:
My advice to you is, if you can't remember what (for instance) polymorphism is, then your training is not complete. You need to study OO design, not just memorize definitions. It's not just for interviewing, but for the good of your whole career.
I am quite certain that someone could have an incredible career without knowing an exact definition for polymorphism, not even really knowing what it means, and most certainly without the ability to spew it out at a moment's notice at the whim of any random interviewer who thinks term definition is actually objectively significant.
-
Marc Clifton wrote:
Personally, what I find much more difficult, interesting, and useful, is learning the lingo of the domain, be it Wall St. or aeronautics or *cough* the entertainment sector, boatyards, etc.
The way I earn my living, this is essential. I don't suppose I write academic-standard code (but hey, it compiles, runs, and is stable :) ) but when I get a project, the first thing I do is really understand the client's business, to the point where I can sit in a management meeting and understand everything, jargon and all. I also interview all the users who are going to work on the system, because their understanding of the problem is often very different to management's at the fine detail level. This also gives the users some feeling of ownership in the project, and this can be hugely useful further down the road. Oops - wandered off topic there, sorry. :-O
Chris C-B wrote:
but when I get a project, the first thing I do is really understand the client's business, to the point where I can sit in a management meeting and understand everything, jargon and all.
That's OK if the client actually understands their own business. My boss, for example, asked me to create a servicing database for the machines we make; the only problem was that he doesn't understand how the services are organised in the first place! :doh: And what makes this little tale even better is the fact that he wanted ME to design the database functionality to model servicing when he can't tell me what procedures are used in the first place. I wouldn't mind if it were a really big company employing 100s of people, but it's not; it's a two-man band with 4 staff on the shop floor. Talk about pissing in the wind.
Nobody can get the truth out of me because even I don't know what it is. I keep myself in a constant state of utter confusion. - Col. Flagg
-
>And yet I have been using virtual functions for more than 25 years and have very, very seldom felt a need, much less a requirement, to classify a difference specific to "polymorphism."
I've been using pencils for 40 years, but that doesn't make me an artist. Whether or not you're a good OO programmer doesn't have much to do with how long you've been "using" virtual functions; it has more to do with how well you understand OO programming.
>I have also had many conversations about virtual functions and almost none that mentioned polymorphism.
I have had many conversations about gas mileage, and none of them mentioned internal combustion. If I were speaking to my mechanic, I would not expect him to mention internal combustion. If I owned a car shop and were interviewing a mechanic, I would expect him to know what internal combustion was. If I'm speaking to my programming peers, we all know we know polymorphism because of the implicit combination of phrases like "virtual", "at runtime", "inheritance", "type conversion", etc. But of course every one of us can "spew it out" at a moment's notice if asked for the definition.
>Myself I understand that my skill in programming is unlikely to translate into communication skills much less people skills.
Since the OP mentioned an interview, I'm assuming we're talking about the whole package here - software development in a team environment. Programming is a big part of that, but not all. If your "skill" is unlikely to translate into communication of your ideas, or people skills, you're much less likely to be a successful software developer.
>I am quite certain that someone could have an incredible career without knowing an exact definition for polymorphism
The first person to mention "exact definition" in this thread, I think, is you.
>not even really knowing what it means, and most certainly without the ability to spew it out at a moment's notice at the whim of any random interviewer who thinks term definition is actually objectively significant.
"Spew" it out? Getting a little sarcastic, aren't we? "Random" interviewer, or the specific one who can grant you the job at the company you want? I find your arguments a little silly. If you called yourself an OO programmer and yet could not provide me with a basic explanation of polymorphism, you certainly wouldn't get a job at any company I've ever worked at. This is not rocket science, and it's not irrelevant academia or trivia. It's a fundamental concept of OO programming.
-
This is probably one of the best threads I've read in a while. I'm glad to know that I'm not the only one out there. I'm a software architect, but I'm also an Infantryman in the Army National Guard. For those of you in foreign countries, this basically means that I am a volunteer military reservist. I've been called up for military duty five times during my programming career. My last deployment was to Iraq.
My memory has not been the same since I returned from a one-year deployment to Iraq. My programming ability is still there, but I sometimes have a hard time remembering syntax and definitions. I've compensated for this by writing a private blog that functions as a knowledge base. When I do something at work that I know I will have to remember, I write what I did in the blog and save it. Now that process, syntax, or procedure is in the blog so that I can search for it and find it if I can't remember how I did something.
I can't even begin to describe the frustration I encountered when I returned from Iraq and started interviewing for work. I could not remember the answers to basic syntactical questions. I found that drawing my answers on a notepad or a whiteboard before I spoke sometimes helped me remember things in interviews, but many interviewers just found this odd. Usually, other software architects would just look smug as I squirmed in a chair trying to remember how to write a join command in SQL. I knew when to use a join command, but for the life of me, I could not write one from memory.
The funny thing is that I could remember high-level things just fine. If you asked me a logical programming question, I could describe how I could do something. I could describe how I did things in the past and why. I could tell you the difference between an interface and an abstract class and when you would want to use one or the other. I flew through the technical phone screens just fine, but when I was face to face with someone and had to describe how I would access a SQL database, I just couldn't write it without using IntelliSense or Google. My brain just didn't have the words anymore.
I'm getting better at remembering syntax. A lot of this comes from me forcing myself to memorize things that I used to be able to remember with ease just a few years ago. The Code Project has helped. I check the Q&A forums once a day to see if I can answer any questions. So no, brother. You are not the only one.
Ryan, I suspect your problems were not so much to do with memory loss or anything that happened in Iraq, but simply the natural result of being away from programming for an extended period. I can only imagine what it's like to interview after that long away. If I were interviewing I'd hope I could tailor the interview to get around that. It's also common with women who've been on maternity leave, for example. I suspect that if you tried now to remember the syntax of something you have been using a lot recently you'd have less trouble. I chop and change between different databases a lot, and between VB and C# etc. I still have to remind myself that variables are declared differently in VB and C#, and more than once I've stuck a ; on the end of a VB statement, or forgotten them in C#. A few weeks ago I had to google the syntax for a cursor in PL/SQL even though a year or two ago I was writing the damn things every day. What you're touching on again here is knowledge vs understanding. Knowledge is fickle; it'll leave you if you don't use it. But it's a bit of a slut and it'll come rushing back the first time you show a bit of interest in it. -Richard
Hit any user to continue.
-
Ooh, wow. It almost sounds like somebody defending ignorance. I'm sure you can have a fine career and make many dollars without being able to communicate. I'm also pretty sure that people (unless they have a brain injury) can remember the words that name concepts that they understand. Your inability to remember the word probably indicates you don't grasp the concept. I hope you can have a fine career and make many dollars without understanding polymorphism or any other concept. I'm sure there will always be folks around with CS degrees and years of experience who can explain the concepts to you again and again, so that you need never internalize them. You're right. What bearing could understanding software concepts possibly have on writing programs? Only an idiot would interview for such knowledge!
As for patterns, the whole point of the design pattern movement is to give patterns names so you can speak about them and remember them. While the patterns existed before they had names, many programmers, even experienced ones, weren't aware of using them. They developed the same things over and over again from first principles. They often made mistakes (google "double-checked locking" or "singleton" for examples). By naming and providing standard descriptions of patterns, you can talk about or code them by reference to a body of written-down knowledge. This is all in the introduction to the GoF book.
A lot of less experienced, self-taught developers figure they're so smart they can program rings around their peers even if they always have to work from first principles. A lot of experienced devs know that nobody is that smart.
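As one concrete illustration of the point about named patterns, here is a minimal sketch (not part of the post above; the Configuration class is an invented example) of the Singleton pattern in C#. Using Lazy<T> gives the thread-safe lazy initialization that hand-rolled double-checked locking so often gets wrong.

using System;

public sealed class Configuration
{
    // Lazy<T> provides thread-safe, lazy construction out of the box,
    // avoiding the hand-written double-checked locking mistakes the
    // post refers to.
    private static readonly Lazy<Configuration> instance =
        new Lazy<Configuration>(() => new Configuration());

    public static Configuration Instance
    {
        get { return instance.Value; }
    }

    // Private constructor: nothing outside this class can create another instance.
    private Configuration()
    {
    }
}

// Usage: Configuration.Instance always refers to the same single object.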
-
jschell wrote:
That is idealistic.
Yes. Did I set the bar too high?
jschell wrote:
There is no reason to believe that programmers as a group have anything but average interview skills. At best.
He needs to stand out from the other applicants :) "Programmers as a group" is a very broad public, with a lot of variation in skills. You don't need to encourage someone who loves to code to learn a language or a definition; they'll do that anyway.
I are Troll :suss:
-
That is my stock answer when anyone asks me about polymorphism. Some get it and smile, and some just don't. If they give me a strange look I explain about parameter types, method signatures, and such.
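Since the post mentions parameter types and method signatures, here is a minimal sketch (not part of the post above; Printer and its overloads are invented for illustration) of that signature-based, compile-time flavor of polymorphism, i.e. method overloading.

using System;

class Printer
{
    // Same method name, different signatures: the compiler picks the
    // overload from the parameter types at compile time.
    public void Print(int value)
    {
        Console.WriteLine("int: " + value);
    }

    public void Print(string value)
    {
        Console.WriteLine("string: " + value);
    }

    public void Print(double value)
    {
        Console.WriteLine("double: " + value);
    }
}

class OverloadDemo
{
    static void Main()
    {
        var p = new Printer();
        p.Print(42);   // int overload
        p.Print("42"); // string overload
        p.Print(4.2);  // double overload
    }
}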
Simply Elegant Designs JimmyRopes Designs
Think inside the box! ProActive Secure Systems
I'm on-line therefore I am. JimmyRopes
-
I use polymorphism all the time (as I imagine we all do) yet the word is too abstract (no pun intended) for me to spit out a concise definition on demand, but if you want me to describe how inheritance can be used to change the behavior based on type, I can do that readily. On the other hand, I have a friend who can wax eloquently on polymorphism for hours but couldn't explain type inheritance. He lives in a much more abstract world than me, while I live in a rather more concrete world. We have great discussions, because I can take his abstractions and put them into some really interesting implementation, and when I talk to him about implementation, he often points me to new ideas in abstraction. The point being, I don't really think it's a lack of understanding that I can't spit out the definition of polymorphism, it's more related to what domain (that word again) I live in and where I choose to focus my attention. Most IT techy-words I basically just bleep over, like Linus reading War and Peace, because I don't connect to abstract terms. Maybe I should have taken Latin in school. ;) Marc
In a recent job interview I was asked what polymorphism meant. I explained that when referring to software the term is the exact opposite of the proper meaning. Polymorphism traditionally means a thing that has different outward appearances -- a compound with different crystal structures or a species with different forms (the only example I can think of is a tadpole and frog). In software design it seems to mean different things that have the same outward appearance (interface). Anyway, I didn't get the job. BTW, Marc your articles on CP inspired me to write one - http://www.codeproject.com/KB/architecture/develmethodologies1.aspx.
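To put that reading into code terms, here is a minimal sketch (not part of the post above; IStore and the two implementations are invented names): the caller only ever sees the one outward appearance, IStore, while the objects behind it are genuinely different things.

using System;
using System.Collections.Generic;

interface IStore
{
    void Save(string key, string value);
    string Load(string key);
}

// Two different things behind the same outward appearance.
class MemoryStore : IStore
{
    private readonly Dictionary<string, string> data = new Dictionary<string, string>();

    public void Save(string key, string value)
    {
        data[key] = value;
    }

    public string Load(string key)
    {
        string value;
        return data.TryGetValue(key, out value) ? value : null;
    }
}

class ConsoleStore : IStore
{
    public void Save(string key, string value)
    {
        Console.WriteLine(key + "=" + value); // write-only "storage"
    }

    public string Load(string key)
    {
        return null; // nothing to read back
    }
}

// Code written against IStore cannot tell (and does not care) which one it got.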
Andrew Phillips http://www.hexedit.com andrew @ hexedit.com
-
I feel the same.
manoj sharma 09911087802 manoj.great@yahoo.com
-
Andrew Phillips wrote:
I explained that when referring to software the term is the exact opposite of the proper meaning.
Outstanding! I will have to use that. I've always had an issue with that word; now I know why!
Andrew Phillips wrote:
BTW, Marc your articles on CP inspired me to write one -
I just scanned it; it's great! Gave you a five - I really like the simplicity of your analogy. One comment, though: I was disappointed that you didn't write more about iterative development, as that is actually the form that I find works best and therefore the one I have a deeper personal investment in. :) If you're interested, I could write you up a couple of paragraphs on how I found it to work, and its strengths and weaknesses, that you could then add to the article. Marc
-
>The way I see it, you learn karate so you need never use it.
Wow, what a bad analogy :-)