What are 'the basics'?
-
Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code - instead of waiting until two weeks before the release and running a performance validator that invalidates 6-12 months of development effort!
IMHO the best developers are a mix of:
- implementation knowledge - IDEs, frameworks, version control, etc.
- theory (the comp sci thing)
- technical writing and presentation skills, the ability to sell an idea
- interpersonal skills and the ability to mentor
- the mindset that simple is better, and that the methodology or language du jour is not always best
-
The definition of 'the basics' probably changes with time. It might have been binary arithmetic and machine assembly a long time ago, but now it has become very broad, depending on the area of computing.
TOMZ_KV
Perhaps 'the basics' (be it Computer Science or Programming) begin with understanding the five basic functions of a CPU:
1. Input/Output (read/write)
2. Program Control (branch, jump, compare)
3. Arithmetic (add, subtract - everything else stems from those two)
4. Data Transfer (load register / store register / move / etc.)
5. Logical (boolean operations, including bitwise functions like OR/XOR and Shift)
Or perhaps understanding the four basic components of a desktop computer:
1. Input devices (mouse/keyboard/barcode scanner/etc.)
2. Output devices (printer/monitor/etc.)
3. CPU (includes GPU now)
4. Storage devices (disk/CD/DVD/thumb drives/etc.)
Of course, you could throw in the things I had to learn in college, like Hollerith code, wiring boards in IBM accounting machines (an early form of "programming"), bios of Charles Babbage and Alan Turing, etc. But the VERY MOST basic of 'the basics' is this: totally understanding the friggin' problem you are trying to solve. Programming is not an end unto itself. And Computer Science is not a science devoted to its own sphere of existence.
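To illustrate the claim that everything stems from add/subtract plus the logical operations, here is a minimal sketch (Python used purely for readability) of multiplication built from nothing but shifts, adds, and bit tests - roughly the way a simple ALU does it:

```python
def multiply(a, b):
    """Multiply two non-negative integers using only shifts,
    adds, and bit tests - no multiplication operator."""
    result = 0
    while b:
        if b & 1:          # lowest bit of b set? add the current shifted a
            result += a
        a <<= 1            # shift a left (doubles it)
        b >>= 1            # shift b right (consume one bit)
    return result

print(multiply(13, 11))  # 143
```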
-
I was looking over the newsletter this morning and caught this posting: New Generation does not really understand computers[^] Which has left me wondering what exactly the basics are. I understand the concepts behind computers, from basic circuits and bit math up through general language concepts and framework use, almost entirely from my education. I can take a given concept and implement it in at least four languages off the top of my head, not counting C# and VB.NET as separate. But I will admit file parsing was passed right over in my program (Software Engineering rather than Computer Science). So what exactly are the basics of computer science? I figure having a clue here may help those of us who are of that generation avoid things like this in the future.
The top of the list is data structures. Then basic algorithms. You should know the tradeoffs between choices of data structures and algorithms. Generally useful techniques: hashing, state machines, and the design of a GUI that responds to keyboard and mouse events in at least one language. For math, geometry seems the most generally useful, followed by linear algebra and probability. Graph theory helps with abstract reasoning about connections (which is often useful) but isn't essential. If you're doing engineering/technical programming, then calculus and numerical methods. Basic software engineering: object-oriented programming (virtual functions, inheritance, and polymorphism), and function-oriented programming (the mark of a beginner is 200-300+ line functions). Hexadecimal, binary, ASCII, and Unicode. Boolean operations.
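As one concrete instance of those tradeoffs, here is a small sketch (Python, with invented data) comparing membership tests on a list (linear scan) against a set (hash lookup) - the kind of choice the post is talking about:

```python
import timeit

items = list(range(100_000))
as_list = items        # O(n) membership test: scans element by element
as_set = set(items)    # O(1) average membership test: hash lookup

needle = 99_999        # worst case for the linear scan: last element

list_time = timeit.timeit(lambda: needle in as_list, number=100)
set_time = timeit.timeit(lambda: needle in as_set, number=100)

# Same answer either way; wildly different cost.
print(needle in as_list, needle in as_set)
print(set_time < list_time)  # the hash lookup wins by orders of magnitude
```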
-
One basic: understand (roughly) how your code is laid out in memory at execution time. (Arrows and boxes on a whiteboard suffice.) E.g. I saw some code the other day that probably created 5,000-10,000 string objects when it could have used a single dynamic buffer with maybe a single re-alloc. It was simple to read the algorithm, but it would not scale well.
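The buffer-versus-many-objects point can be sketched in a few lines (Python used for illustration; in .NET the analogue would be StringBuilder instead of repeated string concatenation):

```python
# Building a big string by repeated concatenation may create a fresh
# string object on each step - the pattern criticised above.
def concat_naive(parts):
    out = ""
    for p in parts:
        out += p        # may allocate and copy a new string each time
    return out

# Accumulating pieces and joining once does a single final allocation -
# the "single dynamic buffer" approach.
def concat_buffered(parts):
    return "".join(parts)

parts = ["x"] * 10_000
assert concat_naive(parts) == concat_buffered(parts)
```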
-
It comes down to realising that every instruction you ask the machine to execute has physical implications: memory allocation, CPU cycles used, power consumed, screen space needing to be redrawn. For me the basics are understanding how a computer actually does stuff, and asking the computer to do this in a sensible way by following sensible, tried-and-true patterns and using efficient algorithms. Tied closely is understanding the fundamentals of the framework and library you are using, so you can colour your judgement calls appropriately. After this there's the level of how you actually write code: architecture, testing, code cleanliness and maintainability. Basically: don't be selfish and write code only for yourself. Write it for others.
cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP
You nailed it. I read a lot of comments about this and this was the one I most agree with. An understanding of "this code will make the computer do..." whatever it does, is severely lacking in most beginners I see these days. The problem with that is, they don't write good code by instinct - they have to think about it. Here's an example... a real-life example, something I'm working on now...
if (TB_Grade.DataSource == null && !IsPostBack)
{
    // stuff to do...
}
if (DD_CreditType.DataSource == null && !IsPostBack)
{
    // stuff to do...
}
Anyone who knows the basics can see the problem with that. If you know the basics, and you have a little experience, you wouldn't even type that in the first place... you would type this:
if (!IsPostBack)
{
    if (TB_Grade.DataSource == null)
    {
        // stuff to do...
    }
    if (DD_CreditType.DataSource == null)
    {
        // stuff to do...
    }
}
I find there are a lot of programmers coming out of college these days who don't know why the second one is better. They could explain it after a while, but this kind of thing should be instinct. Knowing how computers work, and the basics of what you're doing to the machine, will help you understand why checking things twice is going to be a problem. I see a lot of code lately which looks like it was written for infinitely powerful machines with infinite memory - because the programmers didn't understand resource usage. Now that we don't have to explicitly allocate memory and such, people just don't think about it - but it's still happening, and you need to understand how and why.
-
Distind wrote:
So what exactly are the basics of computer science?
I've been programming computers for nearly 40 years now, and if there's one thing I've learned about "the basics", it is that today's technologies are tomorrow's basics. So, today's basics are the things you learn so you can better understand today's technologies - why they behave the way they do, why they were implemented the way they were, and why they fail the way they do. Pick a technology you use today, and ask yourself how it works inside and why it was implemented the way it was. If you don't know, you have some basics to fill in. There are so many technologies these days that there is no one list of "basics" that will help you understand any technology you are likely to use.
patbob
-
Jasmine2501 wrote:
An understanding of "this code will make the computer do..." whatever it does, is severely lacking in most beginners I see these days.
Exactly which year in the past was it when beginners did understand that?
Jasmine2501 wrote:
I find there are a lot of programmers coming out of college these days who don't know why the second one is better.
Where "better" means what exactly? Faster? Less cost to maintain? Paid by the curly bracket, so a higher billing? Anything I can think of that makes that code snippet "better" (with no other information) puts its actual utility so far beneath the noise level that one would need to delete the entire code base to move it up. Conversely, I would much rather, for example, have a "beginner" demonstrate a basic understanding of a profiler and be able to create a small design document than produce your code snippet.
-
It depends on what you are programming, what is demanded from you, and how good you want to be. You may get by knowing nothing more than how to manipulate data with your language if you are making a business application or something duct-taped together for some rip-off company, but you will never be able to create, for example, a good driver, or code an OS, or anything that can be used as the infrastructure necessary for databases, internet servers, etc. to exist. Knowing all you listed and actually being able to use those things are two separate issues. If you are not patient, can't visualize, can't manipulate and mutate data, or can't remember crucial things when they are needed, it's not going to help you much. It is like saying: "I know all mathematical operations, I know 12,000 digits of Pi, and I can translate your problem into 4 (earth) languages, therefore I can solve any problem you throw at me." The thing is, you either get it or you don't. Just because in the past there were hardcore people who actually made stuff that worked and made it easier for beginners, it does not mean that anyone can go and do hardcore. To do hardcore you have to be hardcore (learn and work until you fall asleep, and forget about drugs). Doing hardcore stuff does imply draconian coding/learning/debugging cycles. It does, however, also imply a job well done, and a job well done consists of great code, great documentation, and a simple, pleasant user interface with advanced options where needed but hidden. The user interface comes at the end of the development cycle. For a small app, given fewer than 10 cycles of user testing and coding - implementing the user's ideas as far as possible while introducing my own - the user interface ends up as all of the above.
Of course, having an intelligent user who understands what the application is supposed to do (not how it's built) is essential, for obvious reasons - if you build a dog house and test it with fishes, no one is going to be happy except the worms, and maybe the grass too, if you remembered to test outside and add water at the same time :)
I like being sober. It gives some kind of quality to life that I can't really put my finger on... it is like running life from a console - way fewer colors, but so much more control.
modified on Tuesday, October 5, 2010 8:33 PM
-
Basics, hmm, there are lots of basics; it depends on what you are doing. We can go as deep as semiconductors and counting the electrons in them. But if you are doing web stuff like ASP.NET or PHP, why bother? You always have to find the right buttons to push. It is nice to know it all, but I think the older guys don't understand that people who know some frameworks worked really hard to know one inside and out. Learning a framework is like learning a whole new language, even if you know the language it was written in. In my opinion, the basics for programming are: control statements like loops and ifs, then structures like tables, stacks, and queues - with these one can achieve quite a lot, and these are the basic things. The absolute minimum is also knowledge of the types of variables, and how to convert them. Everything else comes in time as one programs, and I've seen people who did not understand tables. As for understanding what the machine is doing - who cares about one if statement? One if does not take any significant CPU time nowadays. Maybe when you are an embedded developer, then yes, but on a full-blown computer one if does not make a difference, so knowing that type of thing is not helping.
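For what it's worth, the stack/queue distinction mentioned above can be shown in a few lines (a Python sketch, using collections.deque for the queue):

```python
from collections import deque

items = [1, 2, 3]

# Stack: last in, first out - push and pop happen at the same end.
stack = list(items)
stack.append(4)
assert stack.pop() == 4      # the most recently pushed item comes off first

# Queue: first in, first out - append at one end, remove from the other.
queue = deque(items)
queue.append(4)
assert queue.popleft() == 1  # the oldest item comes off first
```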
-
The single most valuable course I ever took in my Computer Systems Engineering program was... ahh... I forgot the name, but it started out like this: 1. Get a Motorola 68HC12 dev board. 2. Program it with machine code, in hex, to blink an LED. No assembly, no compilers, no libraries. The I/O on this microcontroller was memory-mapped, so the program would simply write a bit high and low, but we had to look up the hex for each processor command and use the memory diagram to know where to write our program, where to write to I/O, and where to jump for functions. Later in the course we moved up to assembly, and even C, on the same platform. In my opinion, you don't need to do this for a programming career, but if you want to be a half-decent programmer, do this until you understand it.
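The memory-mapped I/O idea can be sketched without any hardware: treat memory as a flat array of bytes, pick an address to stand in for the LED port (the address below is invented for illustration), and "blink" by writing one bit high and low:

```python
# Simulated memory-mapped I/O. On real hardware the LED port sits at a
# fixed physical address; 0x80 here is a made-up stand-in.
memory = bytearray(256)
LED_PORT = 0x80
LED_BIT = 0x01

def led_on():
    memory[LED_PORT] |= LED_BIT          # write the bit high

def led_off():
    memory[LED_PORT] &= 0xFF ^ LED_BIT   # write the bit low

led_on()
assert memory[LED_PORT] & LED_BIT        # LED is "lit"
led_off()
assert not memory[LED_PORT] & LED_BIT    # LED is "dark"
```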
-
I'm not bloody telling you! That would be to give away my income.
-
Chris Maunder wrote:
Basically: don't be selfish and write code only for yourself. Write it for others.
"others" does include your "future self" - try reading code you wrote a year or more ago and haven't touched since, and you will realize that clean and readable code is just as important for your own good.
-
I've seen similar pieces of code frequently and always crave to fix them, but unfortunately, most often they're more convoluted than this example and involve else cases, calls to functions that may have side effects (global variables, or huge classes that basically make lots of variables quasi-global), or similar things which make refactoring difficult. :doh: But then, parts of the codebase are older than twenty years, and I don't feel like "touching a running system" if I don't need to...
-
For me, 'basics' include understanding how a computer will walk through your code, i.e. the order of evaluating expressions:
- If there is a function call inside an expression, this function will have to be executed before the expression can be evaluated.
- If the above-mentioned function calls change in some way the value of other parts of the expression, the resulting value will be undetermined, unless the order of evaluation is clear (and even then, the result might be unexpected/wrong).
- If you have multithreaded code, the state of globally shared variables may change at any time, so there is no point in reading them and storing them locally for later reference.
Specifically, parallel programming (multithreaded or otherwise) requires a very good understanding of what it means when several streams of instructions are being executed 'at the same time', most prominently what it means when several threads require access to a limited resource. A lot of what used to be part of 'the basics' has become obsolete, however. This is specifically true for stuff like the number of CPU cycles used for particular operations: 30 years ago, a multiplication was much more costly than an addition, so it was important to know which kinds of instructions would result in multiplication ops in the assembler code. For instance, it used to be more effective to use pointers instead of index values to iterate over an array, because it was less costly to increment and dereference a pointer once per iteration than to dereference the pointer to the start of the array and add an offset that is a multiple of the index value (the multiple being the data size). But today's CPUs calculate integer multiplications just about as fast as additions, and compilers are also intelligent enough to optimize such code by themselves where necessary. So these considerations are now (mostly) a thing of the past.
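The shared-variable point can be made concrete with a short sketch (Python's threading module; the thread and iteration counts are arbitrary): four threads incrementing one global counter, where only the lock makes the result deterministic:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    # "counter += 1" is a read-modify-write: without the lock, two threads
    # can read the same old value and one update gets lost. The lock forces
    # each read-modify-write to complete as a unit.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 - deterministic only because of the lock
```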
-
Mozim wrote:
I've seen people who did not understand tables
Well, a table is where your monitor stands on, no? ;P
-
If you suck, keep on sucking until you succeed ;)
-
I think you're missing the story being told. This is a basic illustration of efficient flow control. This, but a single illustration, is representative of code that is (depending upon the optimizer) less efficient and harder to understand and maintain. If a number of items are all dependent upon the same state of the same member, that question should only be asked once, unless compelling reasons exist for the unnecessary separation. Indeed, scenarios could exist where the initial two statements should be kept separate, such as potential side effects of the test: but even here, there's an error lying in wait: compilers often do not guarantee the order in which the conditions within the parentheses will be executed... unless forced to do so by nesting. If code existed between these two statements, that could change everything: but not as illustrated.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"As far as we know, our computer has never had an undetected error." - Weisert
"If you are searching for perfection in others, then you seek disappointment. If you are searching for perfection in yourself, then you seek failure." - Balboos HaGadol Mar 2010
-
Fear of old code is a big problem too, but it usually requires more than the basics to fix.
-
Yep, meant arrays... X|
-
Balboos wrote:
I think you're missing the story being told.
Obviously you are missing the point. Beginners are in fact beginners. If they were not beginners then they would not be beginners. If they knew as much or more than senior programmers then there would be something wrong with the world.
Balboos wrote:
This is a basic illustration of efficient flow control. This, but a single illustration, is representative of code that is (depending upon the optimizer) less efficient and harder to understand and maintain. If a number of items are all dependent upon the same state of the same member, that question should only be asked once, unless compelling reasons exist for the unnecessary separation.
After 40 years of programming in multiple languages I understand the example. With that experience I also understand that there is almost no chance that it will make a difference in real code.
Balboos wrote:
Indeed, scenarios could exist where the initial two statements should be kept separate, such as potential side-effects of the test: but even here, there's an error lying in wait: compilers often do not guarantee the order in which the conditions within the parenthesis will be executed . . . unless forced to do so by nesting. If code existed between these two statements, that could change everything: but not as illustrated.
Which has nothing to do with my point. Provide an argument where 90% of the applications in the world would be measurably affected if one choice was always used. Conversely, provide an argument that demonstrates that if beginners were to know the fundamentals of design and profilers, it would not be a measurable advantage.