What are 'the basics'?
-
I've known many people who are good at Computer Science but suck at programming. And I don't think that's very strange; there is a lot more to real-life programming than some big O or big theta.
Yup, I too have known (and still know) such people. However, I have also seen (at least in my sample) that a CS guy stops sucking at coding a lot faster than a coder learns to do even something as simple as analysing the complexity of his code.
...byte till it megahertz... my donation to web rubbish
-
I was looking over the newsletter this morning and caught this posting: New Generation does not realy understand computers[^], which has left me wondering what exactly the basics are. I understand the concepts behind computers from basic circuits and bit math up through general language concepts and framework use, almost entirely from my education. I can take a given concept and implement it in at least four languages off the top of my head, not counting C# and VB.NET as separate. But I will admit file parsing was passed right over in my program (Software Engineering rather than Computer Science). So what exactly are the basics of computer science? I figure having a clue here may help those of us who are of that generation avoid things like this in the future.
Knowing "the basics" allows you to step comfortably outside the framework-de-jours, and get the job done. Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Knowing "the basics" allows you to think outside the box and approach a given problem from multiple (and wildly different) directions. What are the basics? If you ask 1000 programmers, you'll get eight different answers (sorry, that's a computer joke), depending on their level of experience and when they started coding.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001 -
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001John Simmons / outlaw programmer wrote:
If you ask 1000 programmers, you'll get eight different answers
Actually it is the other way around, when you ask eight programmers you get at least 1000 different answers. :)
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles] Nil Volentibus Arduum
Please use <PRE> tags for code snippets, they preserve indentation, and improve readability.
-
Knowing "the basics" allows you to step comfortably outside the framework-de-jours, and get the job done. Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Knowing "the basics" allows you to think outside the box and approach a given problem from multiple (and wildly different) directions. What are the basics? If you ask 1000 programmers, you'll get eight different answers (sorry, that's a computer joke), depending on their level of experience and when they started coding.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001John Simmons / outlaw programmer wrote:
If you ask 1000 programmers, you'll get eight different answers (sorry, that's a computer joke)
Nice... And anyone who didn't immediately "get" that, doesn't know the basics :)
Proud to have finally moved to the A-Ark. Which one are you in?
Author of the Guardians Saga (Sci-Fi/Fantasy novels)
-
Knowing "the basics" allows you to step comfortably outside the framework-de-jours, and get the job done. Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Knowing "the basics" allows you to think outside the box and approach a given problem from multiple (and wildly different) directions. What are the basics? If you ask 1000 programmers, you'll get eight different answers (sorry, that's a computer joke), depending on their level of experience and when they started coding.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001John Simmons / outlaw programmer wrote:
sorry, that's a computer joke
and if you don't get it, you don't know the basics ? ;)
Watched code never compiles.
-
Luc Pattyn wrote:
Actually it is the other way around, when you ask eight programmers you get at least 1000 different answers. :)
Right. I actually typed it that way to see if anyone would catch it. :)
-
Knowing "the basics" allows you to step comfortably outside the framework-de-jours, and get the job done. Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Knowing "the basics" allows you to think outside the box and approach a given problem from multiple (and wildly different) directions. What are the basics? If you ask 1000 programmers, you'll get eight different answers (sorry, that's a computer joke), depending on their level of experience and when they started coding.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Exactly, instead of waiting until it is 2 weeks before the release and deciding to run a performance validator which invalidates 6-12 months of development effort!
-
It comes down to realising that every instruction you ask the machine to execute has physical implications: memory allocated, CPU cycles used, power consumed, screen space needing to be redrawn. For me the basics are understanding how a computer actually does stuff, and asking the computer to do this in a sensible way by following sensible, tried-and-true patterns and using efficient algorithms. Tied closely to that is understanding the fundamentals of the framework and library you are using, so you can colour your judgement calls appropriately. After this there's the level of how you actually write code: architecting, testing, code cleanliness and maintainability. Basically: don't be selfish and write code only for yourself. Write it for others.
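For instance, here is a tiny C# sketch of that point (my own made-up example, and the exact numbers will vary by machine): the two loops below build the same list of a million ints, but the first one quietly boxes every int into its own heap object, so it allocates and burns cycles that never show up in the source.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;

class PhysicalCostDemo
{
    static void Main()
    {
        const int N = 1000000;

        // ArrayList boxes every int into a separate heap object - lots of hidden allocations.
        var sw = Stopwatch.StartNew();
        ArrayList boxed = new ArrayList();
        for (int i = 0; i < N; i++) boxed.Add(i);
        Console.WriteLine("ArrayList: " + sw.ElapsedMilliseconds + " ms");

        // Same logical work, but the ints stay unboxed in one contiguous buffer.
        sw.Restart();
        List<int> unboxed = new List<int>();
        for (int i = 0; i < N; i++) unboxed.Add(i);
        Console.WriteLine("List<int>: " + sw.ElapsedMilliseconds + " ms");
    }
}
Same program, same result, very different physical cost - which is exactly the sort of thing "the basics" let you see coming.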
cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP
-
Knowing "the basics" allows you to consider memory consumption, performance and maintainability as you write your code. Exactly, instead of waiting until it is 2 weeks before the release and deciding to run a performance validator which invalidates 6-12 months of development effort!
IMHO the best developers are a mix of:
- implementation knowledge - IDEs, frameworks, version control, etc.
- theory (the comp sci thing)
- technical writing and presentation skills, the ability to sell an idea
- interpersonal skills and ability to mentor
- the mindset that simple is better, and the methodology or language du jour is not always best
-
The definition of "basics" probably changes with time. A long time ago it might have been binary calculation and machine assembly, but now it has become very broad, depending on what area of computing you are in.
TOMZ_KV
Perhaps 'the basics' (be it Computer Science or Programming) begin with understanding the five basic functions of a CPU, which are:
1. Input/Output (read/write)
2. Program Control (branch, jump, compare)
3. Arithmetic (add, subtract - everything else stems from those two)
4. Data Transfer (load register / store register / move / etc)
5. Logical (boolean operations including bitwise functions like OR/XOR and shift)
Or perhaps understanding the four basic components of a desktop computer:
1. Input devices (mouse/keyboard/barcode scanner/etc)
2. Output devices (printer/monitor/etc)
3. CPU (includes GPU now)
4. Storage devices (disk/CD/DVD/thumb drives/etc)
Of course, you could throw in the things I had to learn in college, like Hollerith code, wiring boards in IBM accounting machines (an early form of "programming"), bios on Charles Babbage and Alan Turing, etc. But the VERY MOST basic of 'the basics' is this: totally understanding the friggin' problem you are trying to solve. Programming is not an end unto itself. And Computer Science is not a science devoted to its own sphere of existence.
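Just to make that concrete, here is a toy C# sketch (my own example, obviously several abstraction layers above a real CPU) that touches all five of those categories:
using System;

class FiveFunctions
{
    static void Main()
    {
        Console.Write("Enter a number: ");              // 1. Input/Output (read/write)
        int n = int.Parse(Console.ReadLine() ?? "0");   // 4. Data Transfer (load the value into a variable)

        int doubled = n + n;                            // 3. Arithmetic (add)
        bool isEven = (n & 1) == 0;                     // 5. Logical (bitwise AND, then compare)

        if (isEven)                                     // 2. Program Control (branch on the comparison)
            Console.WriteLine(n + " is even, doubled: " + doubled);
        else
            Console.WriteLine(n + " is odd, doubled: " + doubled);
    }
}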
-
The top of the list is data structures. Then basic algorithms. You should know the tradeoffs between choices of data structures and algorithms. Generally useful techniques: Hashing, state machines, design of a GUI that responds to keyboard and mouse events in at least one language. For math, geometry seems the most generally useful, followed by linear algebra and probability. Graph theory helps with abstract reasoning about connections (which is often useful) but isn't essential. If you're doing engineering/technical programming then calculus and numerical methods. Basic software engineering: Object-oriented programming (virtual functions, inheritance, & polymorphism), and function-oriented programming (the mark of a beginner is 200-300+ line functions). Hexadecimal, binary, ASCII, and Unicode. Boolean operations.
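To put one of those tradeoffs in code, here is a small C# sketch (made-up data, timings will vary by machine): the same membership test is O(n) against a List but roughly O(1) against a HashSet, and the gap widens as the collection grows.
using System;
using System.Collections.Generic;
using System.Diagnostics;

class LookupTradeoff
{
    static void Main()
    {
        const int N = 20000;
        var list = new List<int>();
        var set = new HashSet<int>();
        for (int i = 0; i < N; i++) { list.Add(i); set.Add(i); }

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < N; i++) list.Contains(i);   // linear scan every time: O(n) per lookup
        Console.WriteLine("List.Contains:    " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int i = 0; i < N; i++) set.Contains(i);    // hash lookup: roughly O(1) per lookup
        Console.WriteLine("HashSet.Contains: " + sw.ElapsedMilliseconds + " ms");
    }
}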
-
One basic: understand (roughly) how your code is laid out in memory at execution time. (Arrows and boxes on a whiteboard suffice.) E.g. I saw some code the other day that probably created 5-10,000 string objects when it could have used a single dynamic buffer with maybe a single re-alloc. The algorithm was simple to read, but would not scale well.
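Something like this, for example (a made-up but representative C# snippet, not the actual code I saw): building a big string by repeated concatenation creates a new string object on every pass, while a StringBuilder grows one buffer with the occasional re-alloc.
using System;
using System.Text;

class StringBuildingDemo
{
    static void Main()
    {
        string[] names = new string[5000];
        for (int i = 0; i < names.Length; i++) names[i] = "item" + i;

        // Repeated concatenation: every += allocates a brand new string object.
        string csvSlow = "";
        foreach (string name in names)
            csvSlow += name + ",";

        // One growable buffer with the occasional re-alloc - same result, far fewer objects.
        var sb = new StringBuilder();
        foreach (string name in names)
            sb.Append(name).Append(',');
        string csvFast = sb.ToString();

        Console.WriteLine(csvSlow.Length + " / " + csvFast.Length);
    }
}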
-
You nailed it. I read a lot of comments about this and this was the one I most agree with. An understanding of "this code will make the computer do..." whatever it does, is severely lacking in most beginners I see these days. The problem with that is, they don't write good code by instinct - they have to think about it. Here's an example... a real-life example from something I'm working on now...
if (TB_Grade.DataSource == null && !IsPostBack) {
    // Stuff to do...
}
if (DD_CreditType.DataSource == null && !IsPostBack) {
    // Stuff to do...
}
Anyone who knows the basics can see the problem with that. If you know the basics, and you have a little experience, you wouldn't even type that in the first place... you would type this.
if (!IsPostBack) {
    if (TB_Grade.DataSource == null) {
        // Stuff to do...
    }
    if (DD_CreditType.DataSource == null) {
        // Stuff to do...
    }
}
I find there are a lot of programmers coming out of college these days who don't know why the second one is better. They could explain it after a while, but this kind of thing should be instinct. Knowing how computers work, and the basics of what you're doing to the machine, will help you understand why checking things twice is going to be a problem. I see a lot of code lately which looks like it was written for infinitely powerful machines with infinite memory - because the programmers didn't understand resource usage. Now that we don't have to explicitly allocate memory and such, people just don't think about it - but it's still happening, and you need to understand how and why.
-
Distind wrote:
So what exactly are the basics of computer science?
I've been programming computers for nearly 40 years now, and if there's one thing I've learned about "the basics", it is that today's technologies are tomorrow's basics. So, today's basics are the things you learn so you can better understand today's technologies -- why they behave the way they do, why they were implemented the way they were, and why they fail the way they do. Pick a technology you use today, and ask yourself how it works inside and why it was implemented the way it was. If you don't know, you have some basics to fill in. There are so many technologies these days that there is no one list of "basics" that will help you understand any technology you are likely to use.
patbob
-
Jasmine2501 wrote:
An understanding of "this code will make the computer do..." whatever it does, is severely lacking in most beginners I see these days.
Exactly which year in the past was it when beginners did understand that?
Jasmine2501 wrote:
I find there are a lot of programmers coming out of college these days who don't know why the second one is better.
Where "better" means what exactly? Faster? Less cost to maintain? Paid by the curly bracket so a higher billing? Anything I can think of as far as better for that code snippet (with no other information) would make the actual utility so far beneath the noise level that one would need to delete the entire code base to move it up. Conversely I would much rather, for example, have a "beginner" demonstrate a basic understanding of a profiler and be able to create a small design document versus your code snippet.
-
It depends on what you are programming, what is demanded of you and how good you want to be. You may get by knowing nothing more than how to manipulate data with your language if you are making a business application or something duct-taped together for some rip-off company, but you will never be able to create, for example, a good driver, or code an OS, or anything that can serve as the infrastructure that databases, internet servers etc. need to exist. Knowing everything you listed and actually being able to use those things are two separate issues. If you are not patient, can't visualize, can't manipulate and mutate data, and can't remember crucial things when they are needed, it's not going to help you much. It is like saying: "I know all the mathematical operations, I know 12,000 digits of Pi and I can translate your problem into 4 (earth) languages, therefore I can solve any problem you throw at me."
The thing is, you either get it or you don't. Just because in the past there were hardcore people who actually made stuff that worked and made it easier for beginners, it does not mean that anyone can go and do hardcore. To do hardcore you have to be hardcore (learn and work until you fall asleep, and forget about drugs). Doing hardcore stuff does imply draconian coding/learning/debugging cycles. It does, however, also imply a job well done, and a job well done consists of great code, great documentation, and a simple, pleasant user interface with advanced options where needed but hidden. The user interface comes at the end of the development cycle. Give the user the application, implement his or her ideas as far as possible while introducing your own, and in fewer than 10 cycles of user testing and coding (for a small app) the user interface ends up all of the above. Of course, having an intelligent user who understands what the application is supposed to do (not how it's built) is essential, for obvious reasons - if you build a dog house and test it with fishes, no one is going to be happy except the worms, and maybe the grass too if you remembered to test outside and add water at the same time :)
I like being sober. It gives some kind of quality to life that I can't really put my finger on... it is like running life from console way less colors but so much more control.
modified on Tuesday, October 5, 2010 8:33 PM
-
Basics, hmm, there are lots of basics; it depends on what you are doing. We can go as deep as semiconductors and counting the electrons in them, but if you are doing web stuff like ASP.NET or PHP, why bother? You just have to find the right buttons to push. It is nice to know it all, but I think the older guys don't understand that people who know some frameworks worked really hard to know them inside and out. Learning a framework is like learning a whole new language, even if you already know the language it was written in. In my opinion the basics for programming are: control statements like loops and ifs, then structures like arrays, stacks and queues - with these one can achieve quite a lot, and these are the basic things. The absolute minimum is also knowledge of the types of variables and how to convert them. Everything else comes in time as one keeps programming, and I've seen people who did not understand arrays. Understanding what you are doing matters, but who cares about one if statement? One if does not take any significant CPU time these days - maybe if you are an embedded developer, yes, but on a full-blown computer one if makes no difference, so knowing that kind of thing is not what helps.
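For what it's worth, this is roughly the level I mean, sketched in C# (the values are made up): control statements, a stack and a queue, and converting between types of variables.
using System;
using System.Collections.Generic;

class BasicsDemo
{
    static void Main()
    {
        // Converting between types of variables: string -> int -> double -> string.
        string raw = "42";
        int number = int.Parse(raw);
        double half = number / 2.0;
        string printed = half.ToString();

        // Control statements plus two basic structures: a stack and a queue.
        var stack = new Stack<int>();
        var queue = new Queue<int>();
        for (int i = 0; i < 5; i++)
        {
            if (i % 2 == 0)
                stack.Push(i);      // last in, first out
            else
                queue.Enqueue(i);   // first in, first out
        }

        Console.WriteLine("Parsed " + printed + ", stack top " + stack.Peek() + ", queue front " + queue.Peek());
    }
}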
-
The single most valuable course I ever took in my Computer Systems Engineering program was... ahh... I forget the name, but it started out like this:
1. Get a Motorola 68HC12 dev board.
2. Program it with machine code, in hex, to blink an LED.
No assembly, no compilers, no libraries. The I/O on this microcontroller was memory mapped, so the program would simply write a bit high and low, but we had to look up the hex for each processor command and use the memory diagram to know where to write our program, where to write to I/O, and how to jump to functions. Later in the course we moved up to assembly and even C on the same platform. In my opinion, you don't need to do this for a programming career, but if you want to be a half-decent programmer, do this until you understand it.