Code Project
Are there reasons for beginner programmers to learn C ?

The Lounge
Tags: learning, c++, oop, performance
100 Posts, 44 Posters
  • A Alan Balkany

    Yes:

    1. C forms the basis for many other languages, e.g. C++, C#, and Java.
    2. C is good for learning core, low-level programming skills. Once those are learned, you can approach object-oriented programming without being distracted by low-level issues.
    3. C is good for learning function-oriented programming, which complements object-oriented programming. (People who only know object-oriented programming create classes with unmaintainable 300-line methods.)
    4. C can be used as a kind of "assembly language", i.e. for low-level modules that have to be maximally efficient, without the overhead of C++.
    5. C is a small, simple language that won't overwhelm a beginner with the complexity of C++.
    6. The time spent studying C isn't wasted, since C++ is very nearly a superset of C.
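    Point 3 above can be made concrete with a short C sketch (a hypothetical example with invented names, not from the thread): function-oriented style breaks work into small, single-purpose functions that compose, instead of one long method.

    ```c
    #include <assert.h>
    #include <stddef.h>

    /* Function-oriented decomposition: each function does one thing and
     * composes with the others, instead of one 300-line routine. */
    static int sum(const int *a, size_t n) {
        int total = 0;
        for (size_t i = 0; i < n; i++)
            total += a[i];
        return total;
    }

    static int mean(const int *a, size_t n) {
        return n ? sum(a, n) / (int)n : 0;  /* reuses sum() */
    }

    int main(void) {
        int data[] = {2, 4, 6, 8};
        assert(sum(data, 4) == 20);
        assert(mean(data, 4) == 5);
        return 0;
    }
    ```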

    "Microsoft -- Adding unnecessary complexity to your work since 1987!"

    Stefan_Lang wrote:
    #69

    I agree with all but item 3 in your list; my experience is quite the contrary of what you claim: it used to be the old C-style programmers who produced the unmaintainable 300-line functions, not the OO programmers, who normally recognized repetitive parts and refactored them into separate functions. And it's the same C programmers who used copy/paste programming to implement an alternate branch in a function, needlessly bloating it to several thousand lines over the lifetime of an application. In truth, though, this may be less connected to being a C or C++ programmer than to being a good programmer! ;)

    • M Maximilien

      Are there reasons for beginner programmers to be taught C instead of C++? I'm not even thinking about object-oriented programming, just simple procedural programming. I read a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe. A lot of those issues could be handled by simple C++ features (memory management (new/delete, smart pointers), strings, collections, references, ...). I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education. :confused:

      Watched code never compiles.

      svella wrote:
      #70

      For a beginner, if the choice were between C and C++, I'd definitely start with C. Why? Because C++ has too many voodoo automatic behaviors that bite even experienced C++ developers in the ass. C, by comparison, is a language whose complete behavior is relatively straightforward to understand. But I wouldn't pick either one as a first language. My first language was APL, but I wouldn't recommend that either. I'd probably start with one of the current crop of dynamic languages. -Shon

      • M Maximilien

        Are there reasons for beginner programmers to be taught C instead of C++? I'm not even thinking about object-oriented programming, just simple procedural programming. I read a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe. A lot of those issues could be handled by simple C++ features (memory management (new/delete, smart pointers), strings, collections, references, ...). I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education. :confused:

        Watched code never compiles.

        patbob wrote:
        #71

        Maximilien wrote:

        I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education.

        Emphasis? Certainly not. However, all of software is made up of layers upon layers of leaky abstractions. Sooner or later, quirks from the lower levels leak through to the upper levels. If you already have a working understanding of the lower level, then what's happening makes sense and you already have an idea of what to do about it. If not, you sit there dumbfounded, without a clue where to start debugging the problem. You will absolutely need to know how the next layer down works to deal with bugs that appear in the layer you're working at.

        Maximilien wrote:

        Are there reasons for beginner programmers to be taught C instead of C++?

        I can't think of any. In fact, I wouldn't recommend either language for beginner programmers. Beginner programmers are learning basic concepts: if statements, boolean logic, loops, variables, functions, etc. Any language that allows them to learn those concepts without having to worry about other details is the best one to start with. In fact, it shouldn't be a language they might use for real work, because they'll also be learning lots of bad habits that the rest of us don't want them bringing into our production code. Sometimes a student has to start learning with the language they'll be using for production code, but that just makes it harder for them, because they'll have to unlearn a lot of bad habits without really understanding what's going on, why those habits are bad, and what to replace them with. It's actually easier to take concepts forward into another language, learning good habits as you learn the language, than to unlearn bad habits.

        We can program with only 1's, but if all you've got are zeros, you've got nothing.

        • P PIEBALDconsult

          Not as a first language. Nor should an OOP-only language (VB, C#, etc.) be the first language. In my opinion BASIC and Pascal (and maybe Perl?) are still good first languages, even though they won't apply very well to modern business. Professional developers still need to be smacked with C.

          spotsknight wrote:
          #72

          I had some initial introduction to BASIC in high school - way back in trash-80 (TRS-80) days. But the first and, I feel, MOST beneficial college class I EVER took was programming logic with pseudo-code and flow charting. After that my first language was C, then later C++. If I had tried to learn C++ first, I'm not sure I would have understood what was actually happening. When I'm writing code I want to make sure I understand exactly what is happening. There is a lot in C++ that was much easier to learn knowing basic C.

          • M Maximilien

            Are there reasons for beginner programmers to be taught C instead of C++? I'm not even thinking about object-oriented programming, just simple procedural programming. I read a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe. A lot of those issues could be handled by simple C++ features (memory management (new/delete, smart pointers), strings, collections, references, ...). I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education. :confused:

            Watched code never compiles.

            Jonas Hammarberg wrote:
            #73

            Depends on the type of programmer ... I would say that letting a Java programmer loose with C++ is a recipe for disasters to come ... If you're hardware/OS-agnostic, you might get away with not learning it, but I wouldn't bet on it.

            • R Rob Ford 2

              C should not be inflicted upon anyone. It is clumsy, slow to code and inelegant, and is a source of the World's most inefficient and buggy programs. It is only due to inertia and legacy that it still survives, mostly within failing companies.

              Oshtri Deka wrote:
              #74

              Rob Ford 2 wrote:

              It is clumsy, slow to code and inelegant, and is a source of the World's most inefficient and buggy programs. It is only due to inertia and legacy that it still survives, mostly within failing companies.

              Bold statement, but what arguments do you have to support it?

              • M Maximilien

                Are there reasons for beginner programmers to be taught C instead of C++? I'm not even thinking about object-oriented programming, just simple procedural programming. I read a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe. A lot of those issues could be handled by simple C++ features (memory management (new/delete, smart pointers), strings, collections, references, ...). I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education. :confused:

                Watched code never compiles.

                Trajan McGill wrote:
                #75

                Yes, absolutely and unequivocally yes (except for the "instead of C++" part). The point is not to emphasize old ways. The reason there are so many cringe-worthy issues with C programming is that so many people never properly learned how to do it. C is practically the ideal first real language for an introductory programming course. Why? Because everything that you learn to do on your own in C is still being done in newer languages, just behind the scenes. (Are there pointers in C# and Java? Absolutely; you just don't handle them as straightforwardly, and the difference between object types and primitive types will be pure mystery to a beginner... unless that beginner has first understood pointers.)

                You don't learn mathematics by starting with higher-order abstractions and using calculators to do the mere arithmetic, then only later going and learning to add and subtract if you ever get a job that requires it. You learn what numbers mean, how they move and relate to one another at a lower level, and how little things build into bigger ones. You count, then you add and subtract, then you multiply and divide, and so on.

                C, unlike assembly, is at about the right distance from the machine: you don't have to think directly about hardware in most cases, but you do need to think about the basic issues involved in understanding programming concepts. What does a computer do? How does it do it? How are data remembered and acted upon? Why does it make a difference which way you do something? How does a programmer look out for edge cases, efficiency issues, and all the pitfalls that arise from a computer doing exactly what you tell it?

                C teaches precision, algorithm design and choice, memory efficiency, and other basic things a beginner won't know (like how to translate a notion of what you want to do into a precisely specified set of instructions), all in a pretty simple package with straightforward syntax, and with the ability to split pieces of your program out into separate routines for organizational and abstraction purposes. Object orientation is a mode of programming that builds upon everything learned in C; it not only becomes a lot to swallow at once, it also bears so much less of a straightforward relationship to what the computer is doing that it should be learned afterward. Garbage-collected languages, or those with massive built-in libraries, are the same way: "Look, now that you understand what is going on in managing memory, here is a tool that, when appropriate, can do it for you."
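                The point about pointers hiding behind Java/C# reference types can be made concrete with a minimal C sketch (a hypothetical illustration, not from the thread): C simply makes explicit the indirection that reference types perform behind the scenes.

                ```c
                #include <stdio.h>

                /* In Java or C#, object variables are references (pointers)
                 * under the hood; C makes the indirection explicit. */
                void increment(int *p) {
                    *p += 1;            /* follow the pointer, change the pointee */
                }

                int main(void) {
                    int x = 41;
                    int *px = &x;       /* px holds the address of x */

                    increment(px);      /* pass the address: callee can modify x */
                    printf("%d\n", x);  /* prints 42 */

                    return 0;
                }
                ```

                Once a beginner has seen this, "pass by reference" and the object/primitive split in managed languages stop being mysteries.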

                • M Maximilien

                  Well, I don't care (one example of many) how std::string internally manages the string; I just want to write std::string s("hello world");. It is safe, it is efficient.

                  Watched code never compiles.

                  Bram van Kampen wrote:
                  #76

                  Well,

                  Maximilien wrote:

                  std::string s("hello world");.

                  I'm using VC5. I never missed that construct, and I don't see the efficiency improvement over what I would write:

                  CString s("Hello World");
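                  Efficiency aside, what both std::string and CString buy you is freedom from C's manual bookkeeping. A minimal C sketch of roughly what either class does behind the scenes (a hypothetical illustration with an invented helper name):

                  ```c
                  #include <assert.h>
                  #include <stdlib.h>
                  #include <string.h>

                  /* Roughly what std::string s("hello world") hides: measure,
                   * allocate, copy, and the duty to free. Forget any step and
                   * you get the leaks and overruns the thread complains about. */
                  char *make_str(const char *src) {
                      size_t len = strlen(src);
                      char *s = malloc(len + 1);   /* +1 for the terminating NUL */
                      if (s != NULL)
                          memcpy(s, src, len + 1);
                      return s;                    /* caller must free() */
                  }

                  int main(void) {
                      char *s = make_str("hello world");
                      assert(s != NULL && strcmp(s, "hello world") == 0);
                      free(s);   /* manual cleanup: the destructor's job in C++ */
                      return 0;
                  }
                  ```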

                  Bram van Kampen

                  • R Rob Ford 2

                    C should not be inflicted upon anyone. It is clumsy, slow to code and inelegant, and is a source of the World's most inefficient and buggy programs. It is only due to inertia and legacy that it still survives, mostly within failing companies.

                    Bram van Kampen wrote:
                    #77

                    Rob Ford 2 wrote:

                    C should not be inflicted upon anyone.
                    It is clumsy, slow to code and inelegant, and is a source of the World's most inefficient and buggy programs. It is only due to inertia and legacy that it still survives, mostly within failing companies.

                    Well, this is an incredibly shortsighted view of things. The new .NET-type interface is probably very efficient for writing web pages. Some of us write device drivers. This is a job to be done every time a new device is put on the market. Challenge: try to write a basic device driver in managed code! Come to it, 99% of the OS (e.g. Windows 7 or 8) is written in C or C++; the remainder is written in assembler. What makes your managed code run is ultimately the effort of C and C++ programmers. They have been extremely successful. So successful, in fact, that people like yourself who use those advanced interfaces end up thinking that C++ is obsolete. To put the facts straight: when you write a piece of .NET code, your result is syntax checked and, if it passes, compiled into 'intermediate code'. The compiler that performs this feat was written in C and C++. (It would be impossible to write this sort of compiler in a .NET language.) Never mind .NET languages, the compilers for them are still written in C and C++. The run time (that's when you see your code on a screen somewhere) equally rests on OS system code originally written in C++. In short: the crap you write in managed code could never be displayed if others had not written code in C or C++ to actually display it!

                    Bram van Kampen

                    • J Joe Woodbury

                      Absolutely not! Code should do the best it can to solve the problem (for your customer.) I'm having this very argument now over writing generic code that can be ported to Linux vs. sharing that code which you can, but tailoring the core to each platform. (The current version is more generic and simply doesn't scale well.) I'm also tired of using programs that suck on every platform, all in the name of being cross-platform.

                      Bram van Kampen wrote:
                      #78

                      Hi, those things sometimes happen! C and C++ are closest to the hardware; managed code is closest to the overall concept. At the same time, managed code requires hard-wired code to run.

                      Bram van Kampen

                      • S spotsknight

                        I had some initial introduction to BASIC in high school - way back in trash-80 (TRS-80) days. But the first and, I feel, MOST beneficial college class I EVER took was programming logic with pseudo-code and flow charting. After that my first language was C, then later C++. If I had tried to learn C++ first, I'm not sure I would have understood what was actually happening. When I'm writing code I want to make sure I understand exactly what is happening. There is a lot in C++ that was much easier to learn knowing basic C.

                        PIEBALDconsult wrote:
                        #79

                        spotsknight wrote:

                        If I had tried to learn C++ first I'm not sure I would have understood what was actually happening.

                        Exactly, it needs to be learned in layers. Students should have a firm grasp of the fundamentals before advancing to OOP and such.

                        • R Russell Ranshaw

                          It's interesting that you mention "assembly". The very history of C as a programming language speaks loudly. Long ago and long ago (as the Native Americans would say), Bell Laboratories purchased a DEC PDP-7 computer. That machine was quite primitive, more or less a "minimal" computer. It had either 4K or 8K of 18-bit words, 16 op-codes, no multiply or divide instructions, no index registers, and primitive indirect addressing. It did have a set of memory locations that were "auto-increment" and simulated very primitive index registers, but they were not terribly useful overall. The principal I/O devices were paper tape and a Model 33 teletype. In fact, a bare-bones PDP-7 only had a teletype, in which case it would have been a Model 35 ASR, which had paper tape read and write included. VERY slow! Bell Labs wrote a language called BPL (for Bell Programming Language, I think) which they used as an alternative to the assembly language supplied by DEC. I've never seen any details on BPL. However, Bell Labs used BPL to write a FORTRAN compiler for the PDP-7, as odd as that might sound.

                          When DEC came out with the PDP-11, Bell Labs bought one and jumped on it like a duck on a June bug! They wrote a translator that converted the BPL translator (and probably other programs as well) to run on the PDP-11. Using the translated BPL, they developed a new language, C. From rags to riches in terms of machine-language capability, the C designers included features in the language to utilize many of the newly available features of the PDP-11. In particular, the auto-increment/decrement and to-memory instruction modifiers were incorporated in the ++/--/+=/-= operators. The indirection modifiers gave rise to the pointer operators. The above information I learned from a Bell Labs programmer/developer at a DECUS (Digital Equipment Computer Users' Society) meeting. He was one of the original creators of C, but unfortunately I forget his name.

                          He told me that when they developed C, they had in mind a "portable assembler" that would allow them to port code to any architecture by merely writing a translator for C for that new machine. Good C programmers, he said, visualized assembly code as they wrote in C. For anyone interested, here is a link to the PDP-11 "card": [^]
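                          That PDP-11 heritage still shows in idiomatic C. The classic K&R-style string copy below leans entirely on the post-increment operators that mapped onto the PDP-11's auto-increment addressing (a minimal sketch with an invented function name):

                          ```c
                          #include <assert.h>

                          /* Copy the NUL-terminated string at src into dst.
                           * Each *d++ = *s++ is one load, one store, and two
                           * pointer bumps - on a PDP-11, essentially a single
                           * auto-increment MOVB instruction. */
                          void copy_str(char *dst, const char *src) {
                              char *d = dst;
                              const char *s = src;
                              while ((*d++ = *s++) != '\0')
                                  ;   /* empty body: the copy happens in the condition */
                          }

                          int main(void) {
                              char buf[16];
                              copy_str(buf, "PDP-11");
                              assert(buf[0] == 'P' && buf[6] == '\0');
                              return 0;
                          }
                          ```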

                          Sanmayce wrote:
                          #80

                          Hi Mr. Ranshaw, very informative, thanks. "He told me that when they developed C, they had in mind a "portable assembler" that would allow them to port code to any architecture by merely writing a translator for C for that new machine. Good C programmers, he said, visualized assembly code as they wrote in C." Yes, yes, but they failed [well, partially they succeeded] to achieve this nifty goal.

                          Get down get down get down get it on show love and give it up What are you waiting on?

                          • M Maximilien

                            Are there reasons for beginner programmers to be taught C instead of C++? I'm not even thinking about object-oriented programming, just simple procedural programming. I read a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe. A lot of those issues could be handled by simple C++ features (memory management (new/delete, smart pointers), strings, collections, references, ...). I know there is a lot of legacy code out there and it still has to be maintained, but old code "ways" should not be the emphasis of education. :confused:

                            Watched code never compiles.

                            Sanmayce wrote:
                            #81

                            Hi Maximilien,

                            >I'm reading a lot of questions on CodeProject and on StackOverflow where people ask about issues with C language features that are so prone to errors and defects that it makes me cringe.

                            Yes, this is the bad side of going down to hell; in order to forge a high-quality blade you need hellish heat, you know.

                            >Are there reasons for beginner programmers to be taught C instead of C++?

                            Pretty simple: when you treat a child as such, you underestimate his/her potential by imposing your limits. If the beginner is afraid to enter deep waters (that is, to evolve), it is better not to deal with C at all.

                            >I know there is a lot of legacy code out there and it should still be maintained, but old code "ways" should not be the emphasis of the education.

                            What about the old mantra 'data plus algorithms equals programs'? In my view it is absolutely mandatory to know the basics of algorithms; the programming languages come as a natural 'NEXT STEP'.

                            Get down get down get down get it on show love and give it up What are you waiting on?

                            • S Sanmayce

                              Hi Mr. Ranshaw, very informative thanks. "He told me that when they developed C, they had in mind a "portable assembler" that would allow them to port code to any architecture by merely writing a translator for C for that new machine. Good C programmers, he said, visualized assembly code as they wrote in C." Yes, yes but they failed [well partially they succeeded] to achieve this nifty goal.

                              Get down get down get down get it on show love and give it up What are you waiting on?

                              Russell Ranshaw wrote:
                              #82

                              "... but they failed [well partially they succeeded] to achieve this nifty goal." In what way did they fail to achieve that goal? More of the history is that Bell Labs developed an operating system for the PDP-7 (for what there was none previously). After they got C running on the PDP-11, they used C to write a translator from BPL (ie, their compiler for the PDP-7). Using that BPL -> C translator, they ported their operating system onto the PDP-11. Of course they had to write assembly code to handle the various low-level drivers. It was that port of their PDP-7 operating system that grew into UNIX(TM). By the way, "UNIX" means "UNIversal eXecutive" according to the Bell Labs guy I talked to. Also witness the various ports of UNIX to a plethora of platforms, all using (as far as I know) some manifestation of C. For example, LINUX and BSD.

                              • R Russell Ranshaw

                                "... but they failed [well partially they succeeded] to achieve this nifty goal." In what way did they fail to achieve that goal? More of the history is that Bell Labs developed an operating system for the PDP-7 (for what there was none previously). After they got C running on the PDP-11, they used C to write a translator from BPL (ie, their compiler for the PDP-7). Using that BPL -> C translator, they ported their operating system onto the PDP-11. Of course they had to write assembly code to handle the various low-level drivers. It was that port of their PDP-7 operating system that grew into UNIX(TM). By the way, "UNIX" means "UNIversal eXecutive" according to the Bell Labs guy I talked to. Also witness the various ports of UNIX to a plethora of platforms, all using (as far as I know) some manifestation of C. For example, LINUX and BSD.

                                Sanmayce wrote:
                                #83

                                >In what way did they fail to achieve that goal?

                                I meant the present time; you are into the genesis, whereas I am a simple nowadays C user. Assembly is the core and C mimics it. For example, recently (see the Fastest strstr-like Function in C!? article) I defined a variable as a register, and guess what: despite the desperate need for this, C had other agenda. I mean, C is good; assembly is best.

                                >By the way, "UNIX" means "UNIversal eXecutive"

                                It makes sense; it is strange how such basic/important names are unknown.

                                Get down get down get down get it on show love and give it up What are you waiting on?

                                • S Sanmayce

                                  >In what way did they fail to achieve that goal?

                                  I meant the present time; you are into the genesis, whereas I am a simple nowadays C user. Assembly is the core and C mimics it. For example, recently (see the Fastest strstr-like Function in C!? article) I defined a variable as a register, and guess what: despite the desperate need for this, C had other agenda. I mean, C is good; assembly is best.

                                  >By the way, "UNIX" means "UNIversal eXecutive"

                                  It makes sense; it is strange how such basic/important names are unknown.

                                  Get down get down get down get it on show love and give it up What are you waiting on?

                                  Russell Ranshaw wrote:
                                  #84

                                  "I defined a variable as a register and guess what despite the desperate need of this C has had other agenda - I mean C is good Assembly is best." Well, ideally C ought to be kept distinct from processor specific features. Unfortunately, as I mentioned in my first post, such have been part and parcel of the language since its original specification. Or soon after. In fact, I for one would love to see the ++x/--x et al removed from the language, as they ARE based on the capabilities of the PDP-11. That aside, I have ported C to the 6502, 6800, and even the PDP-10 processors, retaining various portions of the language as were feasible on a particular CPU. They worked very well. Yes, I had to rely on assembly code for the "down and dirty" things. But not all CPUs have addressable registers, which would make such a capability in C a tad awkward, to say the least. To quote the Bible (sort of): "Render unto the assembler that which is the assembler's."

                                  • R Russell Ranshaw

                                    "I defined a variable as a register and guess what despite the desperate need of this C has had other agenda - I mean C is good Assembly is best." Well, ideally C ought to be kept distinct from processor specific features. Unfortunately, as I mentioned in my first post, such have been part and parcel of the language since its original specification. Or soon after. In fact, I for one would love to see the ++x/--x et al removed from the language, as they ARE based on the capabilities of the PDP-11. That aside, I have ported C to the 6502, 6800, and even the PDP-10 processors, retaining various portions of the language as were feasible on a particular CPU. They worked very well. Yes, I had to rely on assembly code for the "down and dirty" things. But not all CPUs have addressable registers, which would make such a capability in C a tad awkward, to say the least. To quote the Bible (sort of): "Render unto the assembler that which is the assembler's."

                                    Sanmayce wrote:
                                    #85

                                    Having "flying hours" in Assembly is a precious thing - it marks one's way of thinking for life with a strong base and steady sight on all kind of nasty problems. I see your wish for an untainted language, however I am in despair dealing STILL with such basic things as basic memory management (hash, memmem, b-tree) functions. I firmly believe that building programs on rotten ground (slow basic functions) is out-of-style boring and artless dead-end. Call me out-of-date and delusional but I still have strong romantic affinity towards artistic approach in programming - I hate this doomed situation in which we all are trapped now - fast PCs and slow software not exploiting the might given by the technology - I am sure you feel that even better than me having dealt with assemblers. In a few words: it is a crime not to utilize fully the potential of a given processor - this shows sloppy attitude not only toward programming but also toward everything else. I would like to hear from you how do you see one particular task properly get done: How to rip (down to unique phrases of some order) the whole electronic English language with a C console program (or rather etude). I am talking about my pride-and-joy Leprechaun - the fastest written in C x-gram ripper on the Internet. My point is that out there exist a lot of programming languages but when comes to the most basic things as helping people with natural languages sidekick/statistical/suggestion tools programmers are in debt to users.

                                    Get down get down get down get it on show love and give it up What are you waiting on?

                                    R 1 Reply Last reply
                                    0
                                    • S Sanmayce


                                      R Offline
                                      Russell Ranshaw
                                      wrote on last edited by
                                      #86

Much of my professional software design/implementation career was filled with the necessity to milk the last smidgen of performance out of some code. The environment was based on DEC PDP-10 processors (KL series) in a commercial time-sharing setting (CompuServe). For the most part, we used a language called BLISS, which was truly a magnificent implementation language. Its optimizer produced code that was probably 99% as good as I could do by "hand". This, coupled with something I've not seen elsewhere in a programming language, lexical processing, was the foundation of a great deal of CompuServe's CIS software. One of my areas involved the error-correcting protocols used for file upload/download. The most used protocol, B Plus, was the cornerstone of what was called HMI, Host Micro Interface. This protocol performed so well that it earned me the "honor" of frequent verbal battles with the OS developers, because a B Plus data transfer would hit the "sweet spot" in performance. This means that by the time a packet was finished being transmitted, the acknowledgement of the previous packet would already be waiting, and several packets could usually be sent in one time-slice. Fun stuff! Natural language processing is something I often pondered, with no real breakthroughs. I always believed that a different kind of memory access was needed, something akin to the current CPU chips (Pentium) with their cache memory. The cache is a marvel of design. If that technology were extended to what I term "content addressable memory", a word would fetch its data by using its actual numeric letter values as the "address" to be found. But this is only a beginning of the processing utilized by the human brain. I think that neural network programming might open the door to fast cognitive processing, but we're a loooong way from anything practical along these lines.
Way back, the trio of Newell, Simon and Shaw were among the first to contemplate what they termed "Information Processing". One of them said, "The problem with trying to teach a computer to understand natural language is that so few of we humans understand it to begin with." After fifty-some years, the situation really hasn't progressed all that much.

                                      S 1 Reply Last reply
                                      0
                                      • R Russell Ranshaw


                                        S Offline
                                        Sanmayce
                                        wrote on last edited by
                                        #87

Thanks; despite the difference between us (me being an amateur), I share your vision. The powerful (graph) algorithms implemented by some real programmers are what fascinate me. I have held this romantic expectation for so long. Sadly, my knowledge of graphs is next to nothing, which is why I wonder how my brute-force, i.e. dummy, approach would handle (by my calculations) some 5 billion four-word phrases, i.e. 4-grams. >After fifty some years, the situation really hasn't progressed all that much. Fully agree; a change (at least some small but firm steps) is needed, otherwise we all look like a monkey playing with a PlayStation. Very glad to learn from you, best regards.

                                        Get down get down get down get it on show love and give it up What are you waiting on?

                                        R 1 Reply Last reply
                                        0
                                        • S Sanmayce


                                          R Offline
                                          Russell Ranshaw
                                          wrote on last edited by
                                          #88

Well, we are a very long way from the HAL computer of "2001", that's for sure. There are several components to the problem of computer "intelligence". The first problem is one of storage capacity. The human brain is made up of about one trillion neurons. Each neuron is linked with from 10 to 10,000 other neurons. The number of pathways is in the vicinity of 10 to the 100th power. That's a one followed by 100 zeros, a mighty huge number! It exceeds the estimated number of particles in the entire universe! The brain stores everything in this massive network of neurons in the form of nerve impulses that traverse one or more of these pathways. On top of that, the individual circulating thoughts (or memories, or whatever) are also connected by other pathways of circulating nerve impulses. In computer terms, the brain is an organic associative memory. Add to the above the existence of emotions, images, sounds, smells, and all of the other senses, all of which are remembered in the same way. Thus, when you hear the word "rose" your thoughts instantly conjure up an image of a rose, its smell, the fact that the plant has thorns, a memory of the time you gave your mother a rose and she hugged you, perhaps of a girl to whom you gave a rose and she kissed you. All of this happens in a flash, filling your consciousness with great feelings. So the question is, how do we accomplish this with "artificial intelligence"? It beats me! Animals are no different. They possess memories and emotions, even love, strange as that might seem. When I was in high school, I kept an aquarium. One of my favorite fish was a male beta "fighting" fish. (The only thing it will fight with is another male beta.) We used to catch flies and toss them into the tank, whereupon the beta would swim over and gulp them down. During the winter, small amounts of ground meat replaced the flies.
After a while of doing this, the beta would swim over to the side of the tank whenever I approached. He would accept dead flies or ground meat from my fingers, and even let me gently stroke his sides. But only for me. My parents would not receive the same acceptance. I believe that silly fish loved me! I think that the key lies in what we call experience. As we mature, we undergo the slow implanting of our conglomerate memories and associations. In order to produce a HAL, it would have to undergo a "growing up" process. And there are so many variables that it staggers the mind, and surely would tax the brains of several billions of pro

                                          S 1 Reply Last reply
                                          0