Code Project › The Lounge

What I want.....

Tags: linux, help, question, learning
45 Posts, 23 Posters
Forogar wrote:

    OMG!! Pass the mind-bleach!

    - I would love to change the world, but they won’t give me the source code.

OriginalGriff (#11)

    OK - Mind Bleach[^]

    Sent from my Amstrad PC 1640 Never throw anything away, Griff Bad command or file name. Bad, bad command! Sit! Stay! Staaaay... AntiTwitter: @DalekDave is now a follower!

    "I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
    "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt

Forogar (#12), in reply to OriginalGriff

I knew what it was going to be and yet I still went there! You b&^$@$*&! ;P


OriginalGriff (#13), in reply to Forogar

        Took your mind off of Melissa Lynn tho' ... My work here is done. :-D


OriginalGriff wrote:

I don't. Some languages - like C# - are strongly typed for a reason: to catch errors early. Firstly, catching type conversions at compile time means that the code does exactly what you wanted, or it doesn't compile. Secondly, it makes you explicitly convert things like user input to the type you want, raising exceptions (or "failed" responses as appropriate) if the input doesn't match up. The global implicit typing you seem to prefer leads to errors, because the compiler has to "guess" what you wanted - and that means bad data gets into the system undetected. Until you've had to unpick a 100,000 row DB to fix dates entered as both dd-MM-yy and MM-dd-yy, you probably don't realise just how much of a PITA that is.

Then there is the "pointer problem": a pointer to an ASCII char is a pointer to a byte, but a pointer to a Unicode character is a pointer to a word. You can - by casting - explicitly convert a byte pointer to a word pointer, but that doesn't change the underlying data, and it means half the data accesses aren't going to work properly, because the byte pointer can be "half way up" a word value.

Strong typing (in C#, if not in C or C++) also eliminates your "=" vs "==" problem in most cases, because the result of an assignment is not a bool, so it can't be used in a conditional. You can set a C or C++ compiler to give warnings or errors when you type it wrong, but the code is valid because, being old languages, they have no native boolean type: any nonzero value is "true".

You want to get away from the complexity and want consistent declarations? Try C# ... (but hurry, it's getting complicated as well now).


Rick York (#14)

          I agree with everything you wrote except the last paragraph. I despise everything about dot nyet.

          "They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"

OriginalGriff (#15), in reply to Rick York

.NET is useful - it provides a huge library of classes that work in a consistent way, unlike the libraries you had to play with in C / C++, and it reduces the memory leak problems endemic to pointer-based code written by people who think they know what they are doing ... :laugh: It's a tool - and one that works across multiple platforms with a high degree of "similarity". Try that with a native C compiler and it becomes a struggle to get anything working in a short timeframe. Don't get me wrong, I miss my assembler days (or decades, more accurately) - but .NET is here to stay and I'm happy using it.


OriginalGriff wrote:

              ...All I Want[^]


Richard Deeming (#16)

              ...All That She Wants[^]


              "These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer

              "These people looked deep within my soul and assigned me a number based on the order in which I joined" - Homer

rjmoses wrote:

I do a lot of development, migrating, porting, etc., in different environments, different systems, different languages, different versions ... get the idea? I have specialized over the years in doing weird stuff. I just finished chasing a bug in some vintage commercial software that was written to accommodate big-endian and little-endian integers depending on the platform. It turned out that it applied big-endian logic in one place where little-endian was required. The whys and wherefores of doing this are not important. What is important is the line of thought I reached upon conclusion, i.e., what I want to see in software:

1) I want to be able to declare my variables of a type and then have the necessary conversions take place automatically. E.g., I want to declare "X" as a 2-byte integer and "S" as an ASCII (or utf8, utf16, whatever...) character string and be able to simply specify S = X AS "format" - not S = X.toString("fmt");, S = itoa(i);, or whatever language conversion functions are appropriate. I know what I have going in, I know what I want coming out - make it easy to get from A to B!!

2) I want my data declarations to be consistent across all platforms/languages/environments/...! Before some reader gets all hot and bothered about using C typedefs, etc., in declarations: I know this can be done. What I want is consistency - i.e., a "short" to be a 2-byte integer, a "long" to be a 4-byte integer ... get the drift? Part of the problem I was chasing had to do with the software expecting a C int to be 32 bits, but some 64-bit environments define an int as 64 bits.

3) I want my utilities and commands to operate the same way, with the same results, across all platforms. If I do a "myutil -x" command on Linux, I want to see EXACTLY the same output and results across Cygwin, Windows 10, Ubuntu, Debian, etc.

4) I want clear, simple, understandable, comprehensible programming constructs. I am tired of chasing errors such as where somebody fat-fingered "=" when "==" was meant, or where a brace was misplaced or omitted. I want to be able to look at a piece of code and understand what the author intended, easily and clearly.

5) I want clear, complete commercial documentation. I have seen thousands of circular definitions such as: returntype FunctionXYZ(int p1, int p2); "Returns XYZ of p1 and p2." BIG WHOOPING DEAL! I'm not an idiot - I can plainly see that from the function call. I often need to know HOW it uses p1 and p2 to arrive at XYZ. (Of course, by now,

Dr Walt Fair PE (#17)

                rjmoses wrote:

                I do a lot of development, migrating, porting, etc., in different environments, different systems, different languages, different versions,....get the idea? I have specialized over the years in doing weird stuff.

Good! I got a Ph.D. by asking questions about things that everyone else assumed. That seemed weird at the time, but I was right to question the assumptions - it turns out they were wrong. The weird problems are the most interesting.

                CQ de W5ALT

                Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software

Forogar wrote:

                  I concur - but I skip the cheese part.


Dr Walt Fair PE (#18)

                  Agreed, and also skip the burger part. What's wrong with a plain bacon sandwich?


Dr Walt Fair PE (#19), in reply to OriginalGriff

                    OriginalGriff wrote:

                    Some languages - like C# - are strongly typed for a reason: to catch errors early.

I found going from Object Pascal (Delphi) to C# was pretty easy, since they are both strongly typed. Of course, going from ALGOL to Pascal was also fairly easy, but going from FORTRAN to ALGOL was painful.


Daniel Pfeffer (#20), in reply to Dr Walt Fair PE

                      Never neglect the "trivial" roots of an equation. :)

                      Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

Marc Clifton (#21), in reply to Rick York

                        Rick York wrote:

                        I despise everything about dot nyet.

I'm curious as to why - particularly since I have quite the opposite reaction. :)

                        Latest Article - Azure Function - Compute Pi Stress Test Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny Artificial intelligence is the only remedy for natural stupidity. - CDP1802

OriginalGriff wrote:

                          He's playing with you: VB6 died for new projects in 2002 or so when .NET was released.


RickZeeland (#22)

QuickBasic then - it is still possible to use it! https://www.slant.co/topics/9807/viewpoints/6/~basic-like-programming-languages~freebasic[^]

realJSOP (#23), in reply to rjmoses

                            My ex-wife has a big endian. A REALLY big endian. In fact, it's so big, she counteracts the effect of the moon on local tide tables.

                            ".45 ACP - because shooting twice is just silly" - JSOP, 2010
                            -----
                            You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
                            -----
                            When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013

Slacker007 (#24), in reply to realJSOP

                              When she sits around the house, she sits aaa-rrrr-ooo-uuuu-nnnn-dddd the house.

MarkTJohnson (#25), in reply to realJSOP

And THAT's why she's not getting your Mustang - the shocks won't hold up.

CodeWraith (#26), in reply to Marc Clifton

                                  Just one word: Mickeysoft :-) .Net can never be so good that they get me to marry them and then let them move in and do whatever they like.

                                  I have lived with several Zen masters - all of them were cats. His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.

CodeWraith (#27), in reply to OriginalGriff

                                    OriginalGriff wrote:

                                    and reduces the memory leak problems endemic to pointer based code written by people who think they know what they are doing ... :laugh:

                                    In other words: .Net is for those who don't know what they are doing. Excellent argument. :-)

                                    OriginalGriff wrote:

                                    It's a tool - and one that works across multiple platforms with a high degree of "similarity". Try that with a native C compiler, and it becomes a struggle to get anything to work in a short timeframe.

I have been trying that over the last few days. It's not as bad as you think any more.

                                    OriginalGriff wrote:

                                    Don't get me wrong, I miss my assembler days (or decades more accurately) - but .NET is here to stay and I'm happy using it

                                    In other words: You have become too comfortable and now you finally love the Big Brother. :-)


Dean Roddey (#28), in reply to rjmoses

Once you start writing significantly sized software, automatic conversions are the last thing you want. It's a minefield of errors waiting to happen, IMO. As is often the case in life, you can pick one: easy or good. You can't really have both unless it's a fairly modest undertaking. When it gets serious, the old saying of "measure twice, cut once" really applies. The time you spend being very specific to the compiler about what you want to happen will save you endless woe. This is one of the things that really makes me shake my head at modern C++, where people are using 'auto' all over the place.

                                      Explorans limites defectum

Marc Clifton (#29), in reply to rjmoses

                                        rjmoses wrote:

                                        What's your thoughts?

                                        After pondering this post all day, I finally came up with a response. IDIC[^] you must learn and become one with. (Argh, a Star Trek and Star Wars reference in one sentence.)

                                        Latest Article - Azure Function - Compute Pi Stress Test Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny Artificial intelligence is the only remedy for natural stupidity. - CDP1802

                                        1 Reply Last reply
                                        0
                                        • OriginalGriffO OriginalGriff

                                          I don't. Some languages - like C# - are strongly typed for a reason: to catch errors early. Firstly, catching type conversions at compile time means that the code does exactly what you wanted, or it doesn't compile. Secondly, making you explicitly convert things like user input to the type you want provides exceptions (or "failed" responses as appropriate) if the user input doesn't match up. The global implicit typing you seem to prefer leads to errors because the compiler has to "guess" what you wanted - and that means that bad data gets into the system undetected. And until you've had to unpick a 100,000 row DB to try and fix dates that are entered as dd-MM-yy and MM-dd-YY you probably don't realise just how much of a PITA that is. And then there is the "pointer problem": a pointer to an ASCII char is a pointer to a byte, but a pointer to a Unicode character is a pointer to a word. You can - by casting - explicitly convert a byte pointer to a word pointer, but that doesn't change the underlying data, and it means that half the data accesses aren't going to work properly, because the byte pointer can be "half way up" a word value. And strong typing (in C#, if not in C or C++) also eliminates your "=" vs "==" in most cases, because the result is not a bool so it can't be used in a conditional. You can set the C or C++ compiler to give warnings or errors when you type it wrong, but it is valid because, being old languages, they don't have a native boolean value: any nonzero value is "true". You want to get away from the complexity and want consistent declarations? Try C# ... (but hurry, it's getting complicated as well now).

                                          Sent from my Amstrad PC 1640 Never throw anything away, Griff Bad command or file name. Bad, bad command! Sit! Stay! Staaaay... AntiTwitter: @DalekDave is now a follower!

                                          R Offline
                                          R Offline
                                          rjmoses
                                          wrote on last edited by
                                          #30

                                          I also have had to fix problems such as the MM-DD-YY problem you described. Strong typing often causes as many problems as weak typing, because people want to find a solution for their problem. I'm thinking along the lines of "definable" strong type conversions. Using arithmetic conversions such as integer to character, I would simply like to say "string S = i", having previously declared "i" as an integer. Then, add optional meta-data such as format. Regarding pointers, C++ pointers have made debugging difficult. And having to cast causes even more confusion. I'm thinking the majority of pointer and casting problems are caused by less experienced or lazy developers trying to find a quick, workable (in most cases) solution to their problem. I'm just wondering if there isn't perhaps a better solution.

                                          OriginalGriffO 1 Reply Last reply
                                          0