Code Project: The Lounge

Is there really a problem with safety?

Tags: business, help, csharp, c++, java
5 Posts, 4 Posters
jschell (#1) wrote:
    The following is a link from the Code Project newsletter: Value-Oriented Programming[^]. Besides being a bit long (it is an academic journal article, so of course), it has quite a bit to say about languages maintaining 'safety'. If you are not familiar with the topic and do not want to read the article, think of it as the problem of using a pointer that is null when you expect it not to be, since that is the case most talked about. The article also has an example of 'int' overflow in C++, which doesn't seem like a great example to me.

    For me, language problems have not been a significant source of bugs since the 90s, once I learned to always initialize pointers (in C++) and then, when reviewing code, to insist that others do the same. Numeric overflow was sometimes considered but has never turned out to be an actual problem (I even dealt with the analysis of one in the recent past; it could never occur).

    Now, here is the list of problems that I do deal with, and have been dealing with for decades:

    1. Failures in design.
    2. Failures in requirements.
    3. Failures in customer expectation management.
    4. Failures from misunderstandings of (or total ignorance about) what some change might do.
    5. Failures caused by complexity in massive systems, and sometimes not-so-massive ones.
    6. Failures caused by prioritization (the same problem comes up repeatedly and is never fixed, even when the fix is known).
    7. Failures caused by programmers implementing something while explicitly ignoring parts of either the code base or the domain space (very often associated with databases and/or any network protocol).

    There are probably some others. These days I go in every day hoping that the most complicated new thing I must prioritize immediately is a null pointer exception, because those are easy to find and fix. I haven't worked in just one business domain either. But perhaps it is because I have been working mostly with C# and Java since the 90s?

    Do others find that they must spend all of their time fixing null pointer errors and numeric overflows?
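    The discipline described above (always initialize pointers, then check before dereferencing) can be sketched in a few lines of C++. This is a minimal illustration, not code from the article; `Widget` and `read_value` are hypothetical names invented for the example:

    ```cpp
    #include <cassert>

    struct Widget { int value = 42; };

    // Guard against the null case instead of assuming the pointer is valid.
    int read_value(const Widget* w) {
        if (w == nullptr) {
            return -1;  // sentinel for "no widget"; a real API might throw or return std::optional
        }
        return w->value;
    }

    int main() {
        Widget* p = nullptr;   // initialized, never left indeterminate
        assert(read_value(p) == -1);

        Widget w;
        p = &w;
        assert(read_value(p) == 42);
        return 0;
    }
    ```

    The point is that an initialized-to-null pointer fails loudly and predictably, whereas an uninitialized one is undefined behavior that may appear to work.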

    In reply to jschell (#1):
      obermd (#2) wrote:

      The answer to your question is unequivocally YES. I can say that because every cyber-vulnerability is ultimately a failure of safety as elucidated in the article.

      In reply to jschell (#1):
        BernardIE5317 (#3) wrote:

        Just yesterday I dealt with a value problem (C++). I was maintaining references to items stored in a vector via push_back(), i.e. the items, not the references, were in the vector. I expected the references to remain valid since I was only pushing back, not altering anything already in the vector. Oops, my mistake. It took a while to finally discover that the vector was reallocating its storage as it grew with each push_back, which invalidates every outstanding reference. The fix was easy, to wit: merely set the initial capacity to a large number. Recently I've been encountering a number of "tell my grandchildren about" kinds of bugs; this may be one of them. Another was due to merely omitting a "_" character in a file name. Others I have yet to figure out, as they disappeared on their own; I am assuming one of those was Explorer changing its mind.
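        The bug and fix described above can be sketched as follows (a minimal reconstruction with plain ints, not the poster's actual code). `push_back` may reallocate the vector's storage, invalidating every pointer or reference to its elements; calling `reserve()` up front keeps the capacity fixed so they stay valid:

        ```cpp
        #include <cassert>
        #include <cstddef>
        #include <vector>

        // Grow a vector to n elements while holding a pointer to the first one.
        // Safe ONLY because reserve() guarantees no reallocation happens.
        int first_element_after_growth(std::size_t n) {
            std::vector<int> v;
            v.reserve(n);          // the fix: pre-allocate so capacity never changes
            v.push_back(1);
            int* first = &v[0];    // without the reserve() above, later push_backs
                                   // could reallocate and leave this dangling
            for (std::size_t i = 1; i < n; ++i) {
                v.push_back(static_cast<int>(i + 1));  // stays within reserved capacity
            }
            return *first;         // still valid; without reserve() this would be UB
        }

        int main() {
            assert(first_element_after_growth(100) == 1);
            return 0;
        }
        ```

        Note that `reserve()` only postpones the problem if growth can exceed the reserved capacity; a std::deque or indices instead of references avoid the invalidation entirely for this pattern.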

          In reply to obermd (#2):

          jschell (#4) wrote:

          obermd wrote:

          because every cyber-vulnerability is ultimately a failure of safety

          I don't see the reference. The word "cyber" does not appear in the article at all, nor does "security". And as I read it, the article is discussing safety as an attribute of a programming language, not of how the language is used. As a counterexample, as far as I can recall, most of the security problems related to SSL over the years had to do with either design or implementation, not with anything like a null pointer exception (there could be some, but not most).

            In reply to jschell (#1):
            PIEBALDconsult (#5) wrote:

            Contrariwise, safety is not a problem. :D
