C Sharps - how are you getting on with nullable reference types?

Rob Philpott

I'm having one of those mornings where I try to catch up on the new features they've stuck in C# and .NET. I have to say I don't always like what I read; maybe I'm a coding-language conservative or something, but I do sometimes come around to new features about five years late. Nullable reference types (they're nullable already, surely?) ([Embracing nullable reference types | .NET Blog](https://devblogs.microsoft.com/dotnet/embracing-nullable-reference-types/)) are one such example: when I first heard about them a year or two ago I thought the idea so preposterous that I would never activate it, and sure enough there are no ?s at the end of my strings to date. Now, do I dig in or accept change? A null reference (well, a null pointer, ultimately a bad address in memory) just seems to me an inherent trap with computers. I first hit it 40 years ago on a Commodore VIC20, before they had invented exceptions; the thing would just go mental until you switched it off. And every day since, for that matter, but usually such errors are gone by production. In development they're a good pointer (hey, a pun!) to where things aren't quite right yet. I like things the way they are (were). How do you like nullable reference types?

    Regards, Rob Philpott.
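
For anyone who hasn't seen the syntax yet, a minimal sketch of what opting in looks like; the class and members are invented for illustration:

```csharp
// Nullable reference types are opt-in: per file with #nullable enable,
// or project-wide with <Nullable>enable</Nullable> in the .csproj.
#nullable enable

class Greeter
{
    // Plain reference type: the compiler now assumes this is never null.
    public string Name { get; set; } = "world";

    // The '?' marks the reference that MAY be null - callers must check.
    public string? Nickname { get; set; }

    public string Greet()
    {
        // Dereferencing Nickname unchecked would raise warning CS8602
        // ("dereference of a possibly null reference"); a guard satisfies
        // the compiler's flow analysis.
        return Nickname != null ? $"Hi {Nickname.ToUpper()}" : $"Hello {Name}";
    }
}
```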

OriginalGriff
#3

That's a difficult one. I haven't started using them, but I suspect I probably should, on the basis that the more errors I can catch at compile time rather than run time, the fewer errors I have to specifically test for or code to handle. I can't help feeling that there are too many C++ and VB fanbois trying to get the nastier bits of their original code into the C# spec though: var without Linq, dynamic, and so on kinda dumbs down the language without adding any benefit in the real world.

    "I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!

    "I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
    "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt

Nelek
#4

      OriginalGriff wrote:

I haven't started using them, but I suspect I probably should, on the basis that the more errors I can catch at compile time rather than run time, the fewer errors I have to specifically test for or code to handle.

      That's actually a really good point. Thanks

M.D.V. ;) If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it? Help me to understand what I'm saying, and I'll explain it better to you. Rating helpful answers is nice, but saying thanks can be even nicer.

OriginalGriff wrote:

var without Linq, dynamic, and so on kinda dumbs down the language without adding any benefit in the real world.

Rob Philpott
#5

Yes, it seems var, Brexit and Covid lockdowns are the divisive issues of our times. I'd venture that the young are probably more advocates of var, particularly if they've come from some horrible dynamic language. But those of us who went through the OO revolution in the 90s, with the 'type is everything' mantra drummed into us, just find it obscures things. I don't mind dynamic, but there has to be a very good reason for it!

        Regards, Rob Philpott.

Nelek wrote:

          Rob Philpott wrote:

          How do you like nullable reference types?

          If this is a valid answer for you... So far I haven't used them (yet?)

Rob Philpott
#6

          Perfectly valid. I like it too!

          Regards, Rob Philpott.

Rob Philpott wrote:

How do you like nullable reference types?

lmoelleb
#7

One of the few features I was really looking forward to, and I am happy with it. Null is very often a special case, and now I can clearly express in the code whether this special case is something that needs to be dealt with, or something that will not happen. Meaning guard code is added where it is needed, and not filling up the code where it doesn't do anything. Why you would not want the compiler to help identify inherent traps is something I just don't understand. Maybe because I never had to struggle with assembly on the VIC20 - I was one of those modern kids that did assembly on the C64. :)
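
A small sketch of that intent-signalling point, with invented names; the signature alone tells callers where guard code is needed:

```csharp
#nullable enable

public record User(string Name);

public interface IUserDirectory
{
    User? FindUser(string name); // null is a real case: "no such user"
    User GetCurrentUser();       // never null: callers need no guard
}

public static class Demo
{
    public static void Show(IUserDirectory directory)
    {
        // Guard code only where the signature says null can happen...
        User? found = directory.FindUser("rob");
        if (found != null)
            System.Console.WriteLine(found.Name);

        // ...and none where it cannot; the compiler checks both ends.
        System.Console.WriteLine(directory.GetCurrentUser().Name);
    }
}
```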

Rob Philpott wrote:

A null reference (well, a null pointer, ultimately a bad address in memory) just seems to me an inherent trap with computers.

Daniel Pfeffer
#8

              Coming from a C++ background, I would say that the whole point of references (as opposed to pointers) is that a reference can never be null; nullable references are a fundamental violation of the programming model. If you want pointers, why not use C++?

              Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

Rob Philpott
#9

                lmoelleb wrote:

Why you would not want the compiler to help identify inherent traps is something I just don't understand.

That's a strong argument; compile-time errors have to beat runtime ones. But I guess I'm just very relaxed about things being null. It's a useful paradigm for 'there isn't one' etc. Reading from a StreamReader, for instance, gives you a null at the end of the stream: the perfect condition for the loop. But then I guess the argument is that it should be a string? How are they going to do away with divide by zeros? That's what I want to know!

                Regards, Rob Philpott.
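
The StreamReader case actually works out rather neatly under the annotations; a sketch (the file name is just a placeholder):

```csharp
#nullable enable
using System;
using System.IO;

class ReadLoop
{
    static void Main()
    {
        using var reader = new StreamReader("input.txt");

        // In the annotated BCL, ReadLine() is declared to return string?:
        // null is the end-of-stream signal, i.e. the 'there isn't one' case.
        // The loop condition doubles as the null check, so inside the body
        // the compiler's flow analysis treats 'line' as plain string.
        string? line;
        while ((line = reader.ReadLine()) != null)
        {
            Console.WriteLine(line.Length); // no CS8602 warning here
        }
    }
}
```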

Rob Philpott
#10

                  Daniel Pfeffer wrote:

                  If you want pointers, why not use C++?

                  Because of header files, and 20 years of .NET has made me too stupid in general.

                  Regards, Rob Philpott.

Rob Philpott wrote:

20 years of .NET has made me too stupid in general.

Nelek
#11

                    To be aware of it is the first step :laugh: :laugh: :laugh:

M.D.V. ;) If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it? Help me to understand what I'm saying, and I'll explain it better to you. Rating helpful answers is nice, but saying thanks can be even nicer.

Rob Philpott wrote:

I'd venture that the young are probably more advocates of var, particularly if they've come from some horrible dynamic language.

Mladen Jankovic
#12

var and dynamic languages have nothing in common. If anything, var comes from functional languages with stronger type safety than C#, which won't let you do it any other way :)
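
A quick sketch of the distinction: var is compile-time inference with one fixed static type, while dynamic defers binding to run time:

```csharp
using System;

class VarVsDynamic
{
    static void Main()
    {
        var count = 42;          // inferred as int, fixed at compile time
        // count = "forty-two";  // would not compile: CS0029, string is not int
        Console.WriteLine(count + 1); // int arithmetic, checked at compile time

        dynamic anything = 42;   // binding deferred until run time
        anything = "forty-two";  // compiles fine; mistakes surface only when executed
        Console.WriteLine(anything.Length); // resolved at run time against string
    }
}
```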

Rob Philpott wrote:

But I guess I'm just very relaxed about things being null. It's a useful paradigm for 'there isn't one' etc.

lmoelleb
#13

No one is arguing against null being a useful paradigm. And now we have the syntax to tell whether the paradigm is used or not. This serves two purposes: 1) Communicate intent. You can see from the return type of my method if you need to handle a null being returned. You can see by the parameter definition if my code can handle a null being passed in. 2) The compiler can catch more mistakes. Either of these would be more than enough for me to use nullable; with both of them... I don't get why you would not use it on any new code. Legacy code is of course always an issue... when (and if) to make the investment to change it.
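
Both purposes in a few lines, with invented names; the parameter list carries the contract and the compiler polices the call sites:

```csharp
#nullable enable

static class Formatter
{
    // The parameter list states the contract: 'title' may be null,
    // 'name' may not - the intent is visible without reading the body.
    public static string Describe(string name, string? title) =>
        $"{title ?? "(untitled)"}: {name}";
}

// At every call site the compiler enforces it:
// Formatter.Describe(null, "Dr");   // warning CS8625: null for non-nullable 'name'
// Formatter.Describe("Rob", null);  // fine - the signature says null is handled
```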

Greg Utas
#14

                          Daniel Pfeffer wrote:

                          a reference can never be null

They're not supposed to be null, but they can be. If someone passes *ptr as an argument, and ptr is nullptr, defensive code still has to include if (&ref == nullptr).

                          Robust Services Core | Software Techniques for Lemmings | Articles
                          The fox knows many things, but the hedgehog knows one big thing.

                          <p><a href="https://github.com/GregUtas/robust-services-core/blob/master/README.md">Robust Services Core</a>
                          <em>The fox knows many things, but the hedgehog knows one big thing.</em></p>

Rob Philpott
#15

I have to say you're starting to convince me; this is hard to argue with. I think I might try it out later on and see how it feels.

                            Regards, Rob Philpott.

Rob Philpott wrote:

How do you like nullable reference types?

PIEBALDconsult
#16

                              Pointless. Used by inferior practitioners.

BabyYoda
#17

                                PIEBALDconsult wrote:

                                Pointless

                                So are circles, but I still use them. ;P

OriginalGriff wrote:

var without Linq, dynamic, and so on kinda dumbs down the language without adding any benefit in the real world.

honey the codewitch
#18

I don't think var dumbs down the language any more than auto dumbs down C++. You may not like it, but it saves *typing*, not *thinking*, IMO. Besides, in C#, given the lack of a reasonable alternative to typedef, it's par for the course, LINQ or no - if you're using a lot of generics.

                                  Real programmers use butterflies
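
The generics point in a couple of lines; the nested type is contrived but typical:

```csharp
using System.Collections.Generic;

class VarWithGenerics
{
    static void Demo()
    {
        // With no typedef equivalent, the explicit form says everything twice:
        Dictionary<string, List<KeyValuePair<int, string>>> index =
            new Dictionary<string, List<KeyValuePair<int, string>>>();

        // var drops the repetition; the static type is exactly the same.
        var index2 = new Dictionary<string, List<KeyValuePair<int, string>>>();
    }
}
```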

Rob Philpott wrote:

Because of header files, and 20 years of .NET has made me too stupid in general.

honey the codewitch
#19

I recently came back home to C++ from years-long C# development. I'm a bit rusty, but it's coming back. You may surprise yourself if you give it another go. *hides*

                                    Real programmers use butterflies

Rob Philpott wrote:

How do you like nullable reference types?

Marc Clifton
#20

                                      I suspect enabling this will wreak havoc on all the entity models with string properties where the backing field in the DB is nullable. ;)

                                      Latest Articles:
                                      Thread Safe Quantized Temporal Frame Ring Buffer
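
A sketch of that friction, using the common workaround seen in EF-style entity classes (the names are illustrative, not from any real model):

```csharp
#nullable enable

public class Customer
{
    public int Id { get; set; }

    // NOT NULL column: declared non-nullable, but the property is unset
    // until the ORM materializes it - hence the null-forgiving '= null!'
    // initializer to silence warning CS8618 (uninitialized non-nullable).
    public string Name { get; set; } = null!;

    // NULLable column: the annotation finally matches what the schema
    // always allowed.
    public string? MiddleName { get; set; }
}
```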

Rob Philpott wrote:

How do you like nullable reference types?

Jorgen Andersson
#21

Sounds like a great addition to me. And if I have understood it correctly, there is no actual addition of nullable reference types: what you get is an option where the compiler looks for possible null reference errors. So where null references are EXPECTED, you mark those fields as nullable to avoid the warning.

Wrong is evil and must be defeated. - Jeff Ello
Never stop dreaming - Freddie Kruger
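
That understanding is right: string? compiles down to plain string plus metadata the compiler reads, so opting in only changes the analysis. A sketch of marking an expected null (names invented):

```csharp
#nullable enable

public class Order
{
    public string Reference = "";  // never null by design: left unannotated
    public string? TrackingNumber; // null until shipped: marked as EXPECTED

    public int TrackingLength()
    {
        // Without the '?' above, 'TrackingNumber = null' would raise CS8625;
        // with it, an unguarded 'TrackingNumber.Length' would raise CS8602.
        // The ?. and ?? operators keep the flow analysis happy either way.
        return TrackingNumber?.Length ?? 0;
    }
}
```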

Jorgen Andersson
#22

That's probably one of the reasons it's an option and not the default. Entity Framework will of course be updated to support it, though.

Wrong is evil and must be defeated. - Jeff Ello
Never stop dreaming - Freddie Kruger
