Why do so many "developers" not understand 'null'?
-
And so is the color black (assuming an RGB color space). Or anything else represented as binary zeroes in your favorite computer architecture. That two values of completely different semantics have identical internal representations doesn't imply that the two values are "very much the same thing" - unless you just overlook the type/class information. In an OO world, any object instance, regardless of class, having a single member set to some value represented as all bits reset is also very much the same thing as a numeric zero. Or a null pointer. If you ignore its semantics, the way you must to claim that zero and null are very much the same thing.
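If that sounds abstract, a minimal C sketch makes the point (the pixel type and variable names are invented for illustration): the very same all-zero bits become an integer, a float, a color, or a pointer depending purely on the type you read them through.

    #include <stdio.h>
    #include <string.h>

    /* A hypothetical RGB pixel type, invented for this sketch. */
    struct rgb { unsigned char r, g, b; };

    int main(void)
    {
        unsigned char zeros[16] = { 0 };     /* all bits reset */

        int i;
        float f;
        struct rgb black;
        char *p;

        memcpy(&i, zeros, sizeof i);         /* the integer 0             */
        memcpy(&f, zeros, sizeof f);         /* the float 0.0 (IEEE 754)  */
        memcpy(&black, zeros, sizeof black); /* the color black           */
        memcpy(&p, zeros, sizeof p);         /* a null pointer on most
                                                platforms, though the C
                                                standard doesn't promise
                                                all-zero bits for it      */

        printf("%d %f rgb(%u,%u,%u) %p\n", i, f,
               (unsigned)black.r, (unsigned)black.g, (unsigned)black.b,
               (void *)p);
        return 0;
    }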
-
I have just gone through four QA questions, each of which is an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
I have to deal with Microsoft employees not understanding null for dates or GUIDs, thereby actually making programming, especially if a database is involved, even more difficult.
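For what it's worth, nearly every one of those QA crashes reduces to the same few lines. Here's the C cousin of the bug, sketched with a made-up lookup function:

    #include <stdio.h>

    /* Hypothetical lookup that can legitimately fail: NULL means "not found". */
    const char *find_user(int id)
    {
        return (id == 42) ? "Arthur" : NULL;
    }

    int main(void)
    {
        const char *name = find_user(7);

        /* The classic crash: using the reference without checking it.
           strlen(name) or name[0] here is undefined behavior when name
           is NULL -- the C equivalent of NullReferenceException.       */

        /* The diagnosis and fix: test before use, and decide what a
           missing value should mean at this point in the program.      */
        if (name == NULL)
            printf("no such user\n");
        else
            printf("found %s\n", name);

        return 0;
    }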
-
I have just gone through four QA questions, each of which is an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
Richard MacCutchan wrote:
Do those of you who still work in teams find this is a common problem with younger team members?
Compared with other problems caused by inexperienced (younger) people in any craft? I didn't get better with null until I had burned myself so many times that I started focusing on it intently. Hence: experience.
-
I have just gone through four QA questions, each of which is an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
-
I didn't have that problem, but I did have a junior developer who thought the font was causing a Unicode problem.
Bond Keep all things as simple as possible, but no simpler. -said someone, somewhere
-
I have just gone through four QA questions, each of which is an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
No. Where are you finding these people? It seems hard to get far in life without knowing what null is.

KNIGHT: The Knights Who Say Null demand a sacrifice!
ARTHUR: O, Knights Who Say Null, we are but simple travelers who seek an enchanter who lives beyond these woods.
KNIGHT: NULL! NULL! NULL!
ARTHUR and PARTY: Ooh, ow!
KNIGHT: We shall say NULL again to you, if you do not appease us.
ARTHUR: Well, what is it you want?
KNIGHT: We want... a :shipit: on this code-review.
-
In Norwegian, the name of numeric zero is 'null'. So Norwegian kids 'sort of' have an excuse for confusing the two. But they are quite different. Zero is a distinct, well-defined numeric value that you may treat 100% like any other numeric value. 'null' is nothing: not a numeric value, but a void. Emptiness. An abyss. Not a valid numeric value. Some programming languages use the term 'void'; it is really much more descriptive.

I feel like digging up my old Robert Heinlein collection to re-read the short story —And He Built a Crooked House[^]. The story tells of a crazy architect (in California, obviously :-)) who designs a house which is a 3-dimensional projection of a 4-dimensional cube, a tesseract. The night before the house owners move in, there is an earthquake that makes the house fold up as a true tesseract, in 4 dimensions, not just as a 3-dim projection. I believe that Heinlein has taken liberties in his description of how a real tesseract would appear. But his description of the view out one window, of a total emptiness, not even black, gave me shivers when I first read it, many years ago. It is a beautiful literary description of the concept of a 'null'. I think that I didn't fully understand the concept of null, void, myself until I read the Heinlein story.
trønderen wrote:
'null' is nothing, not a numeric value, but a void.
Which arises after learning to count, and has nothing to do with the difference between numeric nothing and nothing at all. What has Heinlein to say about having no apples 40,000 years ago? Would it be 0 apples, null apples, or would the net result of no apples be the same?
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
-
In Norwegian, the name of numeric zero is 'null'. So Norwegian kids 'sort of' have an excuse for confusing the two. But they are quite different. Zero is a distinct, well-defined numeric value that you may treat 100% like any other numeric value. 'null' is nothing: not a numeric value, but a void. Emptiness. An abyss. Not a valid numeric value. Some programming languages use the term 'void'; it is really much more descriptive.

I feel like digging up my old Robert Heinlein collection to re-read the short story —And He Built a Crooked House[^]. The story tells of a crazy architect (in California, obviously :-)) who designs a house which is a 3-dimensional projection of a 4-dimensional cube, a tesseract. The night before the house owners move in, there is an earthquake that makes the house fold up as a true tesseract, in 4 dimensions, not just as a 3-dim projection. I believe that Heinlein has taken liberties in his description of how a real tesseract would appear. But his description of the view out one window, of a total emptiness, not even black, gave me shivers when I first read it, many years ago. It is a beautiful literary description of the concept of a 'null'. I think that I didn't fully understand the concept of null, void, myself until I read the Heinlein story.
trønderen wrote:
'null' is nothing, not a numeric value, but a void
This is not strictly true. In any (programming) language, null is a value representing an abstract concept, just as an integer is a value representing an instance of a specific class of numbers, which are themselves an abstract concept used to quantify the world. It is true that 'null' is the concept of 'not having a valid value' (i.e. nothing), but it is represented in a program using a value in various lexical forms: null/NULL/etc. The OP clearly raised a very good question that could, perhaps, be made more pointedly with the more abstract question: why do so many developers not understand abstraction?
-
Do you ever really have 0 apples from a weird sort of ship of Theseus-like perspective? Some cultures lack that sort of integer concept from a similar view... all apples are not created equal. No point talking about N of them as the potential distance from reality only increases the further from 1 you get. Forget apples to oranges, they don't even do apples to apples. More recently, quantum physics is saying there is no "nothing" and that there is always "quantum foam" that has always been. Maybe the young ones are so much smarter they look dumb. Nahhh haha. But maybe sending null across a wire as a matter of course is a little bit dumb too.
jochance wrote:
Do you ever really have 0 apples from a weird sort of ship of Theseus-like perspective?
Prehistoric man? Pretty sure they knew the concept of "no apples".
jochance wrote:
More recently, quantum physics is saying there is no "nothing" and that there is always "quantum foam" that has always been.
Physical BS. Quantum physics is a bunch of nonsense. Your mom has always been fat, because quantum mech says so.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
-
jochance wrote:
Do you ever really have 0 apples from a weird sort of ship of Theseus-like perspective?
Prehistoric man? Pretty sure they knew the concept of "no apples".
jochance wrote:
More recently, quantum physics is saying there is no "nothing" and that there is always "quantum foam" that has always been.
Physical BS. Quantum physics is a bunch of nonsense. Your mom has always been fat, because quantum mech says so.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
I guess my point was more that while I get the usefulness of bad models, they're still maybe bad in some specifics of practice. Which is still fine and all. It's not a full and total indictment of concepts. Tons of that self-referential integrity stuff going on in math, yeah? But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts. It just sits there all happy, exactly 1 unit from both -1 and 1. The reality seems to be that you can really never have 0 if you go splitting hairs. That said, it maybe makes it an even more perfect "init" value. Null though? Null was maybe never destined to be anything but a logical error. :java:

"But I really want to store that something is specifically not known/unknowable!"
"What about all the other stuff you don't even know you don't know?"
"Look, you could've brought this up before we were a decade into RDBMS development. We're keeping it."
"But I didn't know."
"Exactly! And now we can record it!"
-
I guess my point was more that while I get the usefulness of bad models, they're still maybe bad in some specifics of practice. Which is still fine and all. It's not a full and total indictment of concepts. Tons of that self-referential integrity stuff going on in math, yeah? But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts. It just sits there all happy, exactly 1 unit from both -1 and 1. The reality seems to be that you can really never have 0 if you go splitting hairs. That said, it maybe makes it an even more perfect "init" value. Null though? Null was maybe never destined to be anything but a logical error. :java:

"But I really want to store that something is specifically not known/unknowable!"
"What about all the other stuff you don't even know you don't know?"
"Look, you could've brought this up before we were a decade into RDBMS development. We're keeping it."
"But I didn't know."
"Exactly! And now we can record it!"
jochance wrote:
But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts.
Like ALL programmers, I had to deal with floats nearing zero, which is not NULL.
jochance wrote:
Null though? Null was maybe never destined to be anything but a logical error. [Coffee]
Nope. Let me explain, little padawan: 0 is having no apples; null is having no concept of apples.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
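Both points fit in one small C sketch (the apple variables are invented, obviously):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* A float nearing zero is still a perfectly valid number, not NULL. */
        double nearly_zero = 1.0e-12;
        if (fabs(nearly_zero) < 1.0e-9)
            printf("treat as zero for this comparison -- but it is a value\n");

        /* 0 apples: we have the concept of apples, and none of them. */
        int apples = 0;

        /* null apples: no apple count exists at all. */
        int *apple_count = NULL;

        if (apple_count == NULL)
            printf("no concept of apples here\n");
        else
            printf("we have %d apples\n", *apple_count);

        printf("apples = %d\n", apples);
        return 0;
    }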
-
jochance wrote:
But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts.
Like ALL programmers, I had to deal with floats nearing zero, which is not NULL.
jochance wrote:
Null though? Null was maybe never destined to be anything but a logical error. [Coffee]
Nope. Let me explain, little padawan: 0 is having no apples; null is having no concept of apples.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
And here I might've supposed you'd want people talking less about things they have no concept of. What's really important is the ability to dictate to the apple that it is inferior, and they mostly left that out altogether. lol
jochance wrote:
What's really important is the ability to dictate to the apple that it is inferior, and they mostly left that out altogether.
That makes no sense at all. Pears are superior though :thumbsup:
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
-
What I actually said was
Quote:
In C zero and NULL are very much the same thing.
Not that zero and null are the same everywhere.
We're both right. You're referring to an implementation; I was referring to the concept, the abstraction if you like, where null represents "value unknown" (or "not yet defined").
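Both readings fit in a few lines of C: the implementation view (they compare equal) and the concept view (they mean different things):

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        int *p = 0;      /* legal: 0 is a null pointer constant in C       */
        int *q = NULL;   /* NULL is commonly defined as ((void *)0) or 0   */

        if (p == q && p == 0)
            printf("as pointers, 0 and NULL compare equal\n");

        /* The concepts still differ: the int 0 is a number you can do
           arithmetic on; a null pointer means "refers to no object".
           The standard doesn't even require a null pointer's run-time
           bit pattern to be all zeros -- only that the constant 0
           converts to it.                                               */
        int zero = 0;
        printf("%d\n", zero + 1);   /* fine: arithmetic on zero           */
        /* *p would be undefined behavior: no arithmetic rescues null.   */

        return 0;
    }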
-
trønderen has also said it well. Even the C language has confusion on this topic. Yes, I too believe that they have distinct meanings.

Remember my posing the question of why this is a valid statement in C:

    size_t zero = 0;
    void *value = malloc( zero );

value comes back neither null nor zero (on my implementation, at least). Why would I want to allocate zero memory? Here is what ChatAI says:

"In the C programming language, calling malloc(0) is allowed and returns a pointer to a memory block of size 0. This is specified in the C standard, which states that malloc(0) is equivalent to malloc(1). The reason for this behavior is that malloc is intended to allocate memory dynamically, and a request for 0 bytes of memory is considered a valid request. Allocating a block of memory with a size of 0 can be useful in certain situations, such as when you want to create a zero-length array or when you want to allocate memory that you will later reallocate to a different size using realloc. However, it is important to note that malloc does not guarantee that it will return a pointer to a block of memory that is truly 0 bytes in size. The implementation of malloc may choose to return a block of memory that is larger than the requested size, in which case the additional memory will not be accessible to your program."

Clear as mud.
"A little time, a little trouble, your better day" Badfinger
Forgive my ignorance, but what is the point or purpose of a zero-length array? Or at the lower level, reserving zero memory space?
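For the record, the actual C standard is less muddy than ChatAI: for a size of zero the behavior is implementation-defined; you get either NULL or a unique pointer that you may free and realloc but must never dereference. That unique handle is exactly the point of a zero-length allocation: a valid starting value for a grow-with-realloc loop. A small probe, if you want to see what your implementation does:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* C leaves malloc(0) implementation-defined: NULL, or a unique
           pointer that must not be dereferenced but must be freed.     */
        void *value = malloc(0);

        if (value == NULL)
            printf("this implementation returns NULL for malloc(0)\n");
        else
            printf("this implementation returns a unique handle: %p\n", value);

        /* Either way this is safe: free(NULL) is a no-op. */
        free(value);
        return 0;
    }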
-
I have just gone through four QA questions, each of which is an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
The best concept I found to explain Java/C# object references to newcomers is to use a "leash" as the metaphor. Just because you have a leash in your hand does not mean there is a dog on the other end. The object variable/leash is in the null/no-dog state.

Many Java programming errors occur when multiple leashes are attached to the same dog (and each leash holder thinks they have a unique dog!). [This is why immutable objects are a good idea.] A new leash is attached to your dog every time you call a method and "pass" your leash by value. The method can "pull" your dog around or stuff it with treats.

How do you free the dog so it becomes a stray that the dog catcher/garbage collector can pick up? Assign the variable/leash to null, of course.

You can also mix types in, with cat leashes vs dog leashes. This can work into leashes as interface holders. It works with C pointers to some extent; you can build on the metaphor to include pointer arithmetic.

This is partly a reply to BillWoodruff, but I wanted the reply at top level as I always found this a great metaphor.
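The metaphor translates almost line for line into C pointers; a sketch with invented names:

    #include <stdio.h>
    #include <stdlib.h>

    /* The "dog" -- an object on the heap. */
    struct dog { const char *name; };

    int main(void)
    {
        struct dog *leash_a = malloc(sizeof *leash_a); /* leash with a dog on it */
        struct dog *leash_b = NULL;                    /* a leash, no dog        */

        if (leash_a == NULL)
            return 1;                                  /* malloc can fail too    */
        leash_a->name = "Rex";

        /* A second leash on the SAME dog -- no new dog appears. */
        leash_b = leash_a;
        leash_b->name = "Fido";
        printf("%s\n", leash_a->name);                 /* prints "Fido"          */

        /* Dropping one leash does not free the dog; it's still on leash_b. */
        leash_a = NULL;

        /* In C there is no dog catcher: you are the garbage collector. */
        free(leash_b);
        return 0;
    }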
-
i tried to explain that to C# students (back when i had a few) by saying that a "null reference" is a named/typed placeholder/slot for a pointer/reference to something which is not defined. you, the programmer, have the "freedom" to create the named/typed slot. and, that the common exception "null reference" was what happened when your running code demanded/needed/required a defined pointer. you have a favorite metaphor, or explanation ... i am all ears ! of course, now, Visual Studio/C# compiler (and ReSharper) are going to ... suggest a check. and, now, C#/.NET has new Attributes for static Type analysis: [^].
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
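In C terms, the distinction between the named/typed slot and the thing it refers to looks something like this (a tiny sketch, names invented):

    #include <stdio.h>

    struct widget { int id; };

    int main(void)
    {
        /* struct widget *a;   <- a named, typed slot left undefined: even
                                  looking at its value is undefined behavior */

        struct widget *b = NULL;  /* the same kind of slot, explicitly
                                     "nothing": a well-defined, testable state */

        /* This is the whole point of null: the slot exists, the referent
           does not, and running code can ask before it demands one.        */
        if (b == NULL)
            printf("slot b exists, but refers to nothing defined\n");
        else
            printf("widget %d\n", b->id);

        return 0;
    }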