Deep Thought OTD
-
It wouldn't have a default value. You'd get a compiler error if you don't assign a value in the constructor. Just like the compiler error you get when you don't initialize all fields of a C# struct.
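For reference, a minimal sketch of the struct rule being alluded to (the `Point` type here is just an illustration, not from the thread): a C# struct constructor must definitely assign every field before it returns, or the compiler refuses to compile it.

```csharp
using System;

// Illustration only: a struct constructor must assign all fields (CS0171 otherwise).
struct Point
{
    public int X;
    public int Y;

    public Point(int x, int y)
    {
        X = x;
        Y = y; // comment this line out and compilation fails with CS0171
    }
}

class Program
{
    static void Main()
    {
        var p = new Point(3, 4);
        Console.WriteLine(p.X + ", " + p.Y);
    }
}
```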
:mad: ..declare a generic list of forms; x = new List&lt;Form&gt;(100); Now, do I really want 100 instantiated Forms? Null means "not there", which isn't the same as an 'empty' or default object. There is an easier way to prevent bugs from null-references: don't program :)
-
eddyvluggen wrote:
x = new List&lt;Form&gt;(100);
That creates an empty List&lt;Form&gt; with a capacity of 100. No nulls involved. OK, there may be nulls in the internal array representation, but the public interface of List&lt;T&gt; doesn't require nulls in any way. Otherwise you couldn't use List&lt;T&gt; with value types...
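A quick illustration of capacity versus count (using object instead of Form to keep the snippet self-contained): the constructor argument only pre-sizes the internal array, it never creates or adds elements.

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // The argument is a capacity hint: it sizes the internal array,
        // it does not instantiate or add any elements.
        var x = new List<object>(100);
        Console.WriteLine(x.Count);    // 0   - no elements yet
        Console.WriteLine(x.Capacity); // 100 - reserved storage only
    }
}
```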
-
Daniel Grunwald wrote:
how to signal the end of a linked list?
You do it like they taught you at school/university, with a Sentinel!
xacc.ide - now with TabsToSpaces support
IronScheme - 1.0 alpha 4a out now (29 May 2008)
((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
-
I was taught at university that a null signalled the end.
-
Daniel Grunwald wrote:
That creates an empty List&lt;Form&gt; with a capacity of 100. No nulls involved.
I agree, you can't avoid using null unless you cripple the language. You don't want to instantiate a class and hog memory when a reference (say, a pointer) to 'null' would suffice. Another argument comes from the MS help on 'null': // Set mc to null again. The object it referenced // is no longer accessible and can now be garbage-collected. mc = null; How would this work with a 'default' object? Removing the null keyword from the language doesn't reduce errors. It would be as ridiculous as removing all keys from a database in order to prevent key-errors :sigh:
-
So basically:

interface INode {
    INode Next { get; }
    object Value { get; }
}

class Node : INode {
    public INode Next { get; set; }
    public object Value { get; set; }
}

class SentinelNode : INode {
    public INode Next { get { throw new NotSupportedException(); } }
    public object Value { get { throw new NotSupportedException(); } }
}

How's that any better than the language-provided "sentinel" null that throws a NullReferenceException on property access? You still have the same problem: the interface looks like you could get the next node, but in fact you cannot. The only way I can think of to solve this in a type-safe manner without exceptions is with discriminated unions.
-
The Sentinel node is never exposed, and IIRC neither is a Node; they are internal to the implementation, and the user should not have to worry about them. You simply use the LinkedList interface (yeah, very Java'ish). Anyway, this was an example of what I got taught (personally I would just go for a null).
xacc.ide - now with TabsToSpaces support
IronScheme - 1.0 beta 1 - coming soon
((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
-
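A sketch of the "hidden sentinel" idea under discussion (not leppie's actual implementation; `SentinelList` is a made-up name): a single terminator node marks the end of the chain internally, but callers only ever see values, never nodes, nulls, or the sentinel itself.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: a singly linked list terminated by a private sentinel
// node instead of null. The sentinel is never handed out to callers.
class SentinelList<T>
{
    private sealed class Node
    {
        public T Value;
        public Node Next;
    }

    // Shared end-of-list marker; never exposed.
    private readonly Node sentinel = new Node();
    private Node head;

    public SentinelList() { head = sentinel; }

    public void Prepend(T value)
    {
        head = new Node { Value = value, Next = head };
    }

    public IEnumerable<T> Items()
    {
        // Walk until we hit the sentinel instead of comparing against null.
        for (Node n = head; n != sentinel; n = n.Next)
            yield return n.Value;
    }
}
```

Whether this is better than null-termination is exactly the argument above: the sentinel trick only helps if it stays internal, since a leaked sentinel fails in the same way a null does.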
But while null is useful for some data structures (the internal implementation of List&lt;T&gt;), there's no reason why all references should be nullable. The language forces us to think "can 'a' be null?" whenever we write "a.b" (and a is a reference type). That's a big design mistake in the C# language.
Anders Hejlsberg wrote:
Would you do anything differently in developing C# if you had the chance? ...(snip)... With language design or with platform design, 1.0 is always a unique opportunity to put down your core values, your core designs, and then with every version thereafter it's much harder to fundamentally change the nature of the beast. And so, the things that you typically end up regretting later are the fundamentals that you didn't quite get right. Because those you can't change - you can always ship new libraries etc, but you can't change the fundamental gestalt of the platform. For example, in the type system we do not have separation between value and reference types and nullability of types. This may sound a little wonky or a little technical, but in C# reference types can be null, such as strings, but value types cannot be null. It sure would be nice to have had non-nullable reference types, so you could declare that 'this string can never be null, and I want you, compiler, to check that I can never hit a null pointer here'. 50% of the bugs that people run into today, coding with C# in our platform, and the same is true of Java for that matter, are probably null reference exceptions. If we had had a stronger type system that would allow you to say that 'this parameter may never be null, and you, compiler, please check that at every call, by doing static analysis of the code', then we could have stamped out classes of bugs.
-
eddyvluggen wrote:
It would be as ridiculous as removing all keys from a database, in order to prevent key-errors
What?!#!? You mean DBA's dont do that already??? ;P
xacc.ide - now with TabsToSpaces support
IronScheme - 1.0 beta 1 - coming soon
((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
-
From StackOverflow comes this one: We noticed that lots of bugs in our software developed in C# cause a NullReferenceException. Is there a reason why "null" has been included in the language? After all, if there were no "null", I would have no bugs, right? In other words, what feature in the language couldn't work without null?
-
It's a conceptual mistake to think that every object must exist, just as it is a conceptual mistake to assume that every atomic datatype needs to have a value. Sometimes a boolean is empty; that's a fact of life. And no, I don't want to see a tri-bool enum like { Yes, No, Empty }. Sometimes you need to reference "nothing". We have a tiny default object reserved for this special case, which uses almost no memory at all.
Daniel Grunwald wrote:
That's a big design mistake in the C# language.
I doubt it, but I may be proven wrong :)
Anders Hejlsberg wrote:
It sure would be nice to have had non-nullable reference types
He didn't say that "null" is superfluous. It might indeed be useful to add non-nullable classes, but that's not the same as removing the null-keyword from the language. :rose:
-
Brady Kelly wrote:
I've Found My Mojo
Glad to see that is working for you too! My boss just 'converted' from DNN to mojo too, after many frustrations.
xacc.ide - now with TabsToSpaces support
IronScheme - 1.0 beta 1 - coming soon
((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
-
It isn't quite working for me just yet - I had to fall back to plain ASP.NET for a small business site, but that wasn't a problem because they currently only have basically three pages. I had two problems: a phantom piece of a feature on the left of a page, just an empty rectangle, and the picture gallery didn't show descriptions when clicking a thumbnail to open it. I'll be trying hard to contribute a 'Product Gallery' feature to the project if I can.
-
eddyvluggen wrote:
He didn't say that "null" is superfluous. It might indeed be useful to add non-nullable classes, but that's not the same as removing the null-keyword from the language.
Of course you need something like null. But how often do you use "int?" compared to "int"? Or "bool?" compared to "bool"? I think non-nullable types are used more frequently than nullable types. The default should have been non-nullable, with nullable as an option. But why stop at one "special" value, null? A more flexible solution would have been support for discriminated unions[^]. "T?" then would simply be the union of "T" and "null". Discriminated unions could also ensure at compile-time that when the value of such a type is used, all possible cases are handled by the program.
-
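C# has no discriminated unions, but the shape can be approximated; this hypothetical `Option<T>` (the type and its `Match` method are made up for illustration) is either Some(value) or None, and the caller must supply handlers for both cases before the value can be touched.

```csharp
using System;

// Hypothetical sketch of a discriminated-union-style option type.
abstract class Option<T>
{
    public abstract R Match<R>(Func<T, R> some, Func<R> none);

    public static Option<T> Some(T value) { return new SomeCase(value); }
    public static readonly Option<T> None = new NoneCase();

    private sealed class SomeCase : Option<T>
    {
        private readonly T value;
        public SomeCase(T value) { this.value = value; }
        public override R Match<R>(Func<T, R> some, Func<R> none)
        {
            return some(value); // the "has a value" branch
        }
    }

    private sealed class NoneCase : Option<T>
    {
        public override R Match<R>(Func<T, R> some, Func<R> none)
        {
            return none(); // the "no value" branch - no null to dereference
        }
    }
}

class Program
{
    static void Main()
    {
        Option<int> found = Option<int>.Some(42);
        Option<int> missing = Option<int>.None;
        Console.WriteLine(found.Match(v => "got " + v, () => "nothing"));
        Console.WriteLine(missing.Match(v => "got " + v, () => "nothing"));
    }
}
```

Unlike a nullable reference, forgetting the "none" case here is a compile error, not a runtime NullReferenceException.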
blackjack2150 wrote:
In other words, what feature in the language couldn't work without null?
Dunno about the language, but the level of thought in this question might be difficult to describe without the concept of null :laugh:
-
digital man wrote:
How do these people make a living???
By sweeping standing water off sidewalks. Programming is just a hobby for them.
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
-
John Simmons / outlaw programmer wrote:
By sweeping standing water off sidewalks
Isn't that a little too complex? :laugh:
-
leppie wrote:
The Sentinel node is never exposed, and IIRC neither is a Node, they are internal to the implementation and the user should not have to worry about it. You simply use the LinkedList interface (yeah very Java'ish).
Yes, but now you're responsible for defining Sentinels for every data structure that would otherwise use null.
leppie wrote:
Anyways, this was an example of what I got thought (personally I would just go for a null).
Exactly :D
-
eddyvluggen wrote:
And no, I don't want to see a tri-bool-enum like { Yes, No, Empty }
Blecch... one of my least favorite things about SQL is the tri-valued boolean, something that is explicitly disallowed in the C# 2.0 spec, I noticed :)... The C# 2.0 spec also introduced nullable value types, a very interesting concept that I'm certain I will find a use for at some point.
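The nullable value types mentioned above can be sketched briefly: `int?` is shorthand for `Nullable<int>`, a value type with an explicit "no value" state that you opt into at the declaration site, and `bool?` gives the SQL-style three-valued boolean only on request.

```csharp
using System;

class Program
{
    static void Main()
    {
        // int? (Nullable<int>): a value type that can be "empty".
        int? maybe = null;
        Console.WriteLine(maybe.HasValue);            // False
        Console.WriteLine(maybe.GetValueOrDefault()); // 0

        maybe = 42;
        Console.WriteLine(maybe.Value);               // 42

        // The three-valued boolean, available only where declared:
        bool? tri = null;
        Console.WriteLine(tri == true);               // False - null is neither true nor false
    }
}
```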
-
Keyless databases by clueless DBAs!!!!!! :laugh: Oh wait, when I joined my current company, we actually had one of those...created by a junior programmer...we're a two-programmer department; I (a very un-junior programmer) came in after the junior had been gone for eight months, and I was the first to discover that she'd never put any meaningful indices on her largest tables :omg:
-
Daniel Grunwald wrote:
But how often do you use "int?" compared to "int"?
Not often.
Daniel Grunwald wrote:
I think non-nullable types are used more frequently than nullable types.
Non-nullable value-types are used more frequently than nullable value-types. If you're talking about reference-types, well, I do tend to use a lot of references. Sometimes you declare an object just to use it as a reference to another object. You don't want the overhead of assigning a default empty one, just to have it replaced by the 'initial' value. If your only argument is the fact that the programmer might forget to initialize the object, then install FxCop for that particular programmer. C# has the option to give a value on declaration; you might want to build a rule out of that :rose:
modified on Tuesday, October 7, 2008 5:38 PM