Fave Operator of the Day
-
Frankly, I am getting tired of null reference exceptions. I just wish C#/.NET weren't so finicky about null references. For example, the latest issue I had was with a MediaPlayer object, where I close the (possibly) existing player before starting a new one:
player.Stop();
player = new MediaPlayer();
...
But if I haven't already created a MediaPlayer I get a freaking null reference exception. Of course I know that I should know better and test player for null, but my point is I don't CARE if player.Stop() fails - it's not going to adversely affect my function at all anyway - and my argument is that I bet in 80-90% of cases, null reference exceptions that slip through in production code probably would work fine if they were simply ignored, like my example above. Who's with me for demanding that null reference exceptions be ignored by default and only thrown in blocks explicitly marked as such! Lol, just a mini-rant. :P
{o,o}.oO( Did somebody say “mouse”? ) |)””’) -”-”-
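For what it's worth, the guard being grumbled about above is a one-token affair in languages that have a null-conditional operator (C# gained `?.` in a later version). A minimal sketch of the same scenario in TypeScript, which has the identical operator - the `MediaPlayer` interface here is hypothetical, a stand-in for the .NET class:

```typescript
// Hypothetical stand-in for the .NET MediaPlayer class.
interface MediaPlayer { stop(): void; }

let stopCalls = 0;
let player: MediaPlayer | null = null;

function restartPlayer(): MediaPlayer {
  // The null-conditional call: a no-op when player is null,
  // a real stop() when it isn't - no exception either way.
  player?.stop();
  player = { stop: () => { stopCalls++; } };
  return player;
}
```

The first call finds no player and quietly skips `stop()`; once a player exists, the same line really stops it.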
logan1337 wrote:
and my argument is that I bet in 80-90% of cases, null reference exceptions that slip through in production code probably would work fine if they were simply ignored, like my example above.
Are you nuts? :~ You're responsible for telling the computer what to do. If you hand it a null reference, the method call you wrote can't actually be made. If you didn't need it to be made, you shouldn't have written it! Just think - you could have been trying to do anything. Essentially, you're saying that the runtime should plow through catastrophic bugs and attempt to keep executing, on the assumption that major portions of your code aren't actually important anyway! No. No, no, no. This is the road to unpredictable software: bugs that can never be reliably reproduced or tracked back to their original cause; shoddy software that works once, for the single scenario the developer tests under, but trashes the machines of half the end users. :sigh:
----
...the wind blows over it and it is gone, and its place remembers it no more...
-
It is up there, but not equal to the nullable int (not an assignment operator like ??, but a little-known feature and useful nonetheless):
int? myNullableInt = null;
Console.WriteLine((object)myNullableInt ?? "Null");
myNullableInt = 1;
Console.WriteLine((object)myNullableInt ?? "Null");
Outputs: null 1
MrEyes wrote:
Outputs: null
No, that would be "Null" :)
xacc.ide
IronScheme a R5RS-compliant Scheme on the DLR
The rule of three: "The first time you notice something that might repeat, don't generalize it. The second time the situation occurs, develop in a similar fashion -- possibly even copy/paste -- but don't generalize yet. On the third time, look to generalize the approach." -
Hahaha, like software isn't unpredictable already! And I'm not responsible for telling the computer what to do, the computer is responsible for doing what I ultimately want it to do. There's a difference. In a perfect world, where programmers were perfect beings and never made mistakes, what you're saying would be perfectly valid. But just think about it: if the computer (that is, the developers designing the system) took into account the potential for programmers to make mistakes, or simply to FORGET things, things might be a lot different.

I'm not actually proposing this, but I just wanted to get people thinking about it in a different way. We're programming systems from the perspective that programmers ARE JUST LIKE PROGRAMS - that a programmer will write a program as accurately as a computer will execute it. But this isn't the case. Computers need to be designed from an interpretive perspective, like language. The English language (or any spoken language) is full of ambiguous, imprecise statements because that's how humans think. Why should we be forced to communicate with a computer any differently?

... I just read my last sentence and realized I'm opening myself to all kinds of flaming. So I'll cool down and pull back a little. All I want is for a null reference to be like "no object" so that if I execute a method on "no object" NOTHING HAPPENS! Maybe in some kind of debug mode it would give me a warning, or even an exception, but for production code, are we really saying that the program should break just because the code isn't written perfectly? I mean, what's more important here: that the program be perfect, or that the user be able to use it?

In my own defense, newer technologies are actually starting to see the benefit of this. Take WPF for instance. If you databind an object to a WPF control that does not support rendering the object, or cannot interpret it (for example, in a list view hosting various types of objects), the program will not bomb on you. In fact nothing will happen at all, except that you'll get a silent message in the debug output window. Sure, that little part of the program might not end up working, but it doesn't prevent the user from continuing to use the program. In fact it may not even be noticeable to the user, and that's a heck of a lot better than throwing up an ugly Exception dialog, announcing to the world that the software basically sucks. I'd personally like to see a shift towards this kind of "silent" failure over big ugly error dialogs.
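The behavior being wished for here - a "no object" whose methods simply do nothing - doesn't actually need runtime changes; it's the classic Null Object pattern. A hedged TypeScript sketch (all names here are illustrative, not from any real media API):

```typescript
interface Player {
  stop(): void;
  isActive(): boolean;
}

// The Null Object: calling methods on "no player" is safe by construction.
const NULL_PLAYER: Player = {
  stop: () => { /* deliberately nothing */ },
  isActive: () => false,
};

class RealPlayer implements Player {
  private active = true;
  stop(): void { this.active = false; }
  isActive(): boolean { return this.active; }
}

// Code starts with the null object instead of null - no guards needed anywhere.
let current: Player = NULL_PLAYER;
current.stop();             // safe no-op
current = new RealPlayer();
current.stop();             // real effect
```

The difference from silently ignoring null dereferences is that here the no-op behavior is an explicit design decision, visible in the code.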
-
logan1337 wrote:
And I'm not responsible for telling the computer what to do, the computer is responsible for doing what I ultimately want it to do. There's a difference.
You're thinking of a wife. And even those don't always do what you ultimately want. Computers are the beige ones that only do exactly what you tell them to.
logan1337 wrote:
In a perfect world, where programmers were perfect beings and never made mistakes, what you're saying would be perfectly valid.
No, actually, in that case it wouldn't matter because there would be no programs trying to dereference null pointers and therefore all method calls specified could be made. You might not realize this, but the null reference exception is actually a huge convenience, something that a fair amount of runtime logic is devoted to making easier for you to detect and debug.
logan1337 wrote:
The English language (or any spoken language) is full of ambiguous, imprecise statements because that's how humans think.
Wrong. About the "that's how humans think" bit. Of course there is ambiguity in our language. There's ambiguity in the statement you posted earlier as well - player is a variable, an ambiguity, a reference that won't be resolved until runtime. The difference is, humans take initiative - we expect them to work based on past experience, and so upon instructing an employee to turn off the radio and play a CD, this employee can, upon failing to find any radio in the vicinity, look back at his past dealings with you and recall that you are stone-deaf and tend to assume the presence of radios when there are none. The computer does not have this luxury. It doesn't know the difference between a media player, a missile launch controller, or a game of "decorate the cakes". We may, at some point, reach the level of ability that would let us allow and expect value judgments from our machines. But we're nowhere near there now. Trying to allow such things at present would put us in the frustrating position of the simpleton's master, continually dealing with the disastrous effects that result from relying on the common sense of something that has none.
logan1337 wrote:
All I want is for a null reference to be like "no object" so that if I execute a method on "no object" NOTHING HAPPENS!
Now, that's actually a
-
Shog9 wrote:
You might not realize this, but the null reference exception is actually a huge convenience, something that a fair amount of runtime logic is devoted to making easier for you to detect and debug.
I am aware that it is very useful to programmers, but it's not useful to users, when the program they're using stops working for reasons they don't even understand.
Shog9 wrote:
The difference is, humans take initiative - we expect them to work based on past experience, and so upon instructing an employee to turn off the radio and play a CD, this employee can, upon failing to find any radio in the vicinity, look back at his past dealings with you and recall that you are stone-deaf and tend to assume the presence of radios when there are none. The computer does not have this luxury. It doesn't know the difference between a media player, a missile launch controller, or a game of "decorate the cakes".
I tend to think of computers merely as enablers of communication. There is communication between the developer of a system and the programmers utilizing it. It is, after all, the system we are using, built upon a computer, rather than the computer itself. The computer is merely an encapsulation of many, many levels of abstraction layered upon one another. As such, yes, the computer as a MACHINE does exactly and only what it is instructed to do (assuming it is built properly, and ignoring the potential for physical error), but a program is something much more high-level than the computer. The operating system is, after all, itself a program that hosts client programs and abstracts away the lower-level machine.
Shog9 wrote:
Someone else in this thread proposed the addition of a new operator, ?.
I was, in fact, one of the gentlemen who brainstormed on this in another thread[^]. :)
Shog9 wrote:
both require the programmer to explicitly specify that it's ok to do nothing in null cases.
I'm not proposing otherwise. My desire is to have the behavior default to what I mentioned, but still allowing the programmer to explicitly override this for a block of code.
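The default-lax, opt-in-strict behavior argued for in this post can at least be approximated in ordinary library code, without touching the runtime. A speculative TypeScript sketch - `callIfPresent` and `strictNulls` are invented names, purely for illustration:

```typescript
// Speculative sketch: silent-by-default, strict-on-demand null handling,
// done in library code rather than in the runtime.
let strictNulls = false; // flip on for debug builds

// Invoke fn on target if it exists. When target is null/undefined:
// strict mode throws (today's behavior), lax mode silently does nothing.
function callIfPresent<T>(target: T | null | undefined, fn: (t: T) => void): boolean {
  if (target == null) {
    if (strictNulls) throw new Error("null reference");
    return false; // the call quietly became a no-op
  }
  fn(target);
  return true;
}
```

A caller would write something like `callIfPresent(player, p => p.stop())` instead of an unguarded `player.stop()`.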
-
Jim Crafton wrote:
Is this just a C# thing or is there a VB equivalent (I'm almost afraid to ask what that monstrosity would look like)?
Monstrosity you say? In VB? Unthinkable! :rolleyes: Classic VB (and VB.NET) has a ternary operator... except, it isn't really an operator. The IIF() function takes three arguments; if the first is true it returns the second, otherwise it returns the third. Unlike the C++/C# ternary operator, this will always evaluate all three expressions (being a function call rather than an operator, it has to). This provided yet another pitfall when using VB, as expressions such as:
someVar = IIF(boolVar, HorriblyDestructiveCall1(), EvenMoreDestructiveCall())
...would end up trashing whatever global state you were manipulating with the two functions twice, once for each call (remember, this is VB - of course there's a horrible, fragile, global state of some sort). VB9 now provides a true ternary operator - If. So you can write:
someVar = If(boolVar, HorriblyDestructiveCall1(), EvenMoreDestructiveCall())
... and only one of the possible functions will be called. Or, for the null-coalescing version:
someVar = If(possiblyNothing, BetterThanNothing())
Good times...
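The IIF-versus-If distinction above boils down to eager versus short-circuit evaluation, which can be demonstrated in any eager language. A small TypeScript sketch (the destructive-call names just echo the joke above):

```typescript
let calls: string[] = [];

function horriblyDestructiveCall1(): number { calls.push("first"); return 1; }
function evenMoreDestructiveCall(): number { calls.push("second"); return 2; }

// IIF-style: an ordinary function, so BOTH argument expressions are
// evaluated before the function body even runs.
function iif<T>(cond: boolean, a: T, b: T): T { return cond ? a : b; }

iif(true, horriblyDestructiveCall1(), evenMoreDestructiveCall());
// Both side effects fired: calls is ["first", "second"].

calls = [];
// A true conditional operator evaluates only the chosen branch:
const someVar = true ? horriblyDestructiveCall1() : evenMoreDestructiveCall();
// Only one side effect fired: calls is ["first"], someVar is 1.
```

The same asymmetry is why wrapping destructive calls in a function-based IIF runs both of them, while VB9's If operator (like C#'s `?:` and `??`) runs only one.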
All of those examples remind me of the days I was an employee. I don't have those examples anymore. Feels good. Feels really good. In fact I'll walk out my office door and go lay on my bed for a minute and savor my daily commute, it feels so good. Then I'll get some food right out of my kitchen, right next to my office. A bit of honey-encrusted cured ham on some black bread with yellow mustard and sharp cheddar cheese. Yeah... I really miss being the turd-licking employee that had to ponder if (possiblyNothing, BetterThanNothing()) that had to ask about everyone's Friday plans because that's all you look forward to as an employee is Friday. All of the above was said in PureJest version 2.0 running against the Sarcasm Framework 3.5 that was just updated on my Complete Cynic OS version 711. :laugh:
-
Jim Crafton wrote:
is there a VB equivalent
Who cares ;)
WPF - Imagineers Wanted Follow your nose using DoubleAnimationUsingPath
norm .net wrote:
Who cares ;)
When are you running for office? You'll get my vote. VB just sucks. Even those wankees that defend it suck. Why use that language at all? Ugh! I'd rather have dysentery.
-
logan1337 wrote:
My desire is to have the behavior default to what I mentioned, but still allowing the programmer to explicitly override this for a block of code.
Dereferencing null is an error on the part of the programmer. Careful programmers take steps to avoid making this error - and i'm all for language or library enhancements that make doing so easier. But making the default behavior more forgiving is just adding another pitfall for careless programmers.
logan1337 wrote:
Finally, when I say "silent" I really mean silent from the perspective of the end user, and not so much just because it's annoying as for the fact that it may in fact, render a program completely useless by accident.
You are perfectly free to change the behavior of null reference errors when it comes to end-users. I did. Obviously, the end-user isn't equipped or inclined to do anything useful about it - so make it obvious to them that it's not their fault, it's yours - and then request that they allow a report to be sent enabling you to fix the problem you caused for them. As far as rendering the program useless - there's no guarantee whatsoever that skipping the line that caused the problem wouldn't cause further problems. In fact, it's entirely possible that it will, given that the internal state of the program is not what the programmer writing the code thought it would be.
logan1337 wrote:
By making the default behavior more tolerant, such "accidental" cases would be completely normal.
You know, there has been at least one set of languages and platforms that allowed this: MS BASIC, VB (and derivatives) had the
On Error Resume Next
command to put execution of a routine in just such a mode. It made testing and debugging hell. I've wasted more time trying to track down errors distorted or masked by this construct than i care to think about, and to this day it is the primary reason why i detest "classic" ASP development. Errors Are Errors - masking them doesn't make them less erroneous, it just makes them harder to notice. To use an analogy: double-entry bookkeeping is tedious and hard to understand. Yet, it has been popular for many, many years for the simple fact that it makes certain types of errors more obvious. Obvious errors get fixed. Subtle errors often don't.
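The failure mode being described - a masked error letting later code run against state that was never established - can be sketched in a few lines. A TypeScript illustration, where `resumeNext` is an invented stand-in for `On Error Resume Next`, not a real API:

```typescript
// Invented stand-in for VB's On Error Resume Next: swallow whatever
// the step throws and just keep going.
function resumeNext(step: () => void): void {
  try { step(); } catch { /* error masked */ }
}

let total = 0;

// Step 1 was supposed to establish a starting balance, but it fails...
resumeNext(() => { throw new Error("load failed"); /* would have set total = 100 */ });
// ...and step 2 happily runs anyway, against state step 1 never set up.
resumeNext(() => { total += 1; });
// total is now 1 instead of the intended 101 - a quietly wrong answer,
// and no error was ever reported anywhere.
```

The wrong number surfaces far from the masked failure, which is exactly why such bugs are so hard to trace back to their cause.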
-
code-frog wrote:
Yeah... I really miss being the turd-licking employee that had to ponder if (possiblyNothing, BetterThanNothing()) that had to ask about everyone's Friday plans because that's all you look forward to as an employee is Friday.
:laugh: Yeah, maybe i'll have to bump up the optimism in my example code a bit... ;P
-
Shog - I get the feeling that Grand Negus is coming back here. Don't descend into the morass. :laugh:
Deja View - the feeling that you've seen this post before.
-
Not for VB. Nothing is optimistic if you have to use VB - except for "End"; that's optimistic.
-
Yup. I'd been using C# for 5 years, only to take a job where the department standard is VB.NET. I have my revenge by using my scripter (I wrote it myself) that will make a project in either language. I have all of the VB traps (things that don't have a C# equivalent, or vice versa) already coded to use the proper equivalent - and then I don't deviate without good cause. One day the rest of the bottom-feeders will finally learn C#.
-
I still get the impression you don't understand my point, because I don't disagree with what you're saying here. Mistakes are mistakes. Granted. I'm not suggesting otherwise. All I'm saying is that I don't like the way that *I* as a programmer am forced to write my program in adherence with the concept that any unexpected condition will throw an exception and therefore must be explicitly handled (i.e. more work for me). In other words, I'm arguing that the way I originally wrote the program was in fact, not flawed, because of my (admittedly incorrect) subconscious disregard for null objects and the refusal to accept the way things are, and that, if my desired mechanism were, in fact, implemented, then there wouldn't BE a problem to begin with. If the call to player.Close() was crucial, even in the case when player is null (or rather, since this is impossible, the best it could do is indicate an earlier problem), then I would either explicitly mark it as such, or observe this in the debugger output. Can you at least grant me that it should be my choice, even if you do not agree with it? :rose: And don't tell me to switch to VB. :mad: :laugh:
-
logan1337 wrote:
All I'm saying is that I don't like the way that *I* as a programmer am forced to write my program in adherence with the concept that any unexpected condition will throw an exception and therefore must be explicitly handled (i.e. more work for me).
Well, you don't have to. I mean, that's kind of the point of using Exceptions for error handling rather than, say, checking return values or global flags - you can write out the ideal code without regard for error conditions, and then toss in a few high-level handlers and rely on them to just sweep unusual scenarios under the rug. Heck you can totally skip handling errors altogether if you want, and never have to worry about them compounding in the way that, say, C-style return value errors would (one call failing undetected leading to the next call failing because of the previous failure, leading to the next...).
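(A minimal sketch of that contrast - the TryParsePositive/ParsePositive names are hypothetical, not from any real API. With status codes, every caller owns every check; with exceptions, the happy path reads straight through and a single handler sweeps up whatever failed below it:)

```csharp
using System;

public static class ErrorStyles
{
    // C-style: every call returns a status code the caller must remember to check,
    // or one undetected failure cascades into the next call.
    public static int TryParsePositive(string s, out int value)
    {
        if (!int.TryParse(s, out value)) return -1;  // parse failed
        if (value <= 0) return -2;                   // out of range
        return 0;
    }

    // Exception style: write the ideal code; anything unusual throws.
    public static int ParsePositive(string s)
    {
        int value = int.Parse(s);                    // throws FormatException on bad input
        if (value <= 0) throw new ArgumentOutOfRangeException(nameof(s));
        return value;
    }

    // One high-level handler catches every failure mode at once.
    public static string Describe(string s)
    {
        try { return "got " + ParsePositive(s); }
        catch (Exception ex) { return "failed: " + ex.GetType().Name; }
    }
}
```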
logan1337 wrote:
In other words, I'm arguing that the way I originally wrote the program was in fact, not flawed, because of my (admittedly incorrect) subconscious disregard for null objects and the refusal to accept the way things are, and that, if my desired mechanism were, in fact, implemented, then there wouldn't BE a problem to begin with.
I'll allow that, in your very specific case, that's probably true. But i don't think it's wise to draw from that the conclusion that all or even most null references are harmless and can be safely ignored. In fact, the conclusion i'd draw would be that the system should be designed with deterministic finalization in mind, so that explicitly Close()ing the player isn't necessary even when the object does exist. And of course it's your choice. That's been my point all along - you're free to implement whatever system you like to avoid null references. If that works for you, then good. But it shouldn't be the general case, the default for everyone. Trust me - that way lies madness.
----
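(In C#, deterministic finalization is spelled IDisposable plus the using statement. A hedged sketch with a hypothetical Player wrapper: cleanup runs exactly when the scope exits, so there is no separate Stop()/Close() call left around to guard against null:)

```csharp
using System;

// Hypothetical player wrapper: cleanup lives in Dispose(), not in scattered Stop() calls.
public class Player : IDisposable
{
    public bool Closed { get; private set; }
    public void Play(string file) { /* play the file */ }
    public void Dispose() => Closed = true;   // deterministic: runs when the using block exits
}

public static class UsingDemo
{
    public static Player Run()
    {
        Player p = new Player();
        using (p)                 // Dispose() is guaranteed at the closing brace,
        {                         // even if Play() throws
            p.Play("song.mp3");
        }
        return p;                 // by here, Closed is true - no null check ever needed
    }
}
```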
...the wind blows over it and it is gone, and its place remembers it no more...
-
Ok, I can leave it at that. ;P Cheers.
{o,o}.oO( Did somebody say “mouse”? ) |)””’) -”-”-
-
norm .net wrote:
Jim Crafton wrote:
is there a VB equivalent
Who cares
People in jobs that require the use of VB. >.<
-
I'm going to have to go ahead and mention that almost 100% of the time the object in question needs to be used for the code to execute properly.
-
Is this just a C# thing or is there a VB equivalent (I'm almost afraid to ask what that monstrosity would look like)?
The devil is in my pants! Look, look!
Real Mentats use only 100% pure, unfooled around with Sapho Juice(tm)!
SELECT * FROM User WHERE Clue > 0
0 rows returned
Save an Orange - Use the VCF! VCF Blog
return _cachedItem ButIfIsNothingThenInstead (_cachedItem = GetItem())
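(For anyone decoding the joke: it spells out what C#'s null-coalescing operator does in a lazy-cache idiom. A sketch with hypothetical _cachedItem/GetItem names - `a ?? b` returns `a` if it is non-null, otherwise evaluates and returns `b`, so the fetch-and-store runs at most once. VB.NET's nearest real spelling is the two-argument `If(a, b)` operator, which short-circuits the same way, though the inline assignment there still needs a statement:)

```csharp
using System;

public static class Cache
{
    private static string _cachedItem;     // hypothetical cached value, starts out null
    public static int Fetches;             // counts how often the slow path actually runs

    private static string GetItem()        // hypothetical expensive lookup
    {
        Fetches++;
        return "expensive result";
    }

    // The pattern the joke is spelling out: use the cache if present,
    // otherwise fetch, store, and return - all in one expression.
    public static string Item
    {
        get { return _cachedItem ?? (_cachedItem = GetItem()); }
    }
}
```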
:)
Luc Pattyn [Forum Guidelines] [My Articles]
this month's tips:
- before you ask a question here, search CodeProject, then Google
- the quality and detail of your question reflects on the effectiveness of the help you are likely to get
- use PRE tags to preserve formatting when showing multi-line code snippets