Stupid code
Pretty normal mistake, although it's kind of weird that it made it into a release, unless this bit of code only runs when some very obscure corner case or feature is hit... I'm sure there is some good reason ($$$) why the C# team hasn't implemented it, but I still wonder why the compiler doesn't throw a warning in such cases...
-
Will the TSA allow these on planes?
W∴ Balboos wrote:
Oh, yeah, one more thing about the hydrogen fuel-cell economy (particularly for vehicles) that is so often touted as the clean-energy answer. Where are we to get this hydrogen? Sure, it's very abundant on Earth: in the form of water. Electrolysis, one way or another, requires energy in excess of that which the hydrogen will eventually return on its way back to water. Where's that to come from? That raises the question: why not use the energy at that point and save the loss of the conversion step? Where-does-it-come-from is usually left out of the discussion because it's an inconvenient question [and thermodynamics is a rather dull branch of chemistry.]
That is not a well-thought-out argument at all; you are repeating a worn-out talking point without really stopping to think about the issue. Hydrogen fuel cells are precisely one of the convenient solutions for the coming "era" of renewable energies, mainly wind power, solar power and, to an extent, run-of-river hydroelectric plants, whose output cannot be stored and is thus wasted if there is no demand for power when it is available.

The issue with energy is that we are not very good at storing it. Thermal energy does not store well at all unless you incur massive costs, and neither does electric energy (batteries are the best we can do). The only kind of storage we manage fairly well is potential energy, but it's expensive: hydroelectric plants are essentially potential-energy storage systems. The big advantage of hydrogen fuel cells is that they are possibly the best "portable" energy storage system we have, way better than our current batteries and a whole lot cleaner.

What you seem to be completely unaware of is that in most first-world countries there is a massive energy surplus in certain time frames (mainly at night). As shutting down conventional or nuclear thermal plants is too expensive, these are normally kept running, and some large power consumers are even financially encouraged to draw energy in order to keep the electric system balanced (some power plant generators are even run in motor mode to add to the demand). Add to this surplus all the wind power that is not being used at night and you have a huge amount of energy that is in most cases "wasted" because it can't be stored (wind). There is the power you need for hydrogen at zero additional environmental cost. Obviously there is a
-
Just got an email rejection to my job application........
A friend of mine had a similar response a few years after submitting his resumé. The funny part was that he was already an employee of the company when he got the reply... :doh:
-
Why I don't use Apple products
Again, I never said Apple invented anything... what I am saying is that Apple has consistently been able to come up with designs that please people. There is genius at work there, no matter how you look at it. Once can be chance or luck, but they have done it more than a few times.
Vivic wrote:
Don't tell me that Steve Jobs brought the graphical user interface to the masses. His first graphical-interface computer, the Lisa, cost $10,000! It was the price drop in components that enabled Apple to reduce the price of the Mac to levels acceptable even to Apple fanboys.
"Apple fanboys..." hmmm, I'm wasting my time with you as it is clear where you stand. Anyhow I think you really dont know what you are talking about and are just spitting stuff out you've read without really bothering to look into it. Yes, Xerox was the first graphical interface and yes, Apple's Lisa was expensive, where did I ever say the contrary? The fact remains that the Macintosh 128K was the first comercially succesful computer with graphical and mouse interface. I don't seem to remember any Xerox PARC's Alto computers at my schools in 1985-1988 or at any home but I do seem to remember a whole lot of Macintosh(no PCs at all either, I started to see those in the very early 90s). Your point? If it was simply because hardware was getting cheaper then why wasn't it Xerox or IBM or whoever who made the breakthrough instead of Apple? Don't you see the trend? Jesus, even Pixar was the first one to ever produce a major breakthrough in computer animated movies. Again, who was at the owner of Pixar back then? Jobs decided to sell all the hardware branch of Pixar and center all resources in the animation department, becuae that is where he saw the future of the company. And strangely enough, he was right, AGAIN. Did he make Toy Story? Absolutely not, but I'm quite convinced that he was responsible for focusing the company in order to make it possible. But whatever, Apple and Jobs are just hype: 1. They have worse products than most competitors. 2. They are a whole lot pricier. 3. They haven't brought anything original to the industry, they just copy everything all the time. I find it kind of curious, how if 1,2 and 3 are true they can still be so succesful? Oh I know, it is thanks to all those stupid Apple fanboys....oh but wait, how did Apple ever manage to have so many Apple fanboys to begin with? I'll keep saying it again, Job's genius wasn't about inventing...it was all abou
-
Why I don't use Apple products
You can go ahead and say that Apple didn't invent anything or bring any new technology to the world, but that is beside the point. Steve Jobs's legacy is not about inventing; it's about seeing what people were demanding and giving it to them just right, even before people knew they needed it. That is where his greatness lies.

1. The first company to actually get a computer with a graphical interface and a mouse into about every school and many homes was, like it or not, Apple. I studied at the AISN (American School of Nouakchott, freaking Mauritania) and I remember those Apples at my school in 1985. Who else was doing something similar then? Was it new technology? Absolutely not, but he was the first to actually make it something people would want to need (soon after, the world decided he was right and everything ended up going in that direction).

2. iPod: Was it the first mp3 player? No. But still, he just got right what people wanted and he gave it to them. If there is no genius at work there, then why didn't anyone do it before him? Why was the iPod so successful when other devices weren't? Was it gifted to Apple?

3. iPhone: Again, no new technology, but like it or not it changed the smartphone panorama entirely. It became the benchmark for all smartphones. Right now it's kind of funny to read all the anti-Apple fanatics talking about their phones... it's all about comparing them to the iPhone and how theirs is so much better. I keep wondering why the iPhone is still the benchmark and the phone to beat. Was the success of the iPhone also gifted to Jobs and Apple? Or did he maybe get it just right AGAIN and deliver what everyone wanted?

4. iPad: lol, I won't even go into that. Nothing new either, but again, one step ahead of everyone and delivering something people wanted even before they knew they did. It's funny to see how the rest have floundered miserably trying to bite into the iPad's dominion. Is it a better product than the rest? Probably not, but it has the best advantage it can ever have: a head start. There is greatness again. The rest have to play catch-up.

You see, you can say whatever you want; the numbers and facts don't back you up. There is just one undeniable fact that, no matter how much you squirm, you will never get around, and that is the undeniable success Apple has had. And most of it is due to the "taste" and "timing" Jobs had when supervising the design and the strategic decisions of the company. Obviously his greatness has nothing to do with how
-
From where should the index start?
Collin Jasnoch wrote:
When you count apples you DO start with 0. You used the word APPLE, no? In computers, that is where 0 starts: the definition. In human tongue it is an implicit definition. Computers must work in concrete terms. You can't change that. Yes, you can abstract it, but at a cost in efficiency.
That is a really weird way to understand it. We live in a 1-based indexing world. You said it yourself: we have the luxury of being able to omit the 0th element in our lives precisely because it means NOTHING to us. Quite the contrary in the programming world, where the 0th element and the nth element are equally meaningful.
-
From where should the index start?
That I honestly would not like at all. I think a bad standard is better than no standard at all. It would be a nightmare if, every time you had to get your hands dirty in some codebase, you had to factor in the index base whoever wrote the code decided to use on that given day/project/etc.
-
From where should the index start?
As I posted before, your reasoning IMHO is not solid. The fact is that array[0] is a meaningful item in a zero-based indexing system, while in our everyday life 0 is the contrary: it can be omitted, as you well say, precisely because it is not meaningful. We live in a 1-indexed world, like it or not. And I absolutely disagree about the supposed overhead paid when the compiler has to interpret array[1] as the first item of the array (that is, offset zero in the underlying pointer) when it spits out machine code, CIL or what have you. Why should there be any performance issue at all? Let the compiler transform everything to the zero-based indexing the computer natively understands at compile time (we may have a 1 ms compile-time overhead in there somewhere...). Or should we also start writing our programs in assembly code just to make the computers understand us better? Obviously this is a hypothetical discussion. We are stuck with zero-based indexing, and it is after all trivial to understand how it works.
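Just to illustrate the point (a quick sketch of my own, not from the thread): a thin 1-based wrapper over a plain C# array pushes the offset adjustment into an indexer, so the whole "cost" is a single constant subtraction that the JIT folds into the address computation; a compiler for a 1-based language would do the same translation statically.

using System;

// Hypothetical OneBased<T>: exposes items[1]..items[Length] over a zero-based T[].
struct OneBased<T>
{
    private readonly T[] items;

    public OneBased(T[] items) { this.items = items; }

    // 1-based on the outside, 0-based on the inside.
    public T this[int index]
    {
        get { return this.items[index - 1]; }
        set { this.items[index - 1] = value; }
    }

    public int Length { get { return this.items.Length; } }
}

class OneBasedDemo
{
    static void Main()
    {
        OneBased<string> apples = new OneBased<string>(new[] { "first", "second", "third" });
        Console.WriteLine(apples[1]);             // first
        Console.WriteLine(apples[apples.Length]); // third: the last item, no -1 dance
    }
}

Note how the last item is apples[apples.Length], which is exactly the "natural counting" argument made elsewhere in this thread.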
-
From where should the index start?
From your post you seem to be implying we should just do without levels of abstraction. So what if the compiler has to do some dirty work for you just because you start at 1 and not at 0? I couldn't care less; there is no meaningful cost anyhow. Also, in the way we count apples, apple nº0 is not meaningful; it is not an apple. Having 0 apples means you have... well, no apples at all. We start counting at 1 because that is the first meaningful number when we have apples to begin with; that's why we count all the way up to apple nº10 and do not stop at apple nº9 (apple nº9 is the 10th apple... hmm, isn't that obvious? Now try explaining that to someone who has never programmed and you'll really see how intuitive zero-based indexing is).
-
From where should the index start?
If it were to be a clean start, I'd definitely have the index start at '1'. As to why, the reason is simple: it's the way we have counted since we were 2 years old. How does a small kid learn his numbers? I find it hard to imagine some little boy counting 0, 1, 2, ... We start at 1, and do so even when we are grown-ups. Zero-based indexing is just a legacy of array pointers and offsets. It doesn't help in any useful way in modern languages; it only makes things more confusing, it's cumbersome to reference the last item of an array, and it doesn't agree with how we normally count things. When you go to a grocery store and ask for 10 apples, you expect to get ten apples, them being apple nº1, apple nº2, ..., apple nº10, definitely not apple nº0, ..., apple nº9. So why, when you "buy" a 10-item array, do you get the latter and not the former? WTF?!?
-
Safest Restaurant on Earth
Well, it's pretty obvious, isn't it? The link between "safest" and no Muslims. I wonder if a similar sign referencing some other American society "minority" would be welcomed with laughs and chuckles too...
-
Comment Switching
Lol, sorry, I actually misread and thought it was
public const string MyString = string.Empty
Sorry about that.
-
Comment Switching
Actually, that wouldn't compile, I think.
string.Empty
is a readonly field, not a compile-time constant, so you can't initialize a const with it, as its value could theoretically change at runtime.
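A quick sketch of the difference (hypothetical names, just to show what the compiler accepts):

using System;

class ConstVsReadonlyDemo
{
    // public const string A = string.Empty; // error CS0133: a const initializer
    //                                       // must be a compile-time constant
    public const string B = "";                     // OK: "" is a literal constant
    public static readonly string C = string.Empty; // OK: readonly is set at run time

    static void Main()
    {
        Console.WriteLine("B: '{0}', C: '{1}'", B, C);
    }
}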
-
IDisposable and CA2000 warning during Vs2010 Code Analysis.
I'm sorry, but that is wrong. A using statement is just syntactic sugar for a
try finally
block. What you are proposing translates to:

Bar bar = null;
try
{
    bar = new Bar();
    ...
    return new FooX(bar);
}
finally
{
    if (bar != null)
    {
        bar.Dispose();
    }
}

When the method gets to the return statement, it will execute the finally clause before exiting the method's scope, returning a FooX object with a disposed bar. If you are not convinced, execute the following code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Tests
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Creating FooA 'fooA' from GetFooAWithUsingStatement.");
            FooA fooA = FooProvider.GetFooAWithUsingStatement();
            Console.WriteLine("'fooA' internal bar is disposed: {0}", fooA.InternalBarIsDisposed);
            Console.WriteLine("Disposing 'fooA'");
            fooA.Dispose();
            Console.WriteLine("'fooA' internal bar is disposed: {0}", fooA.InternalBarIsDisposed);
            Console.WriteLine();
            Console.WriteLine("Creating FooA 'fooA' from GetFooWithoutUsingStatement.");
            fooA = FooProvider.GetFooWithoutUsingStatement();
            Console.WriteLine("'fooA' internal bar is disposed: {0}", fooA.InternalBarIsDisposed);
            Console.WriteLine("Disposing 'fooA'");
            fooA.Dispose();
            Console.WriteLine("'fooA' internal bar is disposed: {0}", fooA.InternalBarIsDisposed);
            Console.Write("Press a key to exit.");
            Console.ReadKey();
        }
    }

    class Bar : IDisposable
    {
        bool disposed;
        public void Dispose() { this.disposed = true; }
        public bool IsDisposed { get { return this.disposed; } }
    }

    static class FooProvider
    {
        public static FooA GetFooAWithUsingStatement()
        {
            // 'bar' is disposed by the using statement before the method returns.
            using (Bar bar = new Bar())
            {
                return new FooA(bar);
            }
        }

        public static FooA GetFooWithoutUsingStatement()
        {
            Bar bar = new Bar();
            return new FooA(bar);
        }
    }

    class FooBase : IDisposable
    {
        bool disposed;
        Bar bar;

        internal FooBase(Bar bar) { this.bar = bar; }

        public bool InternalBarIsDisposed { get { return this.bar.IsDisposed; } }

        public void Dispose()
        {
            if (!this.disposed)
            {
                this.bar.Dispose();
                this.disposed = true;
            }
        }
    }

    class FooA : FooBase
    {
        internal FooA(Bar bar) : base(bar) { }
    }
}
-
IDisposable and CA2000 warning during Vs2010 Code Analysis.
Thanks for the reply, but that is not a valid solution. If I do that, GetFooX() will be returning an invalid FooX object (disposed bar).
-
IDisposable and CA2000 warning during Vs2010 Code Analysis.
Hi all, I need some advice here; I hope somebody can help me. I have the following class structure (simplified):
public class Bar : IDisposable {...}

public abstract class FooBase : IDisposable
{
    Bar bar;
    bool disposed;

    internal FooBase(Bar bar)
    {
        this.bar = bar;
    }

    public void Dispose()
    {
        if (!this.disposed)
        {
            this.Dispose(true);
            GC.SuppressFinalize(this);
            this.disposed = true;
        }
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            this.bar.Dispose();
        }
    }
}

public class FooA : FooBase {...}
public class FooB : FooBase {...}

public static class FooProvider
{
    public static FooA GetFooA()
    {
        Bar bar = new Bar();
        ...
        return new FooA(bar);
    }

    public static FooB GetFooB()
    {
        Bar bar = new Bar();
        ...
        return new FooB(bar);
    }

    ...
}

When I run Code Analysis on this, I get CA2000 warnings on all the 'GetFooX()' methods of the FooProvider class. The warning gives the following message: "Microsoft.Reliability: In method 'FooProvider.GetFooX()', call System.IDisposable.Dispose on object 'bar' before all references to it are out of scope." Microsoft recommends never to suppress this warning, but I'm not really sure it is warning about a real problem in the code. True, 'bar' is not disposed before going out of scope in whichever 'GetFooX()' method we consider, but a reference to it lives on in the 'FooX' object, which will eventually get disposed and will in turn take care of disposing 'bar'. Am I misunderstanding how the Dispose pattern should work (is there a fundamental flaw in my code), or should I just suppress this warning? Thanks for any advice.
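For what it's worth, the pattern that usually satisfies CA2000 in factory methods like these, without suppressing it, is to make the ownership transfer explicit: dispose the intermediate object only if building the outer object fails. A sketch adapted to the code above (not necessarily what the poster ended up doing):

public static FooA GetFooA()
{
    Bar bar = null;
    try
    {
        bar = new Bar();
        FooA foo = new FooA(bar);
        bar = null; // ownership transferred to 'foo'; don't dispose it here
        return foo;
    }
    finally
    {
        if (bar != null)
        {
            bar.Dispose(); // only reached if something threw before the hand-off
        }
    }
}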
-
Logic
Jesus, why do people pretend to know the absolute truth about things without even bothering to do some minimal research? What you are stating is completely wrong. The
&
operator is an OVERLOADABLE operator. As such, it has predefined behaviours for integral types and boolean types.
(int & int)
IS NOT THE SAME AS
bool & bool
. The first performs a logical bitwise AND operation while the latter performs a LOGICAL AND operation. There is no bitwise operation at all if the operator is dealing with two booleans. It is exactly the same as
bool && bool
except that both terms are evaluated no matter what the first expression evaluates to. If you are not convinced, then please read the following MSDN C# reference: http://msdn.microsoft.com/en-us/library/sbf85k1c.aspx or, better yet: http://msdn.microsoft.com/en-us/library/2a723cdk.aspx
-
Logic
The
&&
operator should be used instead of the
&
operator for clarity's sake, because it's the standard way of doing things. Also, the
&&
operator gives you some type safety that the
&
operator does not when dealing with LOGICAL ANDs:
(4 && 5)
will not compile, but
(4 & 5)
will. If you intend to perform a LOGICAL AND operation, you could end up performing a LOGICAL BITWISE AND operation instead by mistake. But in any case, the reason you are giving, "(...) In C#, logical operators always short-circuit (...)", is so wrong I don't even know where to begin. The operator
&
is an overloadable operator. Integral and boolean types have their own predefined
&
binary operators:
&(int, int)
computes the logical bitwise AND of its operands, while
&(bool, bool)
computes the logical AND of its operands; that is, the result is true if and only if both of its operands are true. This is all straight from the C# specification: http://msdn.microsoft.com/en-us/library/sbf85k1c.aspx
So
bool & bool
is NOT A BITWISE OPERATION at all. It's a normal LOGICAL AND OPERATION where both terms are always evaluated, contrary to
bool && bool
where the second term is evaluated if and only if the first term is true.
-
Logic
Really? And I always thought that '&&' was simply a short-circuited '&'. I must go RTFM. :sigh:
-
May be bad code or May not be!!!
In any "sane" scenario it's pretty obvious that if A==B and B==C then A should equal C, and therefore DoABC() will never be called. Still, with no more information, it is plausible (even if hideous) to think that DoABC() could be called. It just needs some dubious implicit cast operators to do the trick.
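A sketch of how that can happen (my own example; the post points at implicit casts, but an overloaded == operator is the most direct way to show the same non-transitivity):

using System;

// "Equal" here means "within 1.0 of each other", which is deliberately non-transitive.
class Fuzzy
{
    public double V;
    public Fuzzy(double v) { this.V = v; }

    public static bool operator ==(Fuzzy x, Fuzzy y) { return Math.Abs(x.V - y.V) < 1.0; }
    public static bool operator !=(Fuzzy x, Fuzzy y) { return !(x == y); }

    public override bool Equals(object o) { return o is Fuzzy && this == (Fuzzy)o; }
    public override int GetHashCode() { return 0; }
}

class TransitivityDemo
{
    static void Main()
    {
        Fuzzy a = new Fuzzy(0.0), b = new Fuzzy(0.6), c = new Fuzzy(1.2);
        if (a == b && b == c && a != c)
        {
            Console.WriteLine("A==B and B==C, yet A!=C: DoABC() would run."); // and it prints
        }
    }
}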