bjarneds
Posts
-
Folders and Namespaces vs Multiple Assemblies
Namespaces relate to the logical organisation of the code. Assemblies relate to deployment, and can be governed by things like a plugin architecture, the need for patch updates, versioning, reuse between different applications, etc. Those are two different things, just like in UML, where you can have both a logical view and a deployment view.
:'-( Relying on the GC to clean up after classes that implement IDisposable may be dangerous - in some cases such a class may never be garbage collected, which in practice leads to memory leaks. Consider a class that, when instantiated, subscribes to an event from .NET's SystemEvents class (a GUI application might e.g. subscribe to UserPreferenceChanged). Because these events are static, the delegate holds a reference to the class that lasts for the entire lifetime of the application, i.e. it will prevent the class from being GC'ed until it unsubscribes from the event. In other words, the user of the class has to tell it when it no longer needs to subscribe to such an event - and the obvious way to do that is by calling Dispose. Now, you could argue that not all classes that implement Dispose suffer from this, but how would you know (and how would you know that a future version of the class won't require it either)? And should you have to know - isn't it the purpose of encapsulation to ensure that you don't have to know about implementation details like this? For these reasons, in my opinion it should not only be best practice, but required, that you call Dispose on a class that implements IDisposable.
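A minimal sketch of the scenario described above. The class and event names are made up for illustration - a plain static event stands in for SystemEvents.UserPreferenceChanged so the sketch is self-contained:

```csharp
using System;

// Stands in for a static event source like SystemEvents (assumption for this sketch).
static class FakeSystemEvents
{
    public static event EventHandler? PreferenceChanged;
    public static int SubscriberCount =>
        PreferenceChanged?.GetInvocationList().Length ?? 0;
}

// Hypothetical class: subscribing to a static event keeps this object reachable
// for the lifetime of the application, so it must unsubscribe in Dispose -
// the GC alone will never collect it while it is subscribed.
sealed class PreferenceWatcher : IDisposable
{
    public PreferenceWatcher() => FakeSystemEvents.PreferenceChanged += OnChanged;
    private void OnChanged(object? sender, EventArgs e) { /* react to the change */ }
    public void Dispose() => FakeSystemEvents.PreferenceChanged -= OnChanged;
}

class Program
{
    static void Main()
    {
        using (var watcher = new PreferenceWatcher())
        {
            // 1: the static event holds a reference to the watcher
            Console.WriteLine(FakeSystemEvents.SubscriberCount);
        }
        // 0: Dispose removed the reference, so the watcher can now be collected
        Console.WriteLine(FakeSystemEvents.SubscriberCount);
    }
}
```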
-
How can I turn on a computer when it is turned off?
I think it is called something like "Wake on RTC" (RTC = Real Time Clock, i.e. the clock on the motherboard, see http://en.wikipedia.org/wiki/Real-time_clock[^]). An "alarm time" can be programmed into this (see http://en.wikipedia.org/wiki/RTC_Alarm[^]). Media Center software usually has this feature, so it can wake up and perform the next scheduled recording.
-
[Connect bug] VC++ 2010 : C++/CLI does not support variant delegates/interfaces
Joe Woodbury wrote:
Other than Nish, here on CP, and myself, I don't know a soul who has written any C++/CLI code, even to try it out.
I've used it - because I had to. See .NET Object Spy and InvokeRemote[^].
-
This is why I am starting to loathe programming
Yes, I believe so (I don't have practical experience with weak references, though). Thinking about it, it may also be possible to solve the problem in this case by subscribing/unsubscribing when the visibility of the form changes. But this only works because it is a form/control, where visibility can be used as a replacement for the client (owner form) telling it when it is no longer used. In other classes, you may not have this option.
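For the weak-reference alternative mentioned above, here is a rough sketch of the idea (all names are made up, and a real implementation such as a weak event pattern has more moving parts): the publisher keeps only a WeakReference to the subscriber, so the event no longer keeps the subscriber alive.

```csharp
using System;

// Minimal weak-subscription sketch (assumption: a single subscriber).
class WeakPublisher
{
    private WeakReference<Subscriber>? target;

    public void Subscribe(Subscriber s) => target = new WeakReference<Subscriber>(s);

    public void Raise()
    {
        // Only invoke the handler while the subscriber is still alive;
        // the weak reference does not prevent the subscriber from being collected.
        if (target != null && target.TryGetTarget(out var s))
            s.OnEvent();
    }
}

class Subscriber
{
    public int Calls { get; private set; }
    public void OnEvent() => Calls++;
}

class Program
{
    static void Main()
    {
        var pub = new WeakPublisher();
        var sub = new Subscriber();
        pub.Subscribe(sub);
        pub.Raise();
        Console.WriteLine(sub.Calls); // 1: delivered while the subscriber is alive
        // Once sub becomes unreachable and is collected, Raise() simply does nothing.
    }
}
```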
-
This is why I am starting to loathe programming
Looking at all the responses in this thread, there seems to be a lot of confusion about dispose and finalization. The basic recommendations are really not that complicated :-\ :

1) In the finalizer (i.e. ~SomeClass()), clean up unmanaged resources. This method is called by the garbage collector when the object can no longer be reached. However, it may not happen immediately; the garbage collector waits until it needs memory (which could be hours or days after the object is no longer reachable).

2) In Dispose() (i.e. the method defined in IDisposable), clean up managed and unmanaged resources. Furthermore, call GC.SuppressFinalize(this) (since everything is now cleaned up, there is no need for the finalizer to be called by the garbage collector, and doing this optimizes the garbage collection).

The dispose pattern furthermore combines the implementation of these two by implementing a protected Dispose(bool disposing) in the base class. The finalizer calls it with false; the Dispose() in IDisposable calls it with true. This makes it simple for derived classes: a derived class only has to override this method and add whatever it needs. But it doesn't really change anything; it is just a matter of style in the implementation.

Why shouldn't the finalizer clean up managed resources as well? Because that is what the garbage collector does - managed resources are the memory used by other managed classes. Furthermore, a finalizer is very limited in what it is allowed to do - it must not use references to other objects (as they may already be garbage collected).

Why implement Dispose() if the garbage collector takes care of it all? Because sometimes you need some cleanup to take place at a deterministic time, not hours or days later. The classical example is an open file. You typically want to allow the file to be opened again as soon as you are done using it.

Why do you need IDisposable for this - why not just implement a Close method on the class? Because with IDisposable, you have a uniform way of dealing with scenarios like this (e.g. a using block). You are free to implement a Close method as well if it feels more natural to use this name - it would simply call Dispose().

Are there any other reasons for Dispose() than deterministic cleanup (or: are there any situations where you have to call Dispose())? Yes (and this is where it gets a little more complicated :doh: ). If the object you no longer need is referenced by a longer-living object, it may unintentionally be kept alive.
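The dispose pattern described above, as a minimal sketch (the class name is made up for illustration):

```csharp
using System;

class ResourceHolder : IDisposable
{
    private bool disposed;

    // Finalizer: called by the GC at some non-deterministic time;
    // the false argument means "clean up unmanaged resources only".
    ~ResourceHolder() => Dispose(false);

    // Deterministic cleanup: cleans up everything, then tells the GC
    // there is no need to run the finalizer as well.
    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    // Shared implementation; a derived class overrides this,
    // adds its own cleanup, and calls base.Dispose(disposing).
    protected virtual void Dispose(bool disposing)
    {
        if (disposed) return;
        if (disposing)
        {
            // Clean up managed resources here (safe: we were called from Dispose(),
            // not from the finalizer, so other objects are still valid).
        }
        // Clean up unmanaged resources here (safe from both paths).
        disposed = true;
    }

    public bool IsDisposed => disposed;
}

class Program
{
    static void Main()
    {
        var r = new ResourceHolder();
        using (r) { /* use the resource */ }  // using calls Dispose() deterministically
        Console.WriteLine(r.IsDisposed);      // True
    }
}
```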
-
Unicode testing
pg--az wrote:
"it's not really a character"
Actually, the BOM (byte-order mark) is a real character, known as "zero-width no-break space". This is a good choice, because it does no harm to programs that just need to display the content, even if they don't skip over it (zero-width = invisible, no-break = no undesired wrapping behaviour).
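A small sketch of how a program might recognise and skip that character (U+FEFF) at the start of decoded text - the helper name is made up:

```csharp
using System;

class Program
{
    // Strips a leading BOM (U+FEFF, "zero-width no-break space") if present.
    static string StripBom(string text) =>
        text.Length > 0 && text[0] == '\uFEFF' ? text.Substring(1) : text;

    static void Main()
    {
        Console.WriteLine(StripBom("\uFEFFhello")); // hello
        Console.WriteLine(StripBom("hello"));       // hello (unchanged)
    }
}
```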
-
Unicode testing
You are probably thinking about A with a ring (not dots, aka umlaut) above it: http://en.wikipedia.org/wiki/Å[^]. The A with a ring is a different character, one that is used in several Danish words. The old spelling of these words used the double AA instead of an A with a ring, but many names still use the double AA. Note that this letter (no matter whether it is written as an A with a ring or a double AA) is the last letter in the Danish alphabet. This means that the result of sorting the strings "AA" and "BB" depends on the current culture. Of course, you shouldn't be required to know details like this when you are coding. Instead, you should assume nothing when it comes to cultures, characters, spelling etc. I think the MSDN article Writing Culture-Safe Managed Code (http://msdn.microsoft.com/en-us/library/ms994325.aspx[^]) may have a few surprises for most developers. So in my opinion, testing with different characters (and cultures) does make sense. Not only to make sure an application is Unicode compliant, but more importantly to catch some of the incorrect assumptions developers make about cultures etc.
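The "AA" vs "BB" ordering above can be seen directly with String.Compare (assuming the Danish culture data, da-DK, is available on the machine):

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        // Ordinal view: 'A' < 'B', so "AA" sorts before "BB".
        Console.WriteLine(String.Compare("AA", "BB", StringComparison.Ordinal) < 0);

        // Danish view: "AA" is the old spelling of Å, the last letter of the
        // alphabet, so "AA" sorts after "BB".
        var danish = CultureInfo.GetCultureInfo("da-DK");
        Console.WriteLine(String.Compare("AA", "BB", danish, CompareOptions.None) > 0);
    }
}
```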
-
Building culturally aware applications
PCSpectra wrote:
Is it safe to assume then: 60 seconds = 1 minute, 60 minutes = 1 hour, 24 hours = 1 day
Not if you take leap seconds into account: http://en.wikipedia.org/wiki/Leap_second[^]. On the other hand, most applications shouldn't bother taking this into account ;)
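As a side note on why most applications can ignore this: .NET's DateTime itself has no notion of leap seconds - the seconds component must be in the range 0-59, so a minute with 61 seconds (such as the one containing 23:59:60 UTC on a leap-second day) cannot even be represented:

```csharp
using System;

class Program
{
    static void Main()
    {
        try
        {
            // 2008-12-31 23:59:60 UTC was a real leap second,
            // but DateTime rejects a seconds value of 60.
            var t = new DateTime(2008, 12, 31, 23, 59, 60, DateTimeKind.Utc);
        }
        catch (ArgumentOutOfRangeException)
        {
            Console.WriteLine("second = 60 is not representable");
        }
    }
}
```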
-
Building culturally aware applications
By the way - if anyone is interested in the historical facts behind calendars, this document is also a good source. I personally find one of the most amusing facts to be that in the year 1712, Sweden had 2 leap days in February - because they forgot to take out the leap days in 1704 and 1708 (see section 2.2.4). Now why did they forget that? As far as I remember an explanation I once got, it was because Sweden was at war at the time, so they had their minds on other things.
-
Building culturally aware applications
Chris Austin wrote:
Fascinating. Makes me want to lose an evening reading all about it.
You can find a very good 65-page PDF file about calendars here: http://www.tondering.dk/claus/calendar.html[^]. It contains details about several calendars (although I don't think the Buddhist calendar is among them).