Am I the only one that thinks type inference is a bad thing?
-
No, C# isn't suddenly going to become JavaScript. The point of var is to replace this:

Dictionary<String, MyBigObjectClassName> dict = new Dictionary<String, MyBigObjectClassName>();

with this:

var dict = new Dictionary<String, MyBigObjectClassName>();
Or in C++, replace:
std::vector<std::pair<std::string, int>>::const_iterator it = someVector.begin();
with:
auto it = someVector.begin();
--Mike-- Visual C++ MVP :cool: LINKS~! Ericahist | PimpFish | CP SearchBar v3.0 | C++ Forum FAQ Ford, what's this fish doing in my ear?
Ah. An excellent shortcut for development. And one which is going to be seriously abused...
cheers, Chris Maunder
CodeProject.com : C++ MVP
-
Replace var with object
and you can pretty much do that already. Excellent for those who think strongly typed languages are for bed-wetting types. Terrible for those who need to debug their programs.

I appreciate the concept of an untyped language, but unfortunately we're still using bits to represent our data. Because of this I want to be sure that my fixed-point and floating-point values remain fixed-point and floating-point values. I also want to know when a numerical value gets converted to a string, because sometimes I really, really don't need the overhead of that conversion. I spent way too much time with VBScript and JavaScript to appreciate weakly typed languages and their subtle bugs. It's much easier to have specific types than to litter your code with cast operators to force your data into a given type.

It makes me sad that applications are now aimed at the lowest common denominator, and sadder still that our development tools are following suit.

cheers, Chris Maunder
CodeProject.com : C++ MVP
Chris Maunder wrote:
littering your code with cast operators
Don't you mean "Runtime Conversion Errors"?

And yeah, you're right. Programming languages should be harder to use. The easier it gets, the worse the worldwide codebase becomes. I've written about this before, this degradation in the worldwide collection of code. Eventually we're going to have massive failures, and then people with experience might finally get paid what we deserve, but we'll hate it.

The problem is that schools (and literature) are not teaching concepts, they are teaching languages. When you do that, it's like trying to describe a duck to someone who's never seen a bird before. Students need at least a full semester of concepts before they even touch a programming language. I don't understand why experienced college professors, who have probably worked in the industry, don't understand that. My intern is going through this: he's learning C#, but he doesn't know the basic concepts, so he has to ask me a lot of really basic questions. I'm tempted to go down there and ask the guy what the hell he's teaching, but I'm not this kid's mom :)

It's a waste of time to teach a language unless you use it only as a context for learning the basic concepts. No language that I used in the 1980s is still around today. If I didn't know the concepts, I wouldn't have been able to learn new languages without help. Because I know those concepts, I can look at code in a language I've never seen before and usually figure it out (with things like LISP and SQL being notable exceptions to that rule). I don't know if we could teach that ability in schools, but that doesn't mean we shouldn't try.
"Quality Software since 1983!"
http://www.smoothjazzy.com/ - see the "Programming" section for freeware tools and articles.