Yes, I agree - but there are 'degrees' in all of this. I just find it frustrating that we have a historical legacy in C/C++ that favours efficiency over all else. All I'd like (especially in libraries) is an option: the default behaviour says "do it safer, but slower", and the option says "get out of the way, I need the speed".

A trivial example, based upon the std::string class supplied as part of VC++ 6. The std::string constructor that takes a 'char*' parameter will crash if the pointer passed in is NULL. Yet setting a char* to NULL is a standard way of initialising C-style 'char* strings'. When we moved a body of code from char* to std::string, we spent weeks finding and removing bugs caused by the interaction between 'NULL char* strings' and 'empty std::string'. And we still get crashes in new code when programmers 'forget' the difference and try to initialise a std::string with NULL (as they've done for years in C/C++).

Why doesn't the constructor test for NULL, and simply set the std::string to 'empty' if it is? This would greatly simplify the change-over from old to new. Why doesn't it do this? Because it is inefficient to test the char* against NULL when 95% of the time it will not be NULL. The design favours the 95% case. This seems correct, except that the 5% case means a program crash!!!

My point would be that std::string should offer safety first, so the constructor does check against NULL. Then add an optional function or second constructor (taking a second bool parameter, perhaps?) that does not check, so that people who need to eliminate that NULL check can.

I'm not arguing against C++ and efficiency, just suggesting that perhaps the time has come to change the focus, and favour protection against simple (but common) mistakes.
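To make the idea concrete, here's a minimal sketch of the "safety first" behaviour I'm describing. The helper name safe_string is my own invention (no such function exists in the standard library or VC++ 6); it simply maps a NULL char* to an empty string instead of crashing:

```cpp
#include <string>

// Hypothetical helper illustrating the behaviour argued for above:
// a NULL char* yields an empty std::string instead of undefined
// behaviour / a crash. The name is an assumption for illustration.
std::string safe_string(const char* s) {
    return s ? std::string(s) : std::string();  // NULL -> empty string
}
```

With a helper like this, code ported from char* can call safe_string(p) everywhere it used to pass a possibly-NULL pointer, while performance-critical code keeps using the raw constructor and skips the check.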