What I want.....
-
.NET is useful - it provides a huge library of classes that work in a consistent way, unlike the libraries you had to play with in C / C++, and it reduces the memory leak problems endemic to pointer-based code written by people who think they know what they are doing ... :laugh: It's a tool - and one that works across multiple platforms with a high degree of "similarity". Try that with a native C compiler, and it becomes a struggle to get anything to work in a short timeframe. Don't get me wrong, I miss my assembler days (or decades, more accurately) - but .NET is here to stay and I'm happy using it.
Sent from my Amstrad PC 1640 Never throw anything away, Griff Bad command or file name. Bad, bad command! Sit! Stay! Staaaay... AntiTwitter: @DalekDave is now a follower!
"but .NET is here to stay and I'm happy using it." I can happily say I hardly ever used .NET. As an embedded developer (now retired), .NET and C# were never options for the projects I worked on. Heck, I think only one project in my career had more than a meg of RAM. For my projects it's been C/C++ for at least the last 15 years of my career.
-
I do a lot of development, migrating, porting, etc., in different environments, different systems, different languages, different versions... get the idea? I have specialized over the years in doing weird stuff. I just finished chasing a bug in some vintage commercial software that was written to accommodate big-endian and little-endian integers depending on the platform. It turned out that it applied big-endian logic in one place where little-endian was required. The whys and wherefores of doing this are not important. What is important is the line of thought I reached upon conclusion, i.e., what I want to see in software.

1) I want to be able to declare my variables of a type and then have the necessary conversions take place automatically. E.g., I want to declare "X" as a 2-byte integer and "S" as an ASCII (or UTF-8, UTF-16, whatever...) character string and be able to simply specify 'S = X AS "format"' -- not "S = X.toString("fmt");", "S = itoa(i);", or whatever language conversion functions are appropriate. I know what I have going in, I know what I want coming out -- make it easy to get from A to B!!

2) I want my data declarations to be consistent across all platforms/languages/environments/...! Before some reader gets all hot and bothered about using C typedefs, etc., in declarations, I know this can be done. What I want is consistency -- i.e., a "short" to be a 2-byte integer, a "long" to be a 4-byte integer... get the drift? Part of the problem I was chasing had to do with the software expecting a C int to be 32 bits, but some 64-bit environments define an int as 64 bits.

3) I want my utilities and commands to operate the same way, with the same results, across all platforms. If I do a "myutil -x" command on Linux, I want to see EXACTLY the same output and results across Cygwin, Windows 10 Ubuntu, Debian, etc.

4) I want clear, simple, understandable, comprehensible programming constructs.
I am tired of chasing errors such as where somebody fat-fingered "=" when "==" was meant, or where a brace was misplaced or omitted. I want to be able to look at a piece of code and understand what the author intended, easily and clearly.

5) I want clear, complete commercial documentation. I have seen thousands of circular definitions such as: returntype FunctionXYZ(int p1, int p2); "Returns XYZ of p1 and p2." BIG WHOOPING DEAL! I'm not an idiot -- I can plainly see that from the function call. I often need to know HOW it uses p1 and p2 to arrive at XYZ. (Of course, by now,
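A minimal sketch of points 1 and 2, assuming C99 is available: stdint.h gives fixed-width types that mean exactly the same thing on every conforming platform (int16_t is 2 bytes, int32_t is 4, whatever the native "int" happens to be), and snprintf() is about as close as standard C gets to the wished-for 'S = X AS "format"'. The helper name below is hypothetical, not from any post in the thread.

```c
/* Fixed-width types from C99's <stdint.h>: consistent sizes across
 * platforms, which is what point 2 above is asking for. */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical helper: convert a 2-byte integer to a decimal string,
 * with an explicit buffer size so the conversion can never overrun.
 * This is the explicit-format conversion point 1 wishes were built in. */
void int16_as_string(int16_t x, char *buf, size_t buflen)
{
    snprintf(buf, buflen, "%d", (int)x);
}
```

With these types, the "C int is 32 bits here but 64 bits there" trap from point 2 simply can't happen: int32_t is 32 bits everywhere or the program doesn't compile.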
I feel your pain. Here is the dilemma: ease of use for the programmer. As a programmer, I don't want to declare int4, int8, int16. The concept was that int was the DEFAULT machine's bitness, which allowed it to move across platforms. In memory this is great; reading/writing to disk created the endian problem. Then, when you are not looking, you get coercion of types, etc. Add in signed/unsigned, and pretty soon you realize you need a completely object-based system. To fix what? Write once, test everywhere? I feel your pain, but see no solution outside of custom-managing types that go into and out of some kind of storage!
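The "custom-managing types that go into and out of storage" idea above can be sketched in plain C: pick one byte order for the on-disk format and marshal explicitly, instead of fwrite()-ing the raw int and hoping the host's endianness matches. The function names here are illustrative, not from any standard library.

```c
#include <stdint.h>

/* Encode a 32-bit value as little-endian bytes, whatever the host is.
 * Shifting and masking makes the byte order explicit in the code, so
 * the on-disk format no longer depends on the machine that wrote it. */
void put_u32_le(uint8_t out[4], uint32_t v)
{
    out[0] = (uint8_t)(v);
    out[1] = (uint8_t)(v >> 8);
    out[2] = (uint8_t)(v >> 16);
    out[3] = (uint8_t)(v >> 24);
}

/* Decode little-endian bytes back into a host integer. */
uint32_t get_u32_le(const uint8_t in[4])
{
    return (uint32_t)in[0]
         | ((uint32_t)in[1] << 8)
         | ((uint32_t)in[2] << 16)
         | ((uint32_t)in[3] << 24);
}
```

A big-endian reader and a little-endian reader both get the same value back, which is exactly the property the vintage software in the earlier post got wrong in one spot.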
-
You and every other quality technician... Seriously though, the problem is that as a profession we have no real enforced standards. Microsoft attempted to be a standard-bearer in the 1990s and early 2000s, but everyone complained, and now we have the morass that technicians are just starting to rightly complain about.

The other issue is that we have too many technicians in our ranks who are too eager to promote their own vision of things at the drop of a hat, since using a tool that may be one or more years old may not be cool. First the major organizations destroyed the vital functions of IT development, and then the technical community got on board and started doing the job for them. Now you have aberrations like MVC replacing Web Forms; MVC was already available, and no one on the Microsoft side of things was interested until that company promoted its own version (most likely a direct copy of the original). Now you have JavaScript as a major language though it is a major headache to code with. Now you have Agile and DevOps, which in reality are diametrically opposed to quality software engineering standards. And now you have new tool-sets being introduced on a daily basis by anyone who thinks they know what they are doing.

In short, the entire profession is a complete mess. And it ain't going to get better in the current economic environments of barbaric capitalism...
Steve Naidamast Sr. Software Engineer Black Falcon Software, Inc. blackfalconsoftware@outlook.com
-
"I feel your pain. Here is the dilemma: ease of use for the programmer. As a programmer, I don't want to declare int4, int8, int16. The concept was that int was the DEFAULT machine's bitness, which allowed it to move across platforms. In memory this is great; reading/writing to disk created the endian problem. Then, when you are not looking, you get coercion of types, etc. Add in signed/unsigned, and pretty soon you realize you need a completely object-based system. To fix what? Write once, test everywhere? I feel your pain, but see no solution outside of custom-managing types that go into and out of some kind of storage!"
Write once, run everywhere. How many times have you heard a senior management type say "We need it to run on Windows laptops for Accounting but Marketing uses Macs and Engineering uses Linux. And by the way, can you make it accessible from my phone?" If I had a nickel for every time I've heard that, I would make Bill Gates and Warren Buffett combined look like paupers.
-
"Write once, run everywhere. How many times have you heard a senior management type say 'We need it to run on Windows laptops for Accounting but Marketing uses Macs and Engineering uses Linux. And by the way, can you make it accessible from my phone?' If I had a nickel for every time I've heard that, I would make Bill Gates and Warren Buffett combined look like paupers."
That's why the Docker concept was so exciting to me, to be honest. UCSD Pascal had a runtime; under DOS it was dog slow, but the idea was basically a VM. It ran on Linux and DOS the same, and anywhere else they ported it to, if memory serves. Imagine a world where (X Windows tried this) you run your application and the GUI attaches to it, meaning you need only configure standard I/O parameters. Now, this was the beginning of EVERY COBOL program (remember: Environment Division, etc.). We have evolved really far, and we are getting places. The one upside of the web was a "standard" GUI available to program to, making things like Proton or Docker with a port workable across platforms. And I see that is where we seem to be going. But like Scotty in Star Trek said, "The fancier the plumbing, the bigger the problems" (or some such).