Unnoticeable yet awesome new C# feature
-
Looks more like SQL or BASIC. I hope this doesn't keep seeping into other statements. Like:

if (myBool is not false) {}
if (thisString does not contain("yipes!")) {}
if (myString contains("hello")) then change it to "goodbye".
var myVar = "a variable" END OF STATEMENT

The more "stuff" you add to a statement, the more likely that (1) more mistakes will occur, (2) IntelliSense will overflow and stop working, and (3) the compiler will choke to death.
-
I was thinking SQL Server T-SQL. When checking for null you need to use
WHERE A.SomeColumn IS NOT NULL
"X IS NOT NULL" is actually an SQL language standard, not just T-SQL.
Daniel
-
More like Basic ))) I like that.
As a longtime VB developer, it always makes me smile just a little to watch the C# language evolve and become a little more "wordy" with each new version. Having spent my first .NET developer years in a "C# is superior to VB in every way because ..." environment, it warms the heart to see old concepts, syntax, and patterns once viewed as inferior turn, years later, into evolutionary improvements. I applaud the change, as I can see cases where it could help extremely data-heavy codebases read a little easier. For anyone but the purists, anyway. If I were converting legacy VB code to C#, going from "IsNot" to "is not" would feel more natural to me.
-
With the latest C# iteration, instead of `x != null`, one can write `x is not null`. Meh, I initially thought. But then I tried to override the `==` and `!=` operators and then.. I understood! :-D
A new .NET Serializer All in one Menu-Ribbon Bar Taking over the world since 1371!
I have been using C# since 2000 and am impressed with how deliberate the language teams at Microsoft have been in evolving the language to address the computer-science issues of the day. As you mentioned, they are now taking on the issue of nullability and providing the capabilities to identify and address the challenges. I have a background in mathematics and SQL Server, so nullability has always been something I have paid attention to. But many software programmers don't even think in those terms; take, for example, a Boolean where the values can be true, false, or indeterminate (null). The goal, of course, is more resilient code. The addition of null checking would seem like an easy thing to do, until you realize that the entire .NET library needs to be checked and enabled to participate.
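To make the three-valued Boolean point concrete, here is a minimal sketch in plain C# (nothing beyond the standard library assumed). The lifted `bool?` operators follow the same SQL-style logic: null AND false is false, null OR true is true, and everything else involving null stays indeterminate.

```csharp
using System;

class ThreeValuedBoolDemo
{
    static void Main()
    {
        bool? consent = null; // true, false, or indeterminate (null)

        Console.WriteLine(consent & false); // False: false wins regardless
        Console.WriteLine(consent | true);  // True: true wins regardless
        Console.WriteLine(consent & true);  // prints nothing: still null

        // The new pattern form makes the null check explicit and readable.
        if (consent is not null)
            Console.WriteLine($"Answered: {consent.Value}");
        else
            Console.WriteLine("No answer recorded.");
    }
}
```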
-
I was going to say more like SQL
-
Or more like:
ADD 5 TO SUM GIVING SUM5
Yuk! :rolleyes:
Mircea
COBOL truly is filth. What was it that Dijkstra said: "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."
-
With the latest C# iteration, instead of `x != null`, one can write `x is not null`. Meh, I initially thought. But then I tried to override the `==` and `!=` operators and then.. I understood! :-D
A new .NET Serializer All in one Menu-Ribbon Bar Taking over the world since 1371!
Hmmm... that seems more like an Easter egg than a feature... I mean, in terms of fitting in with C#'s regular syntax, it really doesn't... "if (a != 3 || x is not null || b != null)" ... and so on... What happens if you say "if (x is not null || b != null)"? Is that the same as "if (x is not (null || b != null))"? Doesn't work. Can you say "if (x is not 4)"?
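For what it's worth, both questions have concrete answers: `is` binds tighter than `||`, so the pattern applies only to its left operand (the `(null || ...)` reading isn't even a legal pattern), and negated constant patterns like `is not 4` are valid from C# 9 onward. A quick sketch:

```csharp
using System;

class PatternPrecedenceDemo
{
    static void Main()
    {
        object x = null;
        string b = "set";

        // Parses as (x is not null) || (b != null), never x is not (null || ...).
        Console.WriteLine(x is not null || b != null); // True

        int n = 5;
        // Constant patterns can be negated too.
        Console.WriteLine(n is not 4); // True
    }
}
```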
-
With the latest C# iteration, instead of `x != null`, one can write `x is not null`. Meh, I initially thought. But then I tried to override the `==` and `!=` operators and then.. I understood! :-D
A new .NET Serializer All in one Menu-Ribbon Bar Taking over the world since 1371!
Good call-out. This is a way to future-proof against someone adding a messed-up operator later. The ironic thing is that if someone wrote incorrect overrides for `==` or `!=` that neglected null, then the likely outcome would be a null reference exception from the operator itself.

Legacy code, before someone introduces a bad `operator !=`:

```csharp
if (obj != null) obj.f();
```

Avoids the future introduction of a bad operator:

```csharp
if (obj is not null) obj.f();
```
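A minimal sketch of that failure mode, using a hypothetical `Widget` type whose author forgot the null case: the overloaded `!=` throws on a null operand, while `is not null` compiles to a plain reference check and never invokes the operator at all.

```csharp
using System;

class Widget
{
    public int Id;

    // Hypothetical bad overload: neglects null operands entirely.
    public static bool operator ==(Widget a, Widget b) => a.Id == b.Id; // NRE if a is null
    public static bool operator !=(Widget a, Widget b) => !(a == b);

    public override bool Equals(object o) => o is Widget w && w.Id == Id;
    public override int GetHashCode() => Id;
}

class Program
{
    static void Main()
    {
        Widget obj = null;

        // Pattern check: a raw reference comparison, user operators bypassed.
        if (obj is not null) Console.WriteLine(obj.Id); // safely skipped

        try
        {
            // Operator check: dispatches to the bad overload and throws.
            if (obj != null) Console.WriteLine(obj.Id);
        }
        catch (NullReferenceException)
        {
            Console.WriteLine("operator != blew up on the null check itself");
        }
    }
}
```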
-
Marketing. They have to motivate selling a newer version.
Wrong is evil and must be defeated. - Jeff Ello
Jörgen Andersson wrote:
Marketing.
Probably very valid, because I believe Microsoft now markets Visual Studio as a product: it has to pay for itself and make a profit. Before, I think, they marketed it as a tool to increase Windows acceptance, so it was cheaper.
-
With the latest C# iteration, instead of `x != null`, one can write `x is not null`. Meh, I initially thought. But then I tried to override the `==` and `!=` operators and then.. I understood! :-D
A new .NET Serializer All in one Menu-Ribbon Bar Taking over the world since 1371!
Super Lloyd wrote:
With the latest C# iteration, instead of `x != null`, one can write `x is not null`.
Something else to insist people should not use.
Super Lloyd wrote:
But then I tried to override the `==` and `!=` operators and then.. I understood!
Operator overloading was something that C++ programmers learned to avoid like the plague before C# existed.
-
0x01AA wrote:
Once upon a time C# was such a beautiful, simple/logical
You could say the same thing about nearly every piece of software today. My OS feels bloated, my compiler IDE feels bloated, my word processor is certainly bloated, even the languages are becoming bloated. :sigh: What happened to 'Keep it Simple'? Best Wishes, David Delaune
-
Check out the ESP32 WROVER, but honestly? In January, Espressif is officially releasing the ESP32-S3, which is a monster. It has a ton of GPIO, like the ARM boards. It has a USB port you can program to be anything you want (like a USB HID device), it has at least 2MB of PSRAM, and 512kB of SRAM, 300kB or so of which is effectively available for user stuff. The CPU is dual core, at up to 240 MHz. The SPI tops out at either 40 MHz or 80 MHz, I forget. If nothing else, I know one of the buses is tappable at 80 MHz, but you're sharing it with the PSRAM, I think, and you have to be careful how you use it. There might be a totally free 80 MHz SPI bus now; I haven't looked into it. But even 40 MHz will drive a small display, and there are enough pins to drive an 8-bit parallel bus with plenty of pins left over if you need something faster. You can program it in MicroPython or the ESP-IDF using C or C++. Arduino support is coming, maybe by the time they officially ship. I have a reference board, but I'm not using it because the toolchain is still very preliminary.
Real programmers use butterflies
Don't forget the .net nano framework :-) C# for the win, on ALL ESP32 based devices :-) [.NET nanoFramework – Making it easy to write C# code for embedded systems.](https://www.nanoframework.net/)
-
The best thing for bloating is a good fart or two. I wonder if there is a way to make software fart.
You've clearly never read any of my "Debugging Messages" when I'm troubleshooting, then... And yes, I have in the past accidentally left one or two in, only to get a puzzled email from a client asking why my software is putting a message box on the screen telling him that it "Just Farted" :-)
-
Don't forget the .net nano framework :-) C# for the win, on ALL ESP32 based devices :-) [.NET nanoFramework – Making it easy to write C# code for embedded systems.](https://www.nanoframework.net/)
Yeah, assuming that's garbage collected I don't see the point.
Real programmers use butterflies
-
The best thing for bloating is a good fart or two. I wonder if there is a way to make software fart.
i keep working on the challenge of making my code not flatulent :wtf:
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
-
Yeah, assuming that's garbage collected I don't see the point.
Real programmers use butterflies
I believe it is. Well, as far as I know, they use a rather scaled-down micro GC that (from what I'm told) does a better job than the full GC in the .NET Framework. Of course, this is only what I'm told. :-)
-
I believe it is. Well, as far as I know, they use a rather scaled-down micro GC that (from what I'm told) does a better job than the full GC in the .NET Framework. Of course, this is only what I'm told. :-)
I have to be very careful in practice about where my RAM is allocated. For example, I have SRAM (fast) and PSRAM (over an 80 MHz bus). PSRAM isn't DMA-capable, meaning that for me to blit bitmaps to the screen requires me to tie up the CPU, so I need to use SRAM to do it asynchronously. Then you have ISR code, which must be loaded into SRAM, and I doubt you can even write ISR code with .NET. No offense, but what exactly are you going to do with a garbage-collected managed-code framework on something like an ESP32 that couldn't be done on the same platform several orders of magnitude more efficiently without it? Because nobody has been able to satisfactorily answer that question for me, I have dismissed it as a viable IoT development framework. For now. As time goes on, I'll reconsider my position as the technology warrants it. MicroPython I'm uneasy about, but at least there's a better argument for it, and it's battle-tested at this point. As much as I don't like it, it has proven capable enough. As for me, for now I'll stick with C++, because of all the C++ ecosystem I can leverage in the IoT world, but also because every cycle I don't spend garbage collecting or interpreting script is one more cycle of battery life.
Real programmers use butterflies
-
I have to be very careful in practice about where my RAM is allocated. For example, I have SRAM (fast) and PSRAM (over an 80 MHz bus). PSRAM isn't DMA-capable, meaning that for me to blit bitmaps to the screen requires me to tie up the CPU, so I need to use SRAM to do it asynchronously. Then you have ISR code, which must be loaded into SRAM, and I doubt you can even write ISR code with .NET. No offense, but what exactly are you going to do with a garbage-collected managed-code framework on something like an ESP32 that couldn't be done on the same platform several orders of magnitude more efficiently without it? Because nobody has been able to satisfactorily answer that question for me, I have dismissed it as a viable IoT development framework. For now. As time goes on, I'll reconsider my position as the technology warrants it. MicroPython I'm uneasy about, but at least there's a better argument for it, and it's battle-tested at this point. As much as I don't like it, it has proven capable enough. As for me, for now I'll stick with C++, because of all the C++ ecosystem I can leverage in the IoT world, but also because every cycle I don't spend garbage collecting or interpreting script is one more cycle of battery life.
Real programmers use butterflies
I hear you, and yes, I follow your concerns. What I do know about the nano framework, however, is that for me, right now, the biggest drawback is the lack of drivers for common hardware. For example, I wanted to make use of an HD44780 LCD display, but my displays were all 4-bit with an I2C backpack on them, rather than the full 8-bit ones the drivers were designed for. For a large chunk of the hardware I'm using, I've had to write my own shims, but I have to say that aside from that I have had no other issues in any of the work I've been doing. The project is also getting funding directly from Microsoft to help progress it, so I think that counts for something. To be perfectly honest, I would rather use the Meadow platform: [Meadow](https://www.wildernesslabs.co/) But right now they only run on their own hardware. Meadow is more mature, not going to deny that, but it's not available to me for what I'm working on right now. As for nano, as I say, I can't say I've had any issues, but then again maybe I'm not working on the same type of projects you are. Nano works largely OK for me, and my increases in productivity come from the fact that I'm using the same language and the same dev environment on both the device and the device-client sides of the equation.
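For illustration, a minimal sketch of the kind of 4-bit shim described above, assuming the common PCF8574 backpack pin mapping (RS on bit 0, EN on bit 2, backlight on bit 3, data on bits 4-7; this varies by board) and a hypothetical `writeI2c` delegate standing in for whatever I2C write call your framework provides:

```csharp
using System;

// HD44780 in 4-bit mode behind an I2C expander backpack: every 8-bit
// command/character goes out as two nibbles, high nibble first, each
// latched by pulsing EN high and then low.
class Hd44780Shim
{
    private readonly Action<byte> _writeI2c; // hypothetical: wraps your I2C API
    private const byte Rs = 0x01, En = 0x04, Backlight = 0x08;

    public Hd44780Shim(Action<byte> writeI2c) => _writeI2c = writeI2c;

    private void WriteNibble(byte nibbleInHighBits, bool isData)
    {
        byte frame = (byte)((nibbleInHighBits & 0xF0) | Backlight | (isData ? Rs : 0));
        _writeI2c((byte)(frame | En)); // EN high: controller samples the nibble
        _writeI2c(frame);              // EN low: latch it
    }

    public void Send(byte value, bool isData)
    {
        WriteNibble(value, isData);              // high nibble
        WriteNibble((byte)(value << 4), isData); // low nibble
    }

    public void Print(string text)
    {
        foreach (char c in text) Send((byte)c, isData: true);
    }
}
```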
-
I hear you, and yes, I follow your concerns. What I do know about the nano framework, however, is that for me, right now, the biggest drawback is the lack of drivers for common hardware. For example, I wanted to make use of an HD44780 LCD display, but my displays were all 4-bit with an I2C backpack on them, rather than the full 8-bit ones the drivers were designed for. For a large chunk of the hardware I'm using, I've had to write my own shims, but I have to say that aside from that I have had no other issues in any of the work I've been doing. The project is also getting funding directly from Microsoft to help progress it, so I think that counts for something. To be perfectly honest, I would rather use the Meadow platform: [Meadow](https://www.wildernesslabs.co/) But right now they only run on their own hardware. Meadow is more mature, not going to deny that, but it's not available to me for what I'm working on right now. As for nano, as I say, I can't say I've had any issues, but then again maybe I'm not working on the same type of projects you are. Nano works largely OK for me, and my increases in productivity come from the fact that I'm using the same language and the same dev environment on both the device and the device-client sides of the equation.
For doing things like TrueType rendering and partial double-buffering over SPI (not I2C, which is dog slow), you really benefit from DMA, and from being able to niggle the hardware some. I mean, my unoptimized drivers get about 20fps, and TFT_eSPI's SPI drivers for the same configuration get 30fps. SPI on an ESP32 for 320x240@16bpp tops out at 31fps - just fast enough that the user hopefully doesn't notice you're redrawing too much. However, I routinely deal with 800x480 displays using an RA8875 controller that's not even as fast as most other SPI displays; I can top it out at only 20 MHz rather than, say, 27. All the extra pixels compound the problem. For me to render TrueType fonts, JPEGs, and such reasonably, I really need the speed that DMA buys me when it's available.
Real programmers use butterflies
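As a sanity check on those frame rates, the raw-bandwidth arithmetic is easy to reproduce; the sketch below ignores command bytes and inter-transfer gaps, which is why the practical ceiling (31fps) lands just under the theoretical one:

```csharp
using System;

class SpiFrameRateEstimate
{
    static void Main()
    {
        // 320x240 at 16bpp over a 40 MHz SPI clock.
        const double busHz = 40_000_000;
        double bitsPerFrame = 320.0 * 240 * 16; // 1,228,800 bits per full redraw
        Console.WriteLine($"~{busHz / bitsPerFrame:F1} fps"); // ~32.6 fps theoretical

        // Same math for an 800x480 panel capped at 20 MHz (the RA8875 case).
        double raBitsPerFrame = 800.0 * 480 * 16; // 6,144,000 bits
        Console.WriteLine($"~{20_000_000 / raBitsPerFrame:F1} fps"); // ~3.3 fps
    }
}
```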
-
For doing things like TrueType rendering and partial double-buffering over SPI (not I2C, which is dog slow), you really benefit from DMA, and from being able to niggle the hardware some. I mean, my unoptimized drivers get about 20fps, and TFT_eSPI's SPI drivers for the same configuration get 30fps. SPI on an ESP32 for 320x240@16bpp tops out at 31fps - just fast enough that the user hopefully doesn't notice you're redrawing too much. However, I routinely deal with 800x480 displays using an RA8875 controller that's not even as fast as most other SPI displays; I can top it out at only 20 MHz rather than, say, 27. All the extra pixels compound the problem. For me to render TrueType fonts, JPEGs, and such reasonably, I really need the speed that DMA buys me when it's available.
Real programmers use butterflies
Ah, well, you see, that's the difference. A huge amount of what I do communicates with a faster device on the client end that does the display (automotive electronics). All I realistically need to be concerned with at the small-device end is: does the device read the inputs fast enough, and respond to the instructions sent to it fast enough? In most cases, anything that's time-critical is on a GPIO all of its own, and anything that's computed does so based on input to its UART. The devices I build are typically sat down in the gunk and crap of the engine block, miles away from the user sat behind the dashboard, and very often each device handles at most only a handful of I/O pins; the most stressful thing I put any of my devices through is upping the serial baud rate :-) The display is usually sat up on the dash and has something like an ARM7 with a decent clock speed and a good amount of memory sat on it. For those cases where the engine end really needs to perform, I'll typically deploy something like a PIC micro and write its firmware directly in PIC ASM, but where I can get away with it, I do like using the ESP32 and the nano framework, as the "head end" that has the display on it is very often written using .NET/C# on a device that can handle it.