What's wrong with Java?
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
-
There are three main issues to me:
1) Versioning -- it's difficult to know which version to run and what functionality I will have, especially since Oracle took over and then the ecosystem split even further with OpenJDK and all that nonsense. It's quite difficult. Along with versioning, it is hard to find tools that feel "official". For instance, I am attempting to use JCov (the Java coverage tool), which is supposed to be the "official" one, but it is very poorly documented, if documented at all.
2) UI frameworks -- oh boy. I remember the original was something like AWT, right? Then JavaFX (which never fully caught on), plus 3rd party stuff, and controls that were instantly recognizable as not being Windows controls. It was all so confusing, and there were better options (C#, Visual Studio and MFC, etc.).
3) Java applets -- they used applets to introduce Java, and it was supposed to be gee-whiz. I was like, "a plugin...?? that fails a lot in my browser...?? and needs to be updated constantly...??? which MS doesn't like to support???" That intro to Java kind of killed it. After that, it felt like a slow, cumbersome thing with no direct line to components without lots of management. So, over to C#, which was easy.
Much of this isn't "fair" to Java, but it is the perception.
Following on from raddevus' points:
4) Support -- ongoing support is difficult as the Java versions keep changing. Applications that run on "version X update Y" may not run on "update Y+1". And if multiple installed applications require different versions? Getting each to use the right installed version is its own version of DLL Hell.
5) Hiring -- finding people with experience in the needed versions of specific libraries can be tough. If you're not using the latest and greatest, finding people with experience in, and a willingness to work with, an older version can be all but impossible. Besides, knowing one version of a library may not mean anything in a different version.
Java the language? It's just another language. It's got its pluses and minuses, same as every other language. IMO, the serious problems are everything except the language itself. C# has its own issues, but post-deployment it's MUCH easier to support.
-
Nay brother let me lead you to the true path of enlightenment. Nikon shall set you free and with your purchase of a new lens you shall receive the blessing of the shutter gods.
The less you need, the more you have. JaxCoder.com
Hmmph! https://www.flickr.com/photos/awrose/103252765/in/pool-camerawiki ain't it pretty? Story - 35 yrs ago - a guy had done some beautiful work and was very proud that the grain was so fine. I said, "So? I get that smooth out of Tri-X all the time." He goes, "Uh - OH! You're the one with THAT camera!"
-
Hmmph! https://www.flickr.com/photos/awrose/103252765/in/pool-camerawiki ain't it pretty? Story - 35 yrs ago - a guy had done some beautiful work and was very proud that the grain was so fine. I said, "So? I get that smooth out of Tri-X all the time." He goes, "Uh - OH! You're the one with THAT camera!"
Beautiful camera! I love looking at old photos taken with those types of cameras.
The less you need, the more you have. JaxCoder.com
-
What's a 'camera'? Isn't that what smartphones are for? How else to instantly upload to that other essential invention ... social media? ;P
-
Hmmph! https://www.flickr.com/photos/awrose/103252765/in/pool-camerawiki ain't it pretty? Story - 35 yrs ago - a guy had done some beautiful work and was very proud that the grain was so fine. I said, "So? I get that smooth out of Tri-X all the time." He goes, "Uh - OH! You're the one with THAT camera!"
I am very much of two minds about grain. On the one hand, those super-smooth tones from old 4x5" films (or even larger), when the emulsion was stuffed with silver and there wasn't a trace of grain, can be a pleasure to study for their tonal qualities alone. Then again, in significant parts of modern B&W photography, graininess is used as an artistic expression, not unlike the 'dottiness' of some impressionist painters. In journalism and sports photography, graininess and hard, 'graphics style' contrast have been a style of expression for at least 50 years. Even in some landscape photography, grain can add structure to a surface that would otherwise be boring (e.g. a misty landscape).

My photography books fill two meters of shelf space. In them, I can "in no time" find a hundred photos, from internationally recognized photographers, that would have lost some of their qualities if they were absolutely grain-less and had a smooth, 'natural' tone scale. Fair enough for 'scientific style' documentary photos, but if you want your photo to tell a story, you may need something beyond a simple and boring 'this is exactly how it looks'. It may be true, but so what? Who said "A picture shouldn't show something, it should be something"? In the process of making it "be" something, grain can be a great tool.
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
I tried writing an accounting program in Java, but gave up when I got tired of trying to outwit UI classes that didn't do the job, and of having to write factory classes to construct other classes that did the simplest things in the most complicated way.
-
Beautiful camera! I love looking at old photos taken with those types of cameras.
The less you need, the more you have. JaxCoder.com
-
I tried writing an accounting program in Java, but gave up when I got tired of trying to outwit UI classes that didn't do the job, and of having to write factory classes to construct other classes that did the simplest things in the most complicated way.
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
-
I worked with a customer who implemented a serious, corporate-wide Manufacturing Control System using Java. I worked with them on systems at four sites - San Jose, CA; Shenzhen, China; Mainz, Germany; and a place in Thailand whose name I have forgotten. The MCS was used with Windows, AIX, Linux, MacOS, and a few others I can't remember. We used sockets as our interface and there were no problems with it at all. I don't even know what OS the systems we directly interfaced with ran on, because we didn't need to. That was a few employers ago, so I have forgotten some details by now. The systems were used in the manufacturing of disk drives and dealt with making the disks themselves. Assembly happened at other sites, and we did a few of those systems too.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
I grew up with Open Systems Interconnection - communication protocols that are very explicitly designed to be completely independent of any given tool used to generate the protocol elements. One very fundamental protocol principle is that the application, and all lower layers, should relate to the protocol specification and nothing else.

As OSI was pushed into the darkness, it felt like a rope being tightened around my neck: the protocols replacing the OSI ones more or less demanded that you use specific libraries, specific OS conventions, specific languages - or else you would have to emulate those specific libraries, OSes and languages. But that is troublesome, because new extensions were added all the time, and keeping up with the updates to the libraries, OSes and languages in your emulation was almost impossible.

Sometimes the binding to specific tools is not immediately apparent. Take the RFC 4506 serialization format (a.k.a. SUN XDR): in my archives from the years of the network wars is a "benchmark" of it against OSI BER (Basic Encoding Rules). XDR beats BER by a high factor. What the benchmark documents keep perfectly quiet about is that XDR is more or less a direct streaming of the binary layout of a C struct on a SUN machine; there is hardly any conversion at all. After generating the red tape, you set a byte pointer to the start of the struct and send the following sizeof(struct) bytes out on the line. (Needless to say, this benchmark was run on a SUN.) I never thought BER would be anywhere near as fast as XDR (BER has a lot more flexibility, which can't be realized at zero extra cost). But if you set up a similar benchmark on a non-SUN machine, the serialization cost of XDR might easily rise by an order of magnitude: say, if the machine had a different float format, or a different byte order. Maybe an old machine wouldn't be byte addressable - in my first three university years, I was programming non-byte-addressable machines exclusively; the Univac even used ones'-complement integers and its proprietary 6-bit character set. (The DECsystem series used 7-bit ASCII, packed 5 to a 36-bit word.) Or say your language has proper string handling: you will have to explicitly add a C-style terminating NUL at the end (a use of NUL that is in direct conflict with the ASCII / ISO 646 standard). The specification of how to encode a structure is given by how a C declaration of it looks. And so on.

This is typical: as long as you use the "proper" machine, the "proper" language and the "proper" OS, you find it very easy.
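For contrast, and since Java is the language under discussion: java.io.DataOutputStream is a minimal sketch of encoding pinned to a specification rather than to a machine. The byte sequence it produces is fixed by definition (big-endian integers, IEEE 754 floats, length-prefixed strings), whatever the host is. The class name and values below are invented for illustration:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class ExplicitEncodingDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(1);         // always 4 bytes, big-endian, by specification
        out.writeFloat(3.14f);   // IEEE 754 bits, regardless of host float format
        out.writeUTF("hello");   // length-prefixed; no C-style NUL terminator
        out.flush();
        for (byte b : buf.toByteArray()) {
            System.out.printf("%02x ", b);  // the wire bytes, host-independent
        }
    }
}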
-
I grew up with Open Systems Interconnection - communication protocols that are very explicitly designed to be completely independent of any given tool used to generate the protocol elements. One very fundamental protocol principle is that the application, and all lower layers, should relate to the protocol specification and nothing else.

As OSI was pushed into the darkness, it felt like a rope being tightened around my neck: the protocols replacing the OSI ones more or less demanded that you use specific libraries, specific OS conventions, specific languages - or else you would have to emulate those specific libraries, OSes and languages. But that is troublesome, because new extensions were added all the time, and keeping up with the updates to the libraries, OSes and languages in your emulation was almost impossible.

Sometimes the binding to specific tools is not immediately apparent. Take the RFC 4506 serialization format (a.k.a. SUN XDR): in my archives from the years of the network wars is a "benchmark" of it against OSI BER (Basic Encoding Rules). XDR beats BER by a high factor. What the benchmark documents keep perfectly quiet about is that XDR is more or less a direct streaming of the binary layout of a C struct on a SUN machine; there is hardly any conversion at all. After generating the red tape, you set a byte pointer to the start of the struct and send the following sizeof(struct) bytes out on the line. (Needless to say, this benchmark was run on a SUN.) I never thought BER would be anywhere near as fast as XDR (BER has a lot more flexibility, which can't be realized at zero extra cost). But if you set up a similar benchmark on a non-SUN machine, the serialization cost of XDR might easily rise by an order of magnitude: say, if the machine had a different float format, or a different byte order. Maybe an old machine wouldn't be byte addressable - in my first three university years, I was programming non-byte-addressable machines exclusively; the Univac even used ones'-complement integers and its proprietary 6-bit character set. (The DECsystem series used 7-bit ASCII, packed 5 to a 36-bit word.) Or say your language has proper string handling: you will have to explicitly add a C-style terminating NUL at the end (a use of NUL that is in direct conflict with the ASCII / ISO 646 standard). The specification of how to encode a structure is given by how a C declaration of it looks. And so on.

This is typical: as long as you use the "proper" machine, the "proper" language and the "proper" OS, you find it very easy.
I saw that tendency in MFC, and I never bought into that aspect of it. I try my best to avoid all of its collections and OS synchronization object wrappers; I use the STL and write my own wrappers instead. That leaves only the UI for MFC, and I actually like how it handles most of that. It's pretty much on life support now, so soon I am going to leave it and move on.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
Depends on what domain you are working in. I wrote a DAB (Digital Audio Broadcasting) decoder in C++, and simplified versions - just as a programming exercise - in Ada and in Java. That type of decoder requires extensive interaction with libraries written in C (to name a few: device handling, FFT transforms, and AAC decoding). In my personal opinion, binding Java structures to C libraries is a crime. (Btw, the Ada binding is simpler since I was using the GNAT compiler system, but even then...) Java is just a language; it is not my choice, but for many applications it seems more or less OK. Personally I do not like the GUI handling, but that is probably a matter of taste. The misery of binding to non-Java (read: C) libraries is such that I would not recommend it for applications that depend on those kinds of libraries. (I dislike all kinds of so-called integrated environments such as IntelliJ or whatever; right now I am writing some stuff where I more or less have to use VS as the development environment. It is probably my ignorance, but I absolutely dislike the destruction of the formats I use in my coding, and the error messages are a horror. For me, command line tools such as vim, qmake, make and the GCC suite - with gdb as debugger under Linux - are the ideal development tools.)
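To give a feel for that binding friction, a minimal JNI sketch (the library name and function here are hypothetical; javac -h generates the matching C header, and the C side typically copies or pins the arrays with JNI calls such as GetDoubleArrayElements):

public class FftBinding {
    static {
        System.loadLibrary("dabfft");  // hypothetical native library on the path
    }

    // Java has no structs to hand across, so arrays (or ByteBuffers) are
    // passed and converted on the C side - the friction described above.
    public static native void forwardFft(double[] re, double[] im);
}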
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
What's wrong with Java is that it's 30 years old or something, and its architecture is a prisoner of what was available at the time: it no longer makes any sense now that richer and more capable systems exist. And it doesn't have properties, which is ridiculous.
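To illustrate the "no properties" point: where C# allows a one-line auto-property, Java still needs hand-written accessor boilerplate. A minimal sketch (class name invented):

public class Rectangle {
    private int width;

    // In C#, one line gives the same effect: public int Width { get; set; }
    public int getWidth() { return width; }
    public void setWidth(int width) { this.width = width; }
}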
-
Nothing really, as long as you remain inside its walled garden. I had a few interactions with Java, all of them ending in pain and tears, because at some point I needed to step out of the Virtual Machine. I fondly remember discovering that the byte type is signed (why, really why???) and spending a few days debugging my hardware, only to figure out in the end that Java was to blame. Or the magical moment when one of the gazillion DLLs needed by an over-engineered project had a bug. I simply fixed the bug, recompiled the source and built the DLL again - something none of the other Java experts even knew was possible. And of course, how can I forget when I relied on the Java standard String library, only to find out that the target where the program ran had an incomplete (but still announced as 100% compatible) implementation of that library. What can be more fun than writing your own standard library functions? A bit more serious: there is nothing wrong with Java. It is widely used, and in most cases it is good enough. I was just an unfortunate victim of the attempt to use Java in the embedded world, where it most definitely is not an appropriate tool.
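For the curious, a minimal sketch of that signed-byte pitfall (the values are invented; the masking idiom is the standard workaround):

public class SignedByteDemo {
    public static void main(String[] args) {
        byte raw = (byte) 0xF0;        // bit pattern 1111_0000 read from a device
        System.out.println(raw);       // prints -16, not 240: byte is signed
        int unsigned = raw & 0xFF;     // the classic workaround: mask up to int
        System.out.println(unsigned);  // prints 240
        System.out.println(Byte.toUnsignedInt(raw)); // same thing, since Java 8
    }
}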
NelsonGoncalves wrote:
I fondly remember discovering that the byte type is signed (why, really why ???)
Since my student days (long ago!) I have been fighting this concept that "inside a computer, everything is a number". No, it isn't! Data are bit patterns - bit patterns, not "zeroes and ones". A type defines which bit patterns are used (on a given machine) to represent various values, such as 'x' or 'y'. They are letters, da**it, not any sort of 'numbers'. Similarly, colors are colors. Dog breeds are dog breeds. Weekdays are weekdays. Seasons are seasons.

One problem is that computer professionals are among the most fierce defenders of this 'number' concept, arguing that 'A' really is 65 (or, as most would prefer, 0x41, but still a 'number'). They think it perfectly natural that dividing 'o' by two gives not 'c' (as you might think from the graphical image) but '7', and that this is a perfectly valid operation, because 'o' is really not a letter but the numeric value 111, and '7' is really 55. Even programmers who have worked with objects and abstractions, and abstractions of abstractions, are still unable to see a bit pattern as directly representing something that is not numerical. They cannot relate to the bit pattern as a representation of abstract information of arbitrary type, but must go via a numeric interpretation. So we get this idea that an uninterpreted octet (the ISO term, partially accepted even outside ISO), a.k.a. an 8-bit 'byte', in spite of its uninterpretedness does have a numeric interpretation, by being signed.

I shake my head: how much has the IT world progressed in the last three to four decades (i.e. since High Level Languages took over) in the direction of a 'scientific discipline'? When we can't even manage abstractions at the octet level, but insist on a numeric interpretation where there is none, I think we are quite remote from a science on a solid academic foundation. The bad thing is that we are not making very fast progress. 40+ years ago, you could, in Pascal, declare 'type season = (winter, spring, summer, fall)', and the values are not numeric: you cannot divide summer by two to get spring (the way you can in C and lots of its derivatives). There is no strong movement among software developers for a proper enumeration, discrete value, concept: we have written so much software that depends on spring+2 being fall. It would create havoc.
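For what it's worth, Java's enum type (added in Java 5) is one realization of that 'proper enumeration' concept: the values are objects, not numbers, so arithmetic on them simply does not compile. A minimal sketch:

public class SeasonDemo {
    enum Season { WINTER, SPRING, SUMMER, FALL }

    public static void main(String[] args) {
        Season s = Season.SUMMER;
        // Season half = s / 2;           // does not compile: Season is not numeric
        System.out.println(s);            // SUMMER
        System.out.println(s.ordinal());  // a numeric view exists only on request
    }
}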
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
Haters are going to Hate! No matter what. The VB guys have been living with B.S. for years.:mad:
Wear a mask! Wash your hands too. The life you save might be your own.
-
I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming in both C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:
Get me coffee and no one gets hurt!
Oracle. They're really bad at making software ecosystems that are pleasant to work with. Nothing wrong with the language on its own, though.
-
It was a very long time ago, back when Swing was being developed, so that may have fixed some of the formatting issues since. I wanted an HTML-like table with wrapped text in the cells. I think I succeeded, but the low-level code I had to write to wrap text was not pretty. I still have a problem with all of those factory classes.
-
It was a very long time ago, back when Swing was being developed, so that may have fixed some of the formatting issues since. I wanted an HTML-like table with wrapped text in the cells. I think I succeeded, but the low-level code I had to write to wrap text was not pretty. I still have a problem with all of those factory classes.
Well, today the task would probably be a lot simpler: Create a TableView with a TextArea in the cells. That should do it, but you would probably need JavaFX. I have created a number of TableViews with controls like CheckBoxes and Rectangles in the cells.
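Something along these lines - a minimal sketch, assuming the JavaFX runtime is available (the class name and demo rows are invented):

import javafx.application.Application;
import javafx.beans.property.SimpleStringProperty;
import javafx.scene.Scene;
import javafx.scene.control.TableCell;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableView;
import javafx.scene.control.TextArea;
import javafx.stage.Stage;

public class WrappedCellDemo extends Application {
    @Override
    public void start(Stage stage) {
        TableView<String> table = new TableView<>();
        TableColumn<String, String> col = new TableColumn<>("Text");
        col.setPrefWidth(280);
        // In this toy model, each row item is itself the cell value.
        col.setCellValueFactory(d -> new SimpleStringProperty(d.getValue()));
        col.setCellFactory(c -> new TableCell<String, String>() {
            private final TextArea area = new TextArea();
            {
                area.setWrapText(true);     // the HTML-table-like wrapping
                area.setEditable(false);
                area.setPrefRowCount(3);
            }
            @Override
            protected void updateItem(String item, boolean empty) {
                super.updateItem(item, empty);
                if (empty || item == null) {
                    setGraphic(null);
                } else {
                    area.setText(item);
                    setGraphic(area);
                }
            }
        });
        table.getColumns().add(col);
        table.getItems().addAll(
                "Short entry",
                "A much longer entry that demonstrates how the cell wraps "
                + "its content across several lines instead of clipping it.");
        stage.setScene(new Scene(table, 300, 260));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}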
Get me coffee and no one gets hurt!
-
Well, today the task would probably be a lot simpler: Create a TableView with a TextArea in the cells. That should do it, but you would probably need JavaFX. I have created a number of TableViews with controls like CheckBoxes and Rectangles in the cells.
Get me coffee and no one gets hurt!
That's nice to hear! Just about 20 years too late for me, but still nice to hear, in case I ever use Java again. :-)