Code Project · The Lounge

What's wrong with Java?

Tags: csharp, visual-studio, java, question · 75 posts, 34 posters
  • Cp Coder wrote:

    I don't understand the snarky comments one sees about Java. :confused: I am well versed in programming both in C# (Visual Studio 2019) and JavaFX (IntelliJ IDE). I enjoy both equally. There must be something wrong with me! :sigh: :laugh:

    Get me coffee and no one gets hurt!

    Member_5893260
    #65

    What's wrong with Java is that it's 30 years old or something, and its architecture is a prisoner of what was available at the time: it no longer makes any sense now that richer and more capable systems exist. And it doesn't have properties, which is ridiculous.
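
    To make the "no properties" point concrete, here is a minimal sketch (a hypothetical Temperature class) of the JavaBeans getter/setter boilerplate Java requires where C# would expose a single property:

        // Hypothetical example: Java has no property syntax, so the conventional
        // JavaBeans pattern is an explicit getter/setter pair per field.
        public class Temperature {
            private double celsius;          // backing field

            // In C# this would be a property: public double Celsius { get; set; }
            // (with the validation living in the setter body).
            public double getCelsius() {
                return celsius;
            }

            public void setCelsius(double value) {
                if (value < -273.15) {
                    throw new IllegalArgumentException("below absolute zero");
                }
                this.celsius = value;
            }
        }

    Callers end up writing temp.setCelsius(20.0) and temp.getCelsius() where C# callers would write temp.Celsius = 20.0.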

  • NelsonGoncalves wrote:

      Nothing really, as long as you remain inside its walled garden. I had a few interactions with Java, all of them ending in pain and tears because at some point I needed to step out of the Virtual Machine. I fondly remember discovering that the byte type is signed (why, really why ???) and spending a few days debugging my hardware, only to figure out in the end that Java was to blame. Or the magical moment when one of the gazillion DLLs needed by an over-engineered project had a bug: I simply fixed the bug, recompiled the source and built the DLL again, something none of the other Java experts were even aware was possible. And of course, how can I forget when I relied on the Java standard String library, only to find out that the target where the program ran had an incomplete (but still announced as 100% compatible) implementation of that library. What can be more fun than writing your own standard library functions? To be a bit more serious, there is nothing wrong with Java. It is widely used, and in most cases it is good enough. I was just an unfortunate victim of the attempt to use Java in the embedded world, where it most definitely is not an appropriate tool.
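
      A minimal sketch of the signed-byte pitfall described above, with a hypothetical status byte of 0xFF standing in for data read from hardware; masking with & 0xFF is the usual way to recover the unsigned value:

          public class SignedByteDemo {
              public static void main(String[] args) {
                  // Suppose the hardware sends the status byte 0xFF (255 as an unsigned value).
                  byte status = (byte) 0xFF;

                  // Java's byte is signed, so this prints -1, not 255.
                  System.out.println(status);            // -1

                  // Widening keeps the sign, so comparisons against 255 silently fail.
                  System.out.println(status == 255);     // false

                  // The usual workaround: mask to get the unsigned value back as an int.
                  int unsigned = status & 0xFF;
                  System.out.println(unsigned);          // 255
              }
          }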

      trønderen
      #66

      NelsonGoncalves wrote:

      I fondly remember discovering that the byte type is signed (why, really why ???)

      Since my student days (long ago!) I have been fighting this concept that "inside a computer, everything is a number". No, it isn't! Data are bit patterns; they are bit patterns, not "zeroes and ones". A type defines which bit patterns are used (on a given machine) to represent various values, such as 'x' or 'y'. They are letters, da**it, not any sort of 'numbers'. Similarly, colors are colors. Dog breeds are dog breeds. Weekdays are weekdays. Seasons are seasons.

      One problem is that computer professionals are among the most fierce defenders of this 'number' concept, arguing that 'A' really is 65 (or, as most would prefer, 0x41, but still a 'number'). They think it perfectly natural that dividing 'o' by two does not give 'c' (as you might think from the graphical image) but '7', and that this is a perfectly valid operation because 'o' is really not a letter but the numeric value 111, and '7' is really 55. Even programmers who have worked with objects and abstractions and abstractions of abstractions are still unable to see a bit pattern as directly representing something that is not numerical. They cannot relate to the bit pattern as a representation of abstract information of arbitrary type, but must go via a numeric interpretation. So we get this idea that an uninterpreted octet (the ISO term, partially accepted even outside ISO), a.k.a. an 8-bit 'byte', in spite of its uninterpretedness, does have a numeric interpretation, by being signed.

      I shake my head: how much has the IT world progressed in the last three to four decades (i.e. since high-level languages took over) in the direction of a 'scientific discipline'? When we can't even manage abstractions at the octet level, but insist on a numeric interpretation where there is none, I think we are quite remote from a science on a solid academic foundation. The bad thing is that we are not making very fast progress. 40+ years ago, you could, in Pascal, declare 'type season = (winter, spring, summer, fall)', and those values are not numeric: you cannot divide summer by two to get spring (the way you can in C and lots of its derivatives). There is no strong movement among software developers for a proper enumeration, discrete value, concept: we have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
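
      Since this is a Java thread, it is worth noting that Java's enum behaves much like the Pascal declaration above rather than like a C enum; a small sketch using the season example:

          public class SeasonDemo {
              // Like Pascal's 'type season = (winter, spring, summer, fall)',
              // a Java enum is a distinct type, not an integer.
              enum Season { WINTER, SPRING, SUMMER, FALL }

              public static void main(String[] args) {
                  Season s = Season.SPRING;

                  // Season fall = s + 2;              // does not compile: no arithmetic on enums
                  // Season half = Season.SUMMER / 2;  // does not compile either

                  // The ordinal is still there if you really want the C-style behaviour,
                  // but you have to ask for it explicitly.
                  Season fall = Season.values()[s.ordinal() + 2];
                  System.out.println(fall);            // FALL
              }
          }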

      • In reply to Cp Coder

        Slow Eddie
        #67

        Haters are going to hate, no matter what. The VB guys have been living with B.S. for years. :mad:

        Wear a mask! Wash your hands too. The life you save might be your own.

        • In reply to Cp Coder

          KateAshman
          #68

          Oracle. They're really bad at making software ecosystems that are pleasant to work with. Nothing wrong with the language on its own, though.

          • Cp Coder wrote:

            Were you using Swing or JavaFX?

            Get me coffee and no one gets hurt!

            Bruce Patin
            #69

            It was a very long time ago, at the time Swing was being developed, so that may have fixed some of the formatting issues. I wanted an HTML-like table with wrapped text in the cells. I think I succeeded, but the low-level code I had to write to wrap the text was not pretty. I still have a problem with all of those factory classes.

            • In reply to Bruce Patin (#69)

              Cp Coder
              #70

              Well, today the task would probably be a lot simpler: Create a TableView with a TextArea in the cells. That should do it, but you would probably need JavaFX. I have created a number of TableViews with controls like CheckBoxes and Rectangles in the cells.

              Get me coffee and no one gets hurt!
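
              A minimal JavaFX sketch of the approach suggested above, using plain String rows and a wrapping Label as the cell graphic (a TextArea could be dropped into the same cell factory); it assumes JavaFX is available on the module path:

                  import javafx.application.Application;
                  import javafx.beans.property.ReadOnlyStringWrapper;
                  import javafx.scene.Scene;
                  import javafx.scene.control.Label;
                  import javafx.scene.control.TableCell;
                  import javafx.scene.control.TableColumn;
                  import javafx.scene.control.TableView;
                  import javafx.stage.Stage;

                  public class WrappedCellTable extends Application {
                      @Override
                      public void start(Stage stage) {
                          TableView<String> table = new TableView<>();
                          table.getItems().addAll(
                                  "Short note.",
                                  "A much longer note that should wrap onto several lines inside its cell "
                                + "instead of being clipped or forcing a very wide column.");

                          TableColumn<String, String> noteCol = new TableColumn<>("Note");
                          noteCol.setPrefWidth(220);
                          noteCol.setCellValueFactory(data -> new ReadOnlyStringWrapper(data.getValue()));

                          // Cell factory that renders each value in a wrapping Label.
                          noteCol.setCellFactory(col -> new TableCell<>() {
                              private final Label label = new Label();
                              {
                                  label.setWrapText(true);
                                  label.prefWidthProperty().bind(col.widthProperty().subtract(10));
                              }
                              @Override
                              protected void updateItem(String item, boolean empty) {
                                  super.updateItem(item, empty);
                                  setGraphic(empty || item == null ? null : label);
                                  label.setText(item);
                              }
                          });

                          table.getColumns().add(noteCol);
                          stage.setScene(new Scene(table, 260, 200));
                          stage.show();
                      }

                      public static void main(String[] args) {
                          launch(args);
                      }
                  }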

              • In reply to Cp Coder (#70)

                Bruce Patin
                #71

                That's nice to hear! Just about 20 years too late for me, but still nice to hear, in case I ever use Java again. :-)

                • Greg Utas wrote:

                  Special cases or not, there's no way I'd go to assembler and give up all the things that an OO language like C++ provides. And the special cases I'm thinking of aren't a question of trying to outsmart anything. One of them, in serious production code, was morphing an object to a sibling class in the inheritance hierarchy by changing its vptr. The objects' memory came from a pool of blocks, not the heap, so objects from both classes fit into the same block. No deep copying, no worries about stale pointers to the object, just abracadabra, and its behavior is now what's needed. :-D

                  Robust Services Core | Software Techniques for Lemmings | Articles
                  The fox knows many things, but the hedgehog knows one big thing.

                  Kirk 10389821
                  #72

                  Your example is no different than code I saw implemented, where the sort order leveraged the same code by the program MOVING the appropriate "Branch If >=" vs. "Branch If <=" instruction X bytes forward of the execution pointer. I remember reading that code over and over, utterly confused. I had NEVER SEEN a move to a relative offset from the SP (stack pointer). It was already coded for the ASCENDING sort, so overwriting that address when the sort was already ascending made no sense (ah, but the code broke when he was wrong about the pointer location, LOL)... Anyway, I would EITHER shoot or beat a programmer for that kind of optimization. The C solution of using a pointer to a function() that returned the appropriate sorting value brought HUGE functionality benefits, cleaner code and tons more flexibility. Almost pure elegance, IMO. The code has 3 consumers: the Current Developer, the Compiler, and the Next Developer. The last one being the most important!
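
                  Kirk's "pointer to a comparison function" design maps directly onto Java, the language of this thread: the sort direction becomes a Comparator value that is passed in, as in this minimal sketch:

                      import java.util.Arrays;
                      import java.util.Comparator;

                      public class SortDirectionDemo {
                          public static void main(String[] args) {
                              Integer[] ascending  = { 5, 1, 4, 2, 3 };
                              Integer[] descending = ascending.clone();

                              // The ordering is a value passed in, one line per direction;
                              // the sort routine itself never changes.
                              Comparator<Integer> asc  = Comparator.naturalOrder();
                              Comparator<Integer> desc = asc.reversed();

                              Arrays.sort(ascending, asc);
                              Arrays.sort(descending, desc);

                              System.out.println(Arrays.toString(ascending));   // [1, 2, 3, 4, 5]
                              System.out.println(Arrays.toString(descending));  // [5, 4, 3, 2, 1]
                          }
                      }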

                  • In reply to Kirk 10389821 (#72)

                    Greg Utas
                    #73

                    I also like the "pointer to the comparison function" design. The fourth consumer of code is the customer, and our customers were demanding when it came to performance. The scenario in question occurred with sufficient frequency that no one objected to the design. The code in question called a MorphTo function, which was a giveaway as to what was going on.

                    Robust Services Core | Software Techniques for Lemmings | Articles
                    The fox knows many things, but the hedgehog knows one big thing.


                    • In reply to trønderen (#66)

                      NelsonGoncalves
                      #74

                      trønderen wrote:

                      We have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.

                      I actually think it is a good thing that we can do 'fall = spring + 2'. First, because it makes sense intuitively. Second, because although computers hold bits, you need to do operations on those bits, otherwise computers are not useful. And (some sort of) math seems to be a good enough common denominator that mostly everybody can rely on and build on. Personally, I tend to view software development more as art and less as science.

                      • In reply to Cp Coder

                        Matt McGuire
                        #75

                        I've done a lot of work in Java and I can say that I'm not a huge fan of it. If it works well for you and you like it, then there is nothing wrong with it. I can't even place exactly why I don't like Java. I like C#, VB.NET (nostalgic), C, Rust, JavaScript. It's funny, because at one time I didn't like C and bashed on it because C++ was "better"; getting older now, I actually enjoy C more than C++, and literally haven't touched C++ in a decade at least. There's no point in bashing languages: if they become unpopular enough, they go away on their own or adapt.
