Code Project: The Lounge

Am I Being Bullied by Microsoft

Tags: game-dev, sysadmin, question
53 posts, 20 posters
  adel ahmadyan wrote:

    There were days when programming languages were tools in the hands of developers for building great things. Nowadays, the majority of developers are just typists converting the syntax of some shitty language into an app. That is just sad. I'm a software developer, or at least I like to think so. The only languages I truly fell in love with were C/C++ and Java, and all of them have been around for more than a decade. I know how to write (type) code in dozens of other languages, but I just don't trust them enough to use them in any real-world case. But now you see lots of kids who just want to show off their skills by writing code in whatever language some person or company designed. Why anyone would think that learning a new technology indicates your level of proficiency in programming is beyond my understanding. I agree that nothing in programming has fundamentally changed: of course the underlying technology keeps evolving, but a great language should distance you from those details. If you find yourself thinking too much about anything other than the algorithms, either change your language or accept the fact that you're a coder, not a developer. To the OP: you made your own bed when you chose a closed-ecosystem language and a technology that hasn't passed the test of time. Now you're at the mercy of Microsoft. Suck it up and pay.

    Lost User replied (#44):

    adel ahmadyan wrote:

    To the OP: you made your own bed when you chose a closed-ecosystem language and a technology that hasn't passed the test of time. Now you're at the mercy of Microsoft. Suck it up and pay.

    They will learn, eventually. At least those who reach the point where playing with Lego no longer brings them any further, and who realize that they can indeed make anything they want themselves. We just had the privilege of skipping the Lego phase, because Lego had not really been invented back then. We had to rely more on our own work, and have little reason to get excited over the next new Lego sets they want to sell us.

    At least artificial intelligence already is superior to natural stupidity

  jschell wrote:

      lewax00 wrote:

      I'll be doing something productive, like finishing the same application in a fraction of the time by learning to use and utilize new tools as they become available

      How exactly did you measure that improvement? Did you base your results on many different problem domains? Do those problem domains span not only different industries but also different platform targets, like 24x7 high-volume servers, game platforms, phones, tablets and embedded systems? Are you claiming that your (presumably personal) experience spans differing skill sets of development teams and different experience levels? Are you claiming that all tools each individually reduce your development time to a fraction? And absolutely none of them are duds? Actually, given the vast number of new technologies introduced every single year, if even a fraction of them individually reduced development time to a "fraction", you must be finishing all of your projects in mere seconds.

      lewax00 replied (#45):

      jschell wrote:

      Are you claiming that all tools each individually reduce your development time to a fraction? And absolutely none of them are duds?

      No, but the fact that we aren't all programming machine code shows that at least some of them must be useful (I don't know about you, but I certainly don't want to create things like collection classes from the ground up every time I write a new program). And how are you going to tell what's useful and what's a dud without trying it out?
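      That point is easy to sketch. For instance (a purely illustrative Python example, with made-up data), the standard library already covers counting and bounded buffers that would otherwise have to be hand-rolled:

```python
from collections import Counter, deque

# Counting occurrences: one line with the standard library, versus
# hand-rolling a hash map, resize logic and iteration yourself.
words = ["tool", "dud", "tool", "useful", "tool"]
freq = Counter(words)
print(freq.most_common(1))  # [('tool', 3)]

# A bounded "most recent items" buffer: deque(maxlen=...) gives O(1)
# appends and silently drops the oldest entry when full.
recent = deque(maxlen=2)
for w in words:
    recent.append(w)
print(list(recent))  # ['useful', 'tool']
```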

      jschell wrote:

      Actually, given the vast number of new technologies introduced every single year, if even a fraction of them individually reduced development time to a "fraction", you must be finishing all of your projects in mere seconds.

      And if you want to split hairs (and apparently you do), a fraction can also be something like 999/1000; it would take a lot of those before you could reduce it on the scale you're talking about. But basically, if I felt a tool wasn't helping me be more productive, I wouldn't bother to use it.

        Lost User replied (#46):

        lewax00 wrote:

        No, but the fact that we aren't all programming machine code shows that at least some of them must be useful

        Machine code must be something absolutely scary. I begin to wonder what remains when we take away all your tools and gadgets. How far would you get without them?

        lewax00 wrote:

        but I certainly don't want to create things like collection classes from the ground up every time I write a new program

        That answers a few questions.


          lewax00 replied (#47):

          CDP1802 wrote:

          Machine code must be something absolutely scary.

          No, I use it as an example because that's the original form of programming: keying codes into the processor or memory directly. If nothing had changed, then that leaves you in that original state, i.e. programming in machine code. (I would use assembly because of the 1:1 correspondence, but even those mnemonics are an advancement, and they introduce an assembler into the tool chain.) I can program in a few different assembly languages (and, by extension, machine code), but I choose not to, because in the vast majority of applications it is a waste of time, and it's much harder to maintain. For example, if I go into VS, create a new WinForms application and compile it without changes, the result is almost 300 lines of IL. Each of those IL instructions will translate into at least one machine instruction (I doubt many, if any at all, will translate to just one). So are you saying that hundreds of lines of assembly are going to be less error-prone than the 79 lines of C#?
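          A rough analogue of that C#-to-IL expansion can be observed in any managed language. For example, Python's standard dis module shows how one line of source becomes a longer list of bytecode instructions (the exact count varies by interpreter version, so none is claimed here):

```python
import dis

def greet(name):
    # A single line of source...
    return "Hello, " + name.upper()

# ...compiles to a noticeably longer list of bytecode instructions,
# each of which the interpreter turns into one or more machine
# instructions at run time.
instructions = list(dis.get_instructions(greet))
print(f"{len(instructions)} bytecode instructions for 1 line of source")
```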

            jschell replied (#48):

            lewax00 wrote:

            No, but the fact that we aren't all programming machine code shows that at least some of them must be useful

            We are not using object-oriented databases, but we are using object-oriented programming. The latter was successful while the former was not. Many, many more ideas fail than succeed. (By the way, OLE is something from Microsoft that didn't last.)

            lewax00 wrote:

            And how are you going to tell what's useful and what's a dud without trying it out?

            I didn't "try out" object oriented programming when it showed up. But I have been using it for many, many years. I did however try out object databases and EJBs and they were, for the most part, miserable failures.

            lewax00 wrote:

            And if you want to split hairs (and apparently you do), a fraction can also be something like 999/1000

            If you spend 8 hours trying each of 100 new technologies a year, that is almost 1/3 of the year wasted. So a 999/1000 saving means a total loss.

            lewax00 wrote:

            But basically, if I felt a tool wasn't helping me be more productive, I wouldn't bother to use it.

            Great, but how do you amortize the time you spend figuring that out against your actual productive time?

              Lost User replied (#49):

              lewax00 wrote:

              No, I use it as an example because that's the original form of programming: keying codes into the processor or memory directly. If nothing had changed, then that leaves you in that original state, i.e. programming in machine code. (I would use assembly because of the 1:1 correspondence, but even those mnemonics are an advancement, and they introduce an assembler into the tool chain.)

              You don't say :) When I look to the left, I see my first computer on a table next to my desk. The cards in its bus slots are one for parallel IO ports, one for RS232 (300 baud), an interface for a cassette recorder and, to the right, a full 4K of static RAM. The best way to make use of the little memory and the ancient CPU was, and still is, to enter machine code with the hex keyboard. There is no other software involved, not even an OS, so you have the entire memory for whatever you want to do. I keep the old computer in good repair and, when I find the time, rebuild old boards I have gotten hold of (like that color graphics card for which I finally scraped together all the parts).

              Even if I would not generally recommend working like this anymore, I learned a few lessons the hard way. First, I learned the importance of algorithms and structure. The language, even if it's machine code or assembly, is secondary; if you use it every day, you can read it without problems. The real problem was awkward spaghetti code. My solution was 'structure', meaning packing everything into functions and following some simple conventions. Simple, but sufficient. I can still make sense of the forgotten programs that I scrape off the old cassette tapes.

              Over more than 30 years and many different computers, I think I have gained a good understanding of algorithms and architecture. This led to reusable code. I stored functions on cassette tapes and loaded them when I needed them, still manually adapting the memory addresses. What a primitive way to import code, but I did not have to start from scratch and could rely on code that had already proven itself. Even today I still build libraries, whatever it may be that I'm told to work on. They grow with every project, evolve to fit exactly what we are working on, require practically no learning effort from new team members and cause the least trouble.

                lewax00 replied (#50):

                jschell wrote:

                I didn't "try out" object oriented programming when it showed up. But I have been using it for many, many years.

                So you just magically learned how to properly use it without taking any time to learn it? You never wrote sample projects to see how it all works? That sounds both unlikely and boring.

                jschell wrote:

                If you spend 8 hours trying each of 100 new technologies a year, that is almost 1/3 of the year wasted. So a 999/1000 saving means a total loss.

                But since none of these numbers are based on anything but being pulled out of our ***es, that's hardly proof either way.

                jschell wrote:

                Great, but how do you amortize the time you spend figuring that out against your actual productive time?

                I don't amortize my free time, do you? I like trying out new things. When I can apply one to something productive, great; when I can't, at least I amused myself for a while. Take Python, for example. I played around with it in my free time, creating useless little apps and games to learn the language, and carried it over into language-agnostic classes because of how rapidly I can develop small programs in it. It regularly becomes useful at work because it runs on both my Windows dev machine and the various Linux environments we have without any modification (and, of course, there's the extensive standard library). No time was wasted at work learning it, but it saves me time often. This could just as easily have applied to any scripting language that runs on both; the point is that knowing at least one has made me more productive.
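                The "runs without any modification" part is easy to demonstrate. A sketch (the printed numbers obviously depend on the machine): pathlib and shutil hide the platform differences, so the same script works on both Windows and Linux:

```python
import platform
import shutil
from pathlib import Path

# Path.home() resolves correctly on Windows (C:\Users\...) and on
# Linux (/home/...); shutil.disk_usage wraps the right OS call on each.
home = Path.home()
usage = shutil.disk_usage(home)
print(f"{platform.system()}: {usage.free // 2**30} GiB free under {home}")
```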

                  jschell replied (#51):

                  lewax00 wrote:

                  So you just magically learned how to properly use it without taking any time to learn it?

                  When OO showed up, I didn't try it then. I learned it after many years, when its success was obvious. In fairness, I am not sure that OO is in fact 'better', but it is popular, so learning it was a good professional move. I also like it, but that is entirely subjective.

                  lewax00 wrote:

                  But since none of these numbers are based on anything but being pulled out of our ***es, that's hardly proof either way.

                  You, however, made a specific claim that you are more productive than those who do not do what you do. Productivity can be measured (which is what my first response was about). So presumably there is some basis for your claim beyond a rationalization that because you do it, it must be good.

                  lewax00 wrote:

                  I don't amortize my free time, do you? I like trying out new things. And when I can apply it to something productive, great. When I can't, at least I amused myself for some time.

                  You still spent time doing it. That specific methodology is not going to work for all developers, nor is it going to work for businesses, since they can't mandate that their employees do that.

                    lewax00 replied (#52):

                    jschell wrote:

                    You, however, made a specific claim that you are more productive than those who do not do what you do. Productivity can be measured (which is what my first response was about). So presumably there is some basis for your claim beyond a rationalization that because you do it, it must be good.

                    You know your car is bigger than your house, but have you taken the time to measure both? I wrote a Python script to send emails via SMTP. I know I did it faster in Python than I could have in something else like C#; I just don't know how much faster, because I didn't try it in another language. (There were more parts to it than mail: I knew I could use Python's libraries to send mail, using the included SQLite libraries I was able to store some required data faster than creating a custom file format, and when I needed to convert it to a CGI script it didn't take more than a few extra lines. I just looked at the example for sending mail in C#; it's longer than my Python script that does all of the things listed.)
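                    A minimal sketch of that kind of script, using only the standard library (the host and addresses here are placeholders, and the actual send is commented out; this is an illustration, not lewax00's script):

```python
import smtplib  # only needed for the (commented-out) send
import sqlite3
from email.message import EmailMessage

# The bundled SQLite module replaces a custom file format. The database
# is in-memory here; a real script would pass a filename instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipients (address TEXT)")
conn.execute("INSERT INTO recipients VALUES ('alice@example.com')")

msg = EmailMessage()
msg["Subject"] = "Nightly report"
msg["From"] = "bot@example.com"
msg.set_content("Report body goes here.")

for (addr,) in conn.execute("SELECT address FROM recipients"):
    del msg["To"]          # clear any previous recipient header
    msg["To"] = addr
    # with smtplib.SMTP("mail.example.com") as server:  # placeholder host
    #     server.send_message(msg)
    print(f"would send to {addr}")
```

Everything imported above ships with Python, which is the point being made: no extra installation on either the Windows dev machine or the Linux boxes.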

                      jschell replied (#53):

                      lewax00 wrote:

                      You know your car is bigger than your house, but have you taken the time to measure both

                      Presumably you meant the reverse. If your applications have the simplicity of measuring the volume or square footage of your house or vehicle, then I would suppose that you can accurately judge the efficiency of your output. What I produce, however, is far, far more complex.

                      lewax00 wrote:

                      I just don't know how long because I didn't try it in another language.

                      There are also far more programmers who think they are excellent programmers than any realistic understanding of probability would allow for. The difference, of course, lies between their own subjective view and someone actually measuring one or more business-significant characteristics.

                      lewax00 wrote:

                      I wrote a Python script to send emails via SMTP

                      And of your total output for this entire year, what percentage does this specific project represent?
