
AI-assisted programming: A cynical view

Posted in The Lounge
Tags: csharp, css, visual-studio, cloud
24 Posts, 14 Posters
MSBassSinger
#1

    Proposed for discussion: the net, "middle of the bell curve" result of programming by AI will be a further influx of "programmers" who write even worse code, but work cheap.

    First, it was offshoring and hiring cheap H1-B labor for programmers, taking our discipline from the level of professionals down to assembly-line technicians. Non-tech bean counters, MBAs (full disclosure: I earned my MBA), and CTOs looking for better bonuses bought into those ways of reducing the development-phase cost of the Software Development Life Cycle (SDLC). Now our industry is "et up" with the results: low-quality code that drives up the biggest part of SDLC costs, support and extension. Not all cultures encourage applying excellence and deductive reasoning in one's work; some encourage, to varying degrees, making more money at the cost of excellence and just following "best practices" and other recipe books. The concepts of value engineering and defensive programming are rather alien to the cheap programmers. If you, as a developer (full disclosure: I have 40+ years of experience as a hands-on software developer/engineer/architect, and I'm still going strong), have ever had to clean up (or throw away and start over) outsourced/H1-B code, you know what I mean. (Full disclosure: I have worked with H1-B and offshore programmers for almost 30 years, and there are some excellent ones, a minority to be sure, who do not fit the description.)

    Now even less knowledge of the discipline is needed, because AI-driven programming just spits code out, with even less "thinking with an engineer's mind" and attention to the full SDLC. Low-cost programmers can now be replaced by even lower-cost "widget assemblers". If you think too many software projects go south now (to wit: over budget, missed deadlines, buggy, high support costs, etc.), wait until the AI-assisted widget assemblers invade, making those CTO bonuses and short-term labor-overhead reductions even bigger. You know, cut costs and nab the bonus, then leave for another company before the support-cost hens come home to roost.

    I am not against AI/ML. I love using the AI/ML services in Azure, as well as Microsoft's ML.NET library. Training an application to be useful and accurate takes a LOT of data, but once it is trained, and if it includes a self-learning routine based on how it processes real-world data, it has very useful applications. AI as it is used in Visual Studio is sometimes useful for code completion, and sometimes just annoying. MS needs to improve
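    To make the terms concrete, the "defensive programming" mentioned above can be sketched as guard clauses that validate inputs up front so bad data fails fast and loudly. This is a minimal illustration in Python, not the poster's own code; the `average_invoice` function and its scenario are hypothetical:

    ```python
    def average_invoice(amounts):
        """Return the mean of a list of invoice amounts, defensively."""
        # Guard clauses: reject bad input at the boundary, with a clear
        # message, instead of letting it corrupt results downstream.
        if amounts is None:
            raise ValueError("amounts must not be None")
        if not amounts:
            raise ValueError("amounts must not be empty")
        for a in amounts:
            if not isinstance(a, (int, float)):
                raise TypeError(f"non-numeric amount: {a!r}")
            if a < 0:
                raise ValueError(f"negative amount: {a!r}")
        return sum(amounts) / len(amounts)
    ```

    The "widget assembler" version is the bare one-liner `sum(amounts) / len(amounts)`, which works until an empty list or a stray string reaches it in production and it dies somewhere far from the actual mistake.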


      MikeCO10
      #2

      Not sure this is a cynical view at all, more like a pragmatic view. For grins, I asked Bing for an authentication script. Now I understand that Sydney, or whatever it calls itself, isn't a programming AI but, hey, it was fun. It returned a working solution. Would an inexperienced programmer have just copied/pasted it in, changing out vars? I don't know. It certainly wouldn't accomplish what we'd want it to if they did. How carefully does one need to architect or pseudo out what you want written and, at that point, is it any more efficient than working with a human?


        Lost User
        #3

        Everyone these days (apparently) goes right into programming; no requirements gathering. AI will be great at programming the wrong solutions. Maybe it has a place doing user interviews; i.e. requirements gathering. Then I'd like it to design and program the thing. With pictures.

        "Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I


          charlieg
          #4

          Concur. I've been suffering through that list above for the last 15 years.

          Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.


            jschell
            #5

            MSBassSinger wrote:

            The net, “middle of the bell curve”, result of programming by AI will be the further influx of “programmers” who write even more awful code, but work cheap.

            Because of course in the good ol' days (fill in a date here) everyone knows that (fill in something here) was much better. Of course defining better and how it is known that it was in fact better is very murky.

            MSBassSinger wrote:

            If you, as a developer (full disclosure-I have 40+ years experience as a hands-on software developer/engineer/architect,

            My decades of experience only allow me to state with certainty that, long ago, I absolutely did not have enough knowledge or experience to judge whether anything was good or bad. With that experience and knowledge, I can now state, not at all to my surprise, that programming, like everything else, drives toward the average. If it did not, that would probably require some supernatural explanation.

            • In reply to jschell (#5):

              MikeCO10
              #6

              So, given the vast range of human abilities, there's a fair chance that the output of any one of us will exceed the average. I'm pretty sure that, over my 25+ years, I've been average or below, but I can also say that we've produced some well-above-average results. How does an AI exceed the average if it is built on the average? And how does it judge the feedback it gets? All we have to do is visit Stack Exchange to see how complex that issue is.


                Peter Adam
                #7

                In other news, Bing is constrained to five answers to avoid going wild. How many of those useful lines get eaten up by the include statements at the top of the file? :-D


                  Derek Hunter
                  #8

                  Not cynical. Entirely realistic.


                    Julian Ragan
                    #9

                    Until incentives change to ones that prevent short-sighted decisions and rewards are given based on long-term success of the product, this will not change. And for that to happen, well, some serious shocks to the system will be required.


                      MadGerbil
                      #10

                      I'm not sure AI is the issue here. The university where I work had a mainframe that handled student billing for many thousands of students, and that system worked very well for 25+ years. It made sense to really invest in getting the code right, because the time you put in was rewarded with two decades of service. The system was replaced, with some difficulty, in 2018, and yet five years later they're shopping for a replacement.

                      Coding as an engineering exercise makes sense when you're building something, hardware and software, that will be around for two decades, but it doesn't make sense to put that thought and effort into something that gets used for six months before being replaced by the latest framework/FOTM language/cloud next best thing. Look at .NET: how does a 3-year support window for a version compare to two decades? If you start writing for .NET 6 right now and do a real quality job on a complex system, you may not even be finished before the framework is out of support. Why take all the extra time to produce good code when it will be an obsolete security hazard before you can even get it out the door? I'll worry about being an engineer when I get an environment that isn't completely upended every 24 months.


BillWoodruff
#11

Which sections of this essay were written by ChatGPT?

                        «The mind is not a vessel to be filled but a fire to be kindled» Plutarch

• In reply to MadGerbil

MSBassSinger
#12

I’ve been developing systems in .NET for 22 years. It has never been upended. It has grown, expanded, and improved. Code I wrote 20 years ago still runs. I remember the days of writing in FORTRAN and COBOL; those languages grew, expanded, and improved over time, too. There is a difference between replacing a program and upgrading it by extending its features.

• In reply to MSBassSinger

MadGerbil
#13

If you've been maintaining it for 22 years, you've either replaced the underlying framework a couple of times or you're running insecure code. Regardless, I think technology churn is a huge driver of code quality problems.

• In reply to MSBassSinger

Cpichols
#14

I've been programming all of my adult life - really since I was a teen around 1980, but with serious intent since beginning college in '83. I love being part of such a fast-moving industry, but I do agree that we may well have stumbled over our own feet with AI.

Here's the thing: I program with all of that old school as a foundation - man pages and reference books at hand as needed, wishing I could type faster with accuracy to keep up with my plan for the code. I remember the beginnings of widgets, objects, and relational database design. Because I have all of that in my experience, I can make good use of code recommendation/auto-complete. It's like spelling for me: I'm pretty sure I know how to spell a word, but it's nice to have the confirmation of the auto-completer.

What concerns me about newer programmers is that they have likely never had to look up a function call in a man page. Do they use reference resources, or do they just use code snippets? Do they understand the 'grammar' of their coding language, or is the boilerplate a black box for them? And does it matter? I think it does; I don't know how it could fail to matter, but maybe things have just changed that much, so that it doesn't matter in the end.

I love efficient code. I think it's important to optimize performance and write code that is maintainable because it's not junked up. I believe that the basics are important and that we skip them to our doom, but I could be wrong; maybe this is how great leaps are made, by leaving the ground.

• In reply to MadGerbil

MSBassSinger
#15

I am not sure you know how this works. The point is that there has been no wholesale replacement of anything in the last 22 years of .NET: an expanded API, expanded OSS, added features, etc., but it is still .NET. Nothing like the false premise you offer. All good, long-lasting programs of any consequence are regularly updated and extended. That was true in the mainframe/minicomputer days, and it is still true today.

To say that technology change - in and of itself - is the cause of churn shows a lack of understanding or experience in software engineering. As with any discipline, there are those who change something for change's sake (always chasing the new and shiny), and there are those who change/amend/refactor/revise based on 1) need and 2) application of value engineering. If you think it is "upending" to go from .NET 5 to 6 to 7, then you don't understand .NET.
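For what it's worth, for many SDK-style projects a .NET version bump comes down to editing one property in the project file. A minimal, hypothetical sketch (real migrations also involve rebuilding, reviewing Microsoft's published breaking-change list, and updating NuGet packages):

```xml
<!-- Hypothetical minimal SDK-style project currently targeting .NET 5 -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Retargeting to .NET 6 is often just changing this value to net6.0,
         then rebuilding and re-running the test suite -->
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
</Project>
```

After changing the moniker, `dotnet build` surfaces most compile-time issues; behavioral changes at runtime still warrant a test pass.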

• In reply to MSBassSinger

MadGerbil
#16

                                  MSBassSinger wrote:

                                  If you think it is “upending” to go from .NET 5 to 6 to 7, then you don’t understand .NET.

                                  I'm talking about decades of stability and somehow you believe migrating an application from .NET 5 (2020) to .NET 6 (2021) is a reasonable comparison? Okay.

• In reply to MadGerbil

MSBassSinger
#17

No, I think you don’t know what you think you know. No production program remains unchanged for 20 years. The required business logic changes. In industrial automation, the hardware changes as it is replaced, forcing API or buffer-location changes. The backend databases or third-party APIs change. Production software always changes to meet changing production requirements. The core purpose of a program can be stable for decades, but there are always changes. I’ve seen that in banking and HR with programs that had the same core responsibilities for 20+ years, written in COBOL. The same is true where production programs were written to use .NET. In both cases, stable production programs were updated to meet changing business requirements, not because .NET grew and improved like any language does.

Yes, churn does happen, because unqualified and ignorant management chooses to ignore ROI and chases after something new and shiny, or falls for the latest “best practices” silliness. That is not the fault of .NET. That is the fault of an organization hiring the incompetent and putting them in charge of something.

• In reply to MSBassSinger

MadGerbil
#18

                                      For some reason you keep moving the conversation to weird extremes. For example:

                                      MSBassSinger wrote:

                                      No production program remains unchanged for 20 years.

I never made that claim. I've been more than clear. When you want to have a discussion with me, and not with weird caricatures of what I'm posting, I'll re-engage. Until then, good day.

• In reply to Cpichols

sasadler
#19

Yep. I'm betting that using AI to help you program is going to make a new software engineer a less able architect/programmer. It's going to be like how the calculator (app, these days) has destroyed younger people's ability to do basic math. A college professor friend I game with teaches immunology. He decided that his students couldn't use their cell phones during labs. There was an uproar from the students, and one of the complaints was that they couldn't use the calculator app on their phones. He then posed a simple problem to the students: what's 13 divided by 26? Not one of them was able to answer without using the calculator app!

• In reply to MadGerbil

MSBassSinger
#20

You wrote: “I'm talking about decades of stability…” So yes, what I wrote is entirely in context with what you wrote. Maybe American English is not your first language and you are having trouble comprehending. It is clear you do not understand the difference between a decades-old stable program being “upended” and one simply being updated over time while retaining its stability and scope of functionality. Your assertion that merely updating .NET through improved versions over those decades you referenced “upends” a stable program is sheer nonsense.
