CodeProject: The Lounge

The ethics of Open Source

Tags: question, business, help, tutorial, career
43 Posts, 20 Posters, 1 Views, 1 Watching
In reply to RainHat:

The OSI is talking about what defines open source. If a repository were restricted to use by people with the initials Q.C., there would be a lot of people who could not use it, and calling it open source would be a stretch of the imagination. As for putting ethical restrictions on software, I would say only ethical people will respect them; the rest will use your code anyway and hope they do not get caught. Personally, if I felt strongly about something I would put in a disclaimer rather than a licence clause. Something like: "This code is not endorsed for use in cold-blooded murder." It makes the point without adding legal restrictions.
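RainHat's non-binding disclaimer idea can be sketched as a source-file header. This is a hypothetical illustration, not something from the thread: the module docstring and `greet` function are invented, and the SPDX tag simply shows how the disclaimer can sit alongside an ordinary permissive license without becoming a license term.

```python
"""A hypothetical library file illustrating a non-binding disclaimer.

DISCLAIMER (not a license term): the author does not endorse the use
of this code to cause harm. This note carries no legal weight; the
license identified below remains the sole grant of rights.

SPDX-License-Identifier: MIT
"""


def greet(name: str) -> str:
    """Ordinary library code; the disclaimer simply travels with the file."""
    return f"Hello, {name}!"
```

The point of the sketch is the separation: the docstring states a moral position, while the SPDX line points at the actual legal terms, so the two cannot be confused.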

Chris Maunder wrote (#18):

So would that suggest that morally you would be comfortable releasing code you knew could be (and perhaps was) being used for Evil Purposes, as long as you had a non-enforceable "Don't use this for Evil Purposes" statement in the code? At a practical level, an expensive, watertight, legally binding license is just as enforceable as your note in the minds of many, so I guess it comes down to: do you make a statement that has no teeth, or do you make a statement with teeth that will not really help the situation? Which reduces down to: do you put the effort into a statement, knowing it will not actually help, or do you just mail it in? Which is really: do you put time and money into a statement as a statement unto itself, or just put a statement in so you can say "I told them not to"? This stuff is hard.

    cheers Chris Maunder

In reply to Chris Maunder:

      I'm sitting here, sipping a beer, while giving you a very flat look. Never change, John. The world will crumble.

      cheers Chris Maunder

realJSOP wrote (#19):

      But you laughed, right? Consider your answer - remember, I have guns. :)

      ".45 ACP - because shooting twice is just silly" - JSOP, 2010
      -----
      You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
      -----
      When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013

In reply to realJSOP:

[realJSOP's post #19, quoted above.]

Chris Maunder wrote (#20):

        Yes, absolutely 😅

        cheers Chris Maunder

In reply to Chris Maunder:

[Chris Maunder's "flat look" post, quoted above.]

Daniel Pfeffer wrote (#21):

          Chris Maunder wrote:

          sipping a beer, while giving you a very flat look.

          Is the beer flat, too? :)

          Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

In reply to Peter_in_2780:

            They are presented as alternatives. Pick one.

            Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012

jmaida wrote (#22):

Got it. For some reason I did not read it that way. Thanx to CP; it helps to hear many voices.

            "A little time, a little trouble, your better day" Badfinger

In reply to Chris Maunder:

              It was a question:

              Quote:

What is more important to you, as the developer of code you want to share with the World:
- That the code is always able to be used for anything, without constraint
- That you have the ability to restrict the use of your code based on ethical concerns

              cheers Chris Maunder

jmaida wrote (#23):

I recognize I read it out of context; I explained earlier. Old farts have short-term memory and read with less skill. Thanx; as I said, CP helps. I missed it for about 2 weeks because of illness. First thing I did was jump back into the Lounge.

              "A little time, a little trouble, your better day" Badfinger

In reply to Chris Maunder:

The definition of Open Source, by the OSI, has a clause that I feel is an issue today.

Quote:

6. No Discrimination Against Fields of Endeavor
The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

I would argue that the most important issue facing development today is the ethics around AI - beyond even the economic upheavals going on, where entire job titles are disappearing in the same way that the printing press caused a fair bit of gnashing of teeth. AI provides the world with a weapon more dangerous than a gun, in a more convenient and cheaper package. We all know that laws are woefully slow to keep up with even the previous pace of IT innovation, and AI has leapt forward so fast that the catch-up will take years or decades.

The OSI specifically says that if you want their badge on your software, you cannot say to someone: "with this code, do no harm". You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones.

This isn't a commentary on the rights and wrongs of writing software. A knife can save a life or take a life: we need them, and so too with software. What I'm concerned about is whether, after 40 years, the blessing of the Open Source badge makes ethical sense.

AI provides an escape hatch here, where the code can remain Free (as in freedom) but the models are subject to ethical constraints imposed by the owners (or collectors) of the data. To me this won't work, because it's like saying the gun is safe because one of the types of bullets it uses is banned (but the other 9 are on Amazon next-day).

So what do you guys think? Let's ignore the practical difficulties of ever enforcing a restriction on code use, as well as the difficulty of defining "ethical" in a way that covers every culture, society and time. What is more important to you, as the developer of code you want to share with the World:

1. That the code is always able to be used for anything, without constraint
2. That you have the ability to restrict the use of your code based on ethical concerns

                cheers Chris Maunder

Sean Cundiff wrote (#24):

Not Open Source, but AI-ethics related: "Billie Eilish, Pearl Jam, 200 artists say AI poses existential threat to their livelihoods" | Ars Technica

                -Sean ---- Fire Nuts

In reply to Chris Maunder:

[Chris Maunder's opening post, quoted in full earlier in the thread.]

honey the codewitch wrote (#25):

I effectively restrict my professional code to certain arenas, as I will not work, for example, on weapons systems. I cannot control what my code under the MIT license is used for, however, and I produce a lot of that. If someone makes a missile guidance system with my JSON parser, well, I guess more power to them? I won't lose sleep over it, because I didn't make anything specifically for that purpose, and I don't feel morally or ethically obligated to control what other people do with my code. The other thing is: the people I would be least comfortable with using my code - bad actors in general, whether they used it to create malware or anything else I disagreed with - aren't the type of people to respect license agreements in the first place, so there's that to consider as well.

A long time ago I worked on productivity monitoring software for a workplace. I was in my twenties and wasn't considering how it was likely to be used. These days I wouldn't write such software because, sadly, it's most likely going to be used to abuse employees. That's just how that software works when you go to squeeze every last bit of "productivity" out of someone's workday. Software micromanagement isn't much better than the meat-based variety.

I get the same heebie-jeebies from AI. It's so easy to abuse. Want to sidestep its filters? Ask your question using ASCII art. Or tell it to write War and Peace using only the word "pudding". So even attempts to make it ethical don't work. LLMs are just not a "safe" technology - but then neither is the Internet, and look what the Internet has done (the bad as well as the good). I understand the Internet, and I've worked with it for long enough to temper what I produce such that I'm not unleashing something terrible upon the world. I can't say the same of anything I'd produce using LLMs or the like. I'd sooner just avoid it, and let other people be the ones to screw up the planet with it.

                  Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix

In reply to Amarnath S:

Given that a good percentage of any code base today is borrowed/adapted from different sites on the Internet (including open source code), I feel the question is more one of accountability. The final accountability (including ethical accountability) for any software should rest with the current owner or releaser of that software, and not be transferred to the various internet sources from which extracts were taken. In other words, the buck stops at the person or company which released such code into production or deployment. As an open source developer I will be unaware of the possible use cases of my code 40 years hence, and I need to be insulated against possible misuse.

jschell wrote (#26):

                    Amarnath S wrote:

                    Given that a good percentage of any code base today is borrowed/adapted from different sites on the Internet

True of all technology. And science. And philosophy. And religion. And beer, for that matter. I am not much of a beer drinker, but I am rather glad that originality is not being kept. https://www.nationalgeographic.com/culture/article/ancient-alcoholic-drinks-unusual-starter-human-spit

In reply to Chris Maunder:

[Chris Maunder's post #18, quoted above.]

jschell wrote (#27):

                      Chris Maunder wrote:

                      This stuff is hard.

Only if you think idealism is attainable. Is clean drinking water a good thing? What if the convenience of it coming out of a faucet makes it easier to waterboard someone?

In reply to honey the codewitch:

[honey the codewitch's post #25, quoted above.]

Chris Maunder wrote (#28):

"Tell it to write War and Peace using only the word 'pudding'." I'm staring and I'm staring and I'm staring at that sentence. Must...not...

                        cheers Chris Maunder

In reply to Chris Maunder:

[Chris Maunder's post #28, quoted above.]

honey the codewitch wrote (#29):

                          Researchers were doing things like that to get it to start dumping its training data at them. :)

                          Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix

In reply to Chris Maunder:

[Chris Maunder's opening post, quoted in full earlier in the thread.]

Davyd McColl wrote (#30):

Option 1, because I'm not here to police others and I don't want anyone having to think too hard about whether they can use my open-source stuff. It's why my license of choice for my software is BSD-3-Clause, which basically says you can do what you want with it, except take credit for it, and if it breaks, you can keep all the pieces.

------------------------------------------------ If you say that getting the money is the most important thing You will spend your life completely wasting your time You will be doing things you don't like doing In order to go on living That is, to go on doing things you don't like doing Which is stupid. - Alan Watts https://www.youtube.com/watch?v=-gXTZM_uPMY
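Davyd's BSD-3-Clause choice is commonly declared per-file with an SPDX identifier. A minimal, hypothetical sketch (the copyright holder and `add` function are invented for illustration; "BSD-3-Clause" is the real SPDX identifier for that license):

```python
# SPDX-License-Identifier: BSD-3-Clause
# Copyright (c) Example Author (hypothetical holder, for illustration only)
#
# The three clauses, roughly: keep this notice in source redistributions,
# reproduce it in the documentation of binary redistributions, and do not
# use the author's name to endorse derived products.


def add(a: int, b: int) -> int:
    """A trivial library function; the license header travels with the file."""
    return a + b
```

Tools that scan for SPDX tags can then pick up the license automatically, which is the practical upside of the one-line identifier over a pasted license block.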

In reply to Chris Maunder:

[Chris Maunder's opening post, quoted in full earlier in the thread.]

GuyThiebaut wrote (#31):

For me, definitely the latter: that you have the ability to restrict the use of your code based on ethical concerns.

I wrote a webcam movement-detection security surveillance system around 12 years ago, which gets occasional updates and which is open source. I stated in the EULA that the software is not to be used to secretly monitor individuals but is only for security or nature-watching purposes - knowing that some people would probably not pay any attention to that, but at least giving some of them time to pause and think about what they might be about to do.

Yes, I am one of those who thinks that ethics is important and needs open discussion with regard to computer systems. Although I don't think there was a single lecture on ethics on my computer science course back in university in 1988.

                              “That which can be asserted without evidence, can be dismissed without evidence.”

                              ― Christopher Hitchens

In reply to GuyThiebaut:

[GuyThiebaut's post #31, quoted above.]

honey the codewitch wrote (#32):

I've avoided creating certain types of projects altogether, in large part because there are no practical ethics adopted by the field, so there are no guardrails. "Just because you can, doesn't necessarily mean you should" looms large in my considerations, and I tend to think about the possibilities of my software. I don't want my code to be weaponized against people or used to abuse them. That means certain projects are simply a no-go for me. I don't do law enforcement (with one notable exception, where I made something to keep K9 units alive and safe in the car) and I don't do military. I don't do productivity monitoring or other kinds of surveillance, excepting things like closed-circuit security systems, which I think are fine. If there were some sort of ethical standard we all adopted, it would give me more freedom to work in arenas I currently just bow out of entirely, as above.

                                Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix

In reply to honey the codewitch:

[honey the codewitch's post #32, quoted above.]

GuyThiebaut wrote (#33):

                                  honey the codewitch wrote:

                                  "Just because you can, doesn't necessarily mean you should"

I am in complete agreement, and it reminds me of the previous spate of "how do I hack a website" questions on the CodeProject forums, where the questioner would be sent packing fairly quickly.

When I started writing the webcam security app there was not much out there that did the same, and it was only around 5+ years later that the surveillance/snooping apps and IoT devices really exploded.

I think the thing about ethics is that there are a lot of grey areas and, as a massive generalisation, software developers don't tend to like grey areas, which may be why the topic of ethics, from a philosophical and moral point of view, is not spoken about much in the community.

                                  “That which can be asserted without evidence, can be dismissed without evidence.”

                                  ― Christopher Hitchens

In reply to GuyThiebaut:

                                    honey the codewitch wrote:

                                    "Just because you can, doesn't necessarily mean you should"

I am in complete agreement. It reminds me of the earlier spate of "how do I hack a website" questions on the CodeProject forums, where the questioner would be sent packing fairly quickly. When I started writing the webcam security app there was not much out there that did the same; it was only around five years later that the surveillance/snooping apps and IoT devices really exploded. I think the thing about ethics is that there are a lot of grey areas and, as a massive generalisation, software developers don't tend to like grey areas, which may be why the topic of ethics, from a philosophical and moral point of view, is not spoken about much in the community.

                                    “That which can be asserted without evidence, can be dismissed without evidence.”

                                    ― Christopher Hitchens

                                    H Offline
                                    honey the codewitch
                                    wrote on last edited by
                                    #34

I'm actually pretty comfortable navigating grey areas, and interested in the softer, less concrete areas of life. Philosophy is interesting, and moral arguments and debate have value to us collectively. I can see that. I mean, fuzzy stuff is fine, as long as I can eke out parameters within the fuzziness that I can live with. The K9 project I mentioned is an example of me making an exception to a hard-and-fast ethical commitment without actually violating the spirit of it. That's fine by me. But I am not most developers. I'm a little weird, even among the weird.

I used to sort of categorize developers based on two major approaches to development and, to some degree, life. One type of developer - the most common sort, IME - is methodical and likes to deal in the very tangible and concrete. They tend to produce solid software, and things like TDD even appeal to them sometimes. These are the sort I think you were alluding to when you said devs don't like grey areas.

The other type - a bit less common - is creative, but less methodical. If they were a Buddhist, they'd appreciate Buddhism, but also blow snot on their robes. Translate that sort of nothing-is-sacred, everything-is-up-to-interpretation attitude to code. I don't want these people testing software, and too many make a project rudderless, but having one or two on a team keeps the team thinking around corners and solving the problems where creativity is required. These developers can often work with the fuzzy stuff. I tend to fall into the latter category.

                                    Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix

                                    1 Reply Last reply
                                    0
                                    • S Sean Cundiff

                                      Not Open Source, but AI ethics related. Billie Eilish, Pearl Jam, 200 artists say AI poses existential threat to their livelihoods | Ars Technica[^]

                                      -Sean ---- Fire Nuts

                                      H Offline
                                      honey the codewitch
                                      wrote on last edited by
                                      #35

This kind of reminds me of the Lars Ulrich freakout over torrents and the general availability of music that the Internet gave rise to. It didn't break Metallica, nor the music industry, but it did change it. The days of major record labels dictating who is popular are over. That's the good. The bad is obviously record sales, but artists (at least the ones I follow) have bridged the gap with more live shows. I think the AI thing will shake out similarly. People aren't going to pay to see AI perform (except maybe Captured By Robots fans), and knockoff tracks will, I think, still mostly be comedic or otherwise unserious, like the Johnny Cash cover of the Barbie song on YouTube. So I think these artists may be overblowing the situation, through misunderstanding and a common fear of tech. That's just my opinion, though - I have no crystal ball.

                                      Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix

                                      S 1 Reply Last reply
                                      0
                                      • S Sean Cundiff

                                        Not Open Source, but AI ethics related. Billie Eilish, Pearl Jam, 200 artists say AI poses existential threat to their livelihoods | Ars Technica[^]

                                        -Sean ---- Fire Nuts

                                        J Offline
                                        James Ingram
                                        wrote on last edited by
                                        #36

From the Ars Technica comments: I’m a Luddite (and So Can You!) | The Nib[^]

Two quotes from the comic that struck me:

> [William Morris] wanted people to take pleasure in their work rather than "mere toiling to live, that we may live to toil"

and, from the final frame:

> Questioning and resisting the worst excesses of technology isn't antithetical to progress. If your concept of "progress" doesn't put people at the center of it, is it even progress?

In that spirit of questioning: AI is obviously a further iteration of the industrial revolution, with all the disruption that that entails, but is AI really all there is to human intelligence? We shouldn't underestimate ourselves.

                                        S 1 Reply Last reply
                                        0
                                        • J James Ingram

From the Ars Technica comments: I’m a Luddite (and So Can You!) | The Nib[^]

Two quotes from the comic that struck me:

> [William Morris] wanted people to take pleasure in their work rather than "mere toiling to live, that we may live to toil"

and, from the final frame:

> Questioning and resisting the worst excesses of technology isn't antithetical to progress. If your concept of "progress" doesn't put people at the center of it, is it even progress?

In that spirit of questioning: AI is obviously a further iteration of the industrial revolution, with all the disruption that that entails, but is AI really all there is to human intelligence? We shouldn't underestimate ourselves.

                                          S Offline
                                          Sean Cundiff
                                          wrote on last edited by
                                          #37

I agree that AI is an extension of the industrial revolution. But while the industrial revolution brought progress, it also generated a lot of unethical corporate behavior that was eventually regulated out of existence. I suspect the same will happen with AI.

                                          -Sean ---- Fire Nuts

                                          T 1 Reply Last reply
                                          0