Code Project › The Lounge

The ethics of Open Source

Tags: question, business, help, tutorial, career
43 Posts · 20 Posters
Chris Maunder (#1) wrote:

    The definition of Open Source, by the OSI, has a clause that I feel is an issue today.

    Quote:

    6. No Discrimination Against Fields of Endeavor

    The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

    I would argue that the most important issue facing development today is the ethics around AI, beyond even the economic upheavals in which entire job titles are disappearing, much as the printing press caused a fair bit of gnashing of teeth. AI provides the world with a weapon more dangerous than a gun, in a more convenient and cheaper package. We all know that laws are woefully slow to keep up with even the previous pace of IT innovation, and AI has leapt forward so fast that the catch-up will take years or decades.

    The OSI specifically says that if you want their badge on your software, you cannot say to someone: "with this code, do no harm". You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones.

    This isn't a commentary on the rights and wrongs of writing software. A knife can save a life or take a life: we need them, and so too with software. What I'm concerned about is whether, after 40 years, the blessing of the Open Source badge makes ethical sense.

    AI provides an escape hatch here, where the code can remain Free (as in freedom) but the models are subject to ethical constraints imposed by the owners (or collectors) of the data. To me this won't work, because it's like saying the gun is safe because one of the types of bullets it uses is banned (but the other 9 are on Amazon with next-day delivery).

    So what do you guys think? Let's ignore the practical difficulties of ever enforcing a restriction on code use, as well as the difficulty of defining "ethical" in a way that covers every culture, society and time. What is more important to you, as the developer of code you want to share with the world:

    1. That the code is always able to be used for anything, without constraint
    2. That you have the ability to restrict the use of your code based on ethical concerns

    cheers Chris Maunder


pkfox (#2), in reply to Chris Maunder, wrote:

      I will have to ponder on this, Chris.

      In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP


Jeremy Falcon (#3), in reply to Chris Maunder, wrote:

        Chris Maunder wrote:

        The OSI specifically says that if you want their badge on your software you cannot say to someone: "with this code, do no harm". You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones.

        It's no different than a government trying to legislate morality and impose one set of morals onto others. That's impossible because "morality" is subjective. There is objective right and wrong, but there's also subjective. To use a non-political example: suppose I laugh at a dude for being rejected by a chick because he did something dumb and I find it funny, but he goes home and cries about it while listening to Celine Dion for three years. Did I do this dude harm? He and his circle of friends who also get rejected might say yes. But my circle of friends would say the dude needs to man up. So then, was there harm? Depends on who you ask, as it's usually the receiver that dictates what's harmful or not (whether or not it really is).

        Chris Maunder wrote:

        So what do you guys think?

        Methinks it's like nuclear fission. We figured out a way to make power plants from it that don't pollute the atmosphere. But, of course, now we have nuclear bombs and some plants that cause radiation leaks. Power plants have gotten better about the radiation, but still, you win some, you lose some, and we can literally destroy the planet with this technology if we go into nuclear war. Would the world be better off without nuclear energy? Dunno. But it happened and cannot unhappen. AI is going to be the same. And while some policing should be done on AI, is the person/entity doing the policing really wise enough to do it? It sure isn't the government. They screw up everything they touch. IMO, making sure AI is open source is our best bet to keep it under control. If anyone can compete with your tech (in theory), then the odds of you turning into the next Google monster are slimmer.

        Edit: The future of tech will be less and less technical as machines start programming themselves. So it'll be soft skills, more so than source code, that'll become the next nugget of IP. Happens with every industry. I'm sure soap was expensive when it first came out. Eventually, source code will be less revered as computers no longer need humans to handle the tech aspects of it. So not sharing source code won't stop much anyway.


Daniel Pfeffer (#4), in reply to Chris Maunder, wrote:

          I see little utility in the OSI's imprimatur. If I publish anything (open or closed source), I publish it under a licence that I feel comfortable with. If I don't want my code to be used by (for example) lawyers, that is my right. My freely-available code may be less popular because of the restrictions, but as I'm not earning money from it, who cares?

          Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.


Sean Cundiff (#5), in reply to Chris Maunder, wrote:

            I have a lot to say on this, particularly since the current set of AI-bro ethics is basically 'late-stage capitalism' (and this is coming from someone who likes conservative capitalism). I'm going to hold off on most of my thoughts for now.

            (1) Regulation is coming. Biden has given federal US agencies 60 days (iirc) to hire a C-level AI officer to handle the ethics. As someone else has mentioned, I have very little faith that they will know what they are doing. You're not going to get competent C-levels in the government at government wages.

            (2) Once your code is out in the open, REGARDLESS of your license restrictions, ne'er-do-wells are going to use it however they want. Full stop.

            (3) The biggest fight for ethics in AI will most likely not come from tech, but from Hollywood (imho).

            That being said, I'm all for open source software. As you say, there is a LOT of good that comes from it. So you're really left with two choices: release it and come to terms with some people using it unethically, illegally, and immorally; or don't release it. Personally, I'd release my code with my ethical 'code' attached. You've made your intentions known.

            Back to (1) + 'late-stage capitalism': regulation is sorely needed, and putting massive amounts of people out of work will shoot these companies in the feet. People don't have money == people ain't gonna buy your product.

            -Sean ---- Fire Nuts


jschell (#6), in reply to Chris Maunder, wrote:

              Chris Maunder wrote:

              What is more important to you, as the developer of code you want to share with the World:

              As one of the CP insider articles from today suggested, at least some have a more pressing concern. That they get to make some bucks from the usage.


Sean Cundiff (#7), in reply to Jeremy Falcon, wrote:

                Jeremy Falcon wrote:

                Edit: The future of tech will be less and less technical as machines start programming themselves. So, it'll be soft skills more so than source code that'll become the next nugget of IP. Happens with every industry. I'm sure soap was expensive when it first came out. Eventually, source code will be less revered as computers no longer need humans to handle the tech aspects of it. So, not sharing source code won't stop much anyway.

                You're probably right, but I'm still gonna use assembly, C, and Rust and hack the matrix no matter what. :laugh:

                -Sean ---- Fire Nuts


Greg Utas (#8), in reply to Chris Maunder, wrote:

                  I want the ability to restrict the use of my code based on ethical concerns, and licensing allows this. I have absolutely no faith in laws and regulations from the official sector, which are rife with regulatory capture and self-serving exemptions. The nasty examples that you listed are already generally prohibited. The real danger is abuse by the official sector. Large open source projects are typically produced by multiple contributors. In that case, you have to decide whether you're comfortable with the existing license when deciding whether to contribute.

                  Robust Services Core | Software Techniques for Lemmings | Articles
                  The fox knows many things, but the hedgehog knows one big thing.



jmaida (#9), in reply to Chris Maunder, wrote:

                    These last 2 statements seem contradictory to me:

                    1. That the code is always able to be used for anything, without constraint
                    2. That you have the ability to restrict the use of your code based on ethical concerns

                    "A little time, a little trouble, your better day" Badfinger


                      A Offline
                      Amarnath S
                      wrote on last edited by
                      #10

                      Given that a good percentage of any code base today is borrowed or adapted from different sites on the Internet (including open source code), I feel the question is really one of accountability. The final accountability (including ethical accountability) for any software should rest with the current owner and releaser of that software, and not be transferred to the various internet sources from which extracts were taken. In other words, the buck stops with the person or company that released the code into production and deployment. As an open source developer I will be unaware of the possible use cases of my code 40 years hence, and I need to be insulated against possible misuse.

                      C J 2 Replies Last reply
                      0
                      • J jmaida

                        These last two statements seem contradictory to me:
                        1. That the code is always able to be used for anything, without constraint
                        2. That you have the ability to restrict the use of your code based on ethical concerns
                        ?

                        "A little time, a little trouble, your better day" Badfinger

                        P Offline
                        Peter_in_2780
                        wrote on last edited by
                        #11

                        They are presented as alternatives. Pick one.

                        Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012

                        J 1 Reply Last reply
                        0
                        • C Chris Maunder


                          R Offline
                          RainHat
                          wrote on last edited by
                          #12

                          The OSI is talking about what defines open source. If a repository were restricted to use by people with the initials Q.C., there would be a lot of people who could not use it, and calling it open source would be a stretch of the imagination. As for putting ethical restrictions on software: only ethical people will respect them; the rest will use your code anyway and hope they do not get caught. Personally, if I felt strongly about something I would put in a disclaimer rather than a licence clause. Something like: "This code is not endorsed for use in cold-blooded murder." It makes the point without adding legal restrictions.

                          C 1 Reply Last reply
                          0
                          • C Chris Maunder


                            R Offline
                            realJSOP
                            wrote on last edited by
                            #13

                            I think there should be no bans on any type of ammunition. Guns are intended to be dangerous, regardless of the ammo you use. (To keep it on topic...) I don't use AI, but I have guns.

                            ".45 ACP - because shooting twice is just silly" - JSOP, 2010
                            -----
                            You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
                            -----
                            When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013

                            C 1 Reply Last reply
                            0
                            • J Jeremy Falcon

                              Chris Maunder wrote:

                              The OSI specifically says that if you want their badge on your software you cannot say to someone: "with this code, do no harm". You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones.

                              It's no different from a government trying to legislate morality and impose its morals onto others. That's impossible, because "morality" is subjective. There is objective right and wrong, but there's also subjective. To use a non-political example: if I laugh at a dude for being rejected by a chick because he did something dumb and I find it funny, but he goes home and cries about it while listening to Celine Dion for three years, did I do him harm? He and his circle of friends who also get rejected might say yes. But my circle of friends would say the dude needs to man up. So then, was there harm? It depends on who you ask, as it's usually the receiver who dictates what's harmful or not (whether or not it really is).

                              Chris Maunder wrote:

                              So what do you guys think?

                              Methinks it's like nuclear fission. We figured out a way to make power plants from it that don't pollute the atmosphere. But, of course, now we have nuclear bombs and some plants that cause radiation leaks. Power plants have gotten better about the radiation, but still, you win some and you lose some, and we can literally destroy the planet with this technology if we go into nuclear war. Would the world be better off without nuclear energy? Dunno. But it happened and cannot unhappen. AI is going to be the same. And while some policing should be done on AI, is the person or entity doing the policing really wise enough to do it? It sure isn't the government; they screw up everything they touch. IMO, making sure AI is open source is our best bet to keep it under control. If anyone can compete with your tech (in theory), then the odds of you turning into the next Google monster are slimmer. Edit: The future of tech will be less and less technical as machines start programming themselves. So it'll be soft skills, more so than source code, that become the next nugget of IP. It happens with every industry; I'm sure soap was expensive when it first came out. Eventually, source code will be less revered as computers no longer need humans to handle the tech aspects of it. So not sharing source code won't stop much anyway.

                              C Offline
                              Chris Maunder
                              wrote on last edited by
                              #14

                              It's not about whether something is good or bad. It's about which is more ethical: saying "I don't allow this to be used for bad things", or "This must be allowed to be used for anything, no matter how immoral, damaging or outright evil"?

                              cheers Chris Maunder

                              1 Reply Last reply
                              0
                              • R realJSOP


                                C Offline
                                Chris Maunder
                                wrote on last edited by
                                #15

                                I'm sitting here, sipping a beer, while giving you a very flat look. Never change, John. The world will crumble.

                                cheers Chris Maunder

                                R D 2 Replies Last reply
                                0
                                • J jmaida


                                  C Offline
                                  Chris Maunder
                                  wrote on last edited by
                                  #16

                                  It was a question:

                                  Quote:

                                  What is more important to you, as the developer of code you want to share with the World:
                                  - That the code is always able to be used for anything, without constraint
                                  - That you have the ability to restrict the use of your code based on ethical concerns

                                  cheers Chris Maunder

                                  J 1 Reply Last reply
                                  0
                                  • A Amarnath S


                                    C Offline
                                    Chris Maunder
                                    wrote on last edited by
                                    #17

                                    Amarnath S wrote:

                                    i need to be insulated against possible misuse.

                                    Therein lies the rub.

                                    cheers Chris Maunder

                                    1 Reply Last reply
                                    0
                                    • R RainHat


                                      C Offline
                                      Chris Maunder
                                      wrote on last edited by
                                      #18

                                      So would that suggest that, morally, you would be comfortable releasing code you knew could be (and perhaps was) being used for Evil Purposes, as long as you had a non-enforceable "Don't use this for Evil Purposes" statement in the code?

                                      At a practical level, an expensive, watertight, legally binding license is just as enforceable as your note in the minds of many. So I guess it comes down to: do you make a statement that has no teeth, or do you make a statement with teeth that will not really help the situation?

                                      Which reduces down to: do you put the effort into a statement, knowing it will not actually help, or do you just mail it in?

                                      Which is really: do you put time and money into a statement as a statement unto itself, or just put a statement in so you can say "I told them not to"?

                                      This stuff is hard.

                                      cheers Chris Maunder

                                      J 1 Reply Last reply
                                      0
                                      • C Chris Maunder


                                        R Offline
                                        realJSOP
                                        wrote on last edited by
                                        #19

                                        But you laughed, right? Consider your answer - remember, I have guns. :)

                                        ".45 ACP - because shooting twice is just silly" - JSOP, 2010
                                        -----
                                        You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
                                        -----
                                        When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013

                                        C 1 Reply Last reply
                                        0
                                        • R realJSOP


                                          C Offline
                                          Chris Maunder
                                          wrote on last edited by
                                          #20

                                          Yes, absolutely 😅

                                          cheers Chris Maunder

                                          1 Reply Last reply
                                          0