Code Project · The Lounge

The ethics of Open Source

  • S Sean Cundiff

    I agree that AI is an extension of the industrial revolution. But while the industrial revolution brought progress, it also generated a lot of unethical corporate behavior that was eventually regulated out of existence. I suspect the same will happen to AI.

    -Sean ---- Fire Nuts

    trønderen wrote (#41):

    Sean Cundiff wrote:

    it also generated a lot of unethical corporate behavior that eventually was regulated out of existence

    It WAS???

    Religious freedom is the freedom to say that two plus two make five.

    • C Chris Maunder

      The definition of Open Source, by the OSI, has a clause that I feel is an issue today.

      Quote:

      6. No Discrimination Against Fields of Endeavor
      The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

      I would argue that the most important issue facing development today is the ethics around AI. This goes beyond even the economic upheavals in which entire job titles are disappearing, much as the printing press caused a fair bit of gnashing of teeth. AI provides the world with a weapon more dangerous than a gun, in a more convenient and cheaper package. We all know that laws are woefully slow to keep up with even the previous pace of IT innovation, and AI has leapt forward so fast that the catch-up will take years or decades.

      The OSI specifically says that if you want their badge on your software, you cannot say to someone: "with this code, do no harm". You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones.

      This isn't a commentary on the rights and wrongs of writing software. A knife can save a life or take a life: we need them, and so too with software. What I'm concerned about is whether, after 40 years, the blessing of the Open Source badge still makes ethical sense.

      AI offers an apparent escape hatch here, where the code can remain Free (as in freedom) but the models are subject to ethical constraints imposed by the owners (or collectors) of the data. To me this won't work, because it's like saying a gun is safe because one of the ten types of bullets it uses is banned (while the other nine are on Amazon next-day).

      So what do you think? Let's ignore the practical difficulties of ever enforcing a restriction on code use, as well as the difficulty of defining "ethical" in a way that covers every culture, society and time. What is more important to you, as the developer of code you want to share with the world:

      1. That the code can always be used for anything, without constraint
      2. That you have the ability to restrict the use of your code based on ethical concerns

      cheers Chris Maunder

      Leonardo Pessoa wrote (#42):

      I may be a bit more radical about this, but I'd say "who cares about OSI?" Better yet, who cares about a simple "seal of approval"? I certainly don't. Ever. Whenever I provide code, it is free of moral judgements as to what will be done with it. Whenever such concerns arise, I'd rather not share the code, and maybe not the app either; or, better still, I just don't start coding at all (at least for pet projects), because once the cat is out of the bag, the morals of whoever uses the code cannot be guaranteed.

      - Leonardo

      • C Chris Maunder


        pmauriks wrote (#43):

        I, for one, would rather not have code tied up in ethics discussions. I disagree with your statement "You have to explicitly be OK with someone using your AI creation to harm kids, to destroy lives, to create scams, to automate cyberbullying, to impersonate loved ones". That's like saying that loaning your car to a teenage relative is equivalent to agreeing to their use of the vehicle to break road laws, do burnouts, and conduct ram raids. I don't think that's the case. As usual, things are not binary "good" or "bad" but more subtle.

        Often, restrictions on use only serve to restrict the "good guys". The bad people really don't care what you think. And if you don't want your software used for harmful purposes, you first have to define harm. One of the most heinous people of the Nazi regime discovered an efficient means to create nitrate fertilisers (look it up). Without this discovery, it might be difficult to feed the population of the earth at this point in time. I'm sure he thought he was doing the right thing. Oppenheimer and the Manhattan Project created nuclear weapons: was that the right thing? It depends on your perspective. Or rather their perspective, which you don't have a lot of control over.

        And then, who fundamentally decides? Is it you personally, or a self-imposed restriction honoured by the end users, assuming they even read the license? Trying to control who uses your software is well meaning, but in practise I think you can only do it where you explicitly allow access through a license. Once the software is "open", it's open.
