Code Project
Electronic Consciousness?

The Lounge | Tags: question, workspace
107 Posts, 34 Posters, 8 Views, 1 Watching
    SeattleC++ wrote:

    Creating an artificial consciousness is a far easier task, because you are not constrained to follow any particular implementation. It just has to have the external interface of a consciousness. That is what the Turing Test is about; defining an acceptance test for an artificial consciousness. It avoids metaphysical arguments about what consciousness *is*, and effectively says, if it looks like a duck and it quacks like a duck... See? Way easier.

    jschell replied (#101):

    SeattleC++ wrote:

    Creating an artificial consciousness is a far easier task, because you are not constrained to follow any particular implementation. It just has to have the external interface of a consciousness.

    Since both are very, very difficult, the fact that one might or might not be easier doesn't alter the fact that it is very, very difficult. Moreover, recognizing that consciousness exists is far different from understanding it. And although understanding it might lead to creation, there is no guarantee.

    SeattleC++ wrote:

    That is what the Turing Test is about; defining an acceptance test for an artificial consciousness. It avoids metaphysical arguments about what consciousness *is*, and effectively says, if it looks like a duck and it quacks like a duck

    Not really. In particular, notice the third subtopic at the following link, which notes that people have already used that 'test' to decide that programs are intelligent: http://en.wikipedia.org/wiki/Turing_test#Weaknesses_of_the_test
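The "external interface" framing in this exchange is duck typing in the literal sense: the judge inspects only responses, never the implementation, which is exactly the move the Turing Test makes. A minimal sketch of that idea (the `Judge`-style probe list and both classes are hypothetical illustrations, not an actual consciousness test):

```python
# Sketch: an acceptance test that, like the Turing Test, checks only
# external behavior. The tester never asks *what* the candidate is,
# only *how* it responds.

class Human:
    def respond(self, prompt: str) -> str:
        return f"Hmm, about '{prompt}'... let me think."

class Chatbot:
    # A completely different implementation with the same external interface.
    def respond(self, prompt: str) -> str:
        return f"Hmm, about '{prompt}'... let me think."

def passes_duck_test(candidate, reference, probes) -> bool:
    """True if the candidate is indistinguishable from the reference
    on every probe -- purely behavioral, implementation-blind."""
    return all(candidate.respond(p) == reference.respond(p) for p in probes)

probes = ["What is consciousness?", "Do you dream?"]
print(passes_duck_test(Chatbot(), Human(), probes))  # True: it quacks like a duck
```

As the linked "Weaknesses of the test" section points out, passing such a behavioral test only shows indistinguishability on the probes actually asked; it says nothing about what is going on inside.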

      jschell wrote:

      patbob wrote:

      It doesn't need to know we exist to do something that's catastrophic to us

      Then, by definition, we would know it exists. And if we know it exists we can certainly do something to it.

      patbob replied (#102):

      We'd know, but only after the fact.. and only maybe. We'd need to know with certainty that an unnatural event happened. We'd know it was unnatural if A) our predictions didn't predict it, B) we're very certain our model isn't incorrect, and C) we don't just toss that data point out as anomalous. You hit most of the stoplights green on the way home tonight.. natural event, or unnatural? How can you tell?

      We can program with only 1's, but if all you've got are zeros, you've got nothing.
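patbob's three conditions (our prediction missed it, we trust the model itself, and we don't dismiss the point as a fluke) can be sketched as a simple anomaly check. The z-score threshold, confidence floor, and repeat count below are hypothetical choices for illustration, not anything from the thread:

```python
# Sketch of the three conditions for calling an event "unnatural":
#   A) our model's prediction missed it,
#   B) we are confident the model itself is sound,
#   C) we don't toss the data point out as a one-off anomaly.
# All three must hold before concluding something intervened.

import statistics

def is_unnatural(observation, history, model_confidence, repeats_seen):
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = abs(observation - mean) / stdev  # how surprising the observation is

    prediction_missed = z > 3.0                # condition A
    model_trusted = model_confidence >= 0.95   # condition B
    not_dismissed = repeats_seen >= 2          # condition C

    return prediction_missed and model_trusted and not_dismissed

# The stoplight example: commute times home, in minutes.
history = [30.0, 31.0, 29.5, 30.5, 30.0, 29.0, 31.5]
print(is_unnatural(12.0, history, model_confidence=0.99, repeats_seen=1))  # False: maybe luck
print(is_unnatural(12.0, history, model_confidence=0.99, repeats_seen=5))  # True
```

The point of the sketch is condition C: a single run of green lights fails the test no matter how improbable it is, which is exactly why we'd only know "after the fact, and only maybe."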

        YDaoust wrote:

        I see no reason why it wouldn't be possible. After all, the human brain is nothing but a big processor that obeys the laws of physics. Man-made systems with similar capabilities of cognition, affectivity, introspection... should be able to support consciousness. Wikipedia supplies interesting material on this topic (https://en.wikipedia.org/wiki/Mind_uploading). Moving consciousness from one being to another is something we (I) don't understand at the moment, and it seems to raise paradoxical situations.

        Freak30 replied (#103):

        I remember the book Neverness (by David Zindell, if I recall correctly) that I read several years ago. In this book a sentence was quoted several times: if our brain were simple enough that we could understand it, we would be so simple that we couldn't. So maybe creating and teaching an AI is possible, but I don't think copying/transferring would work.

          patbob wrote:

          We'd know, but only after the fact.. and only maybe. We'd need to know with certainty that an unnatural event happened. We'd know it was unnatural if A) our predictions didn't predict it, B) we're very certain our model isn't incorrect, and C) we don't just toss that data point out as anomalous. You hit most of the stoplights green on the way home tonight.. natural event, or unnatural? How can you tell?

          We can program with only 1's, but if all you've got are zeros, you've got nothing.

          jschell replied (#104):

          patbob wrote:

          We'd know, but only after the fact..

          That is a pretty big what if. You are suggesting that it did something awful but did nothing at all detectable before that. Sort of like suggesting that humans went from a club straight to a full nuclear missile launch.

            Lee Chetwynd wrote:

            I am curious, from the point of view of people in a programming sort of environment, how many of us believe it will be possible to store a consciousness electronically. This could be either to store an existing consciousness (as in copying or backing up an existing mind electronically) or to develop a completely new consciousness that never existed biologically. I'm trying to stay away from the far-reaching philosophical and moral implications of doing either of these things. That's a massive can of worms. I just wondered how many coders actually think it will ever be possible, and how many think it is something that could never be achieved. I think it will be possible.

            Sparkenstein replied (#105):

            Please see the page at http://tmor.exnihilum.org/ and reply here if anyone would be interested in working together with me on this way overdue project. Cheers, TJL

              Lee Chetwynd wrote:

              Let's just hope it's not touch screen.

              Ed Gadziemski replied (#106):

              Depends upon who's doing the touching. :)

                Ed Gadziemski wrote:

                Depends upon who's doing the touching. :)

                Lee Chetwynd replied (#107):

                Good point. :) As long as they wash their hands.
