Electronic Consciousness?
-
I am curious, from the point of view of people from a programming sort of environment, how many of us believe that it will be possible to store a consciousness electronically. This could be either to store an existing consciousness (as in copying or backing up an existing mind electronically) or to develop a completely new consciousness that never existed biologically. I'm trying to stay away from the far-reaching philosophical and moralistic implications of doing either of these things. That's a massive can of worms. I just wondered how many coders actually think it will ever be possible and how many think it is something that could never be achieved. I think it will be possible.
Yes, it will be possible, IMO. This is how it can be done: it will not be a scan or a download, and it will be slow. Every cell in your brain can be replaced by a synthetic one. Your consciousness is not in one cell, but is made up of a collection of all of your cells. Switching each one individually will allow your consciousness to switch over to electronics. At what point does that occur? It's not binary: at what spoonful of soil does a hill become a mountain? After the transition is complete, the body can be destroyed and the electronic brain removed.
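The gradual-replacement idea above is the classic sorites ("heap") argument, and a toy sketch makes the point concrete. This is purely illustrative, assuming a vastly simplified brain of N interchangeable cells:

```python
# Toy sorites sketch: model the brain as N interchangeable "cells" and
# replace them one at a time. The biological fraction falls in small,
# continuous steps -- there is no single swap at which the system
# suddenly "becomes" electronic.
N = 10
brain = ["bio"] * N
fractions = []
for i in range(N):
    brain[i] = "synthetic"                    # swap one cell for its stand-in
    fractions.append(brain.count("bio") / N)

print(fractions)   # a gradient from 0.9 down to 0.0, not a switch
```

The question "at what point does it occur?" has no answer here by construction; every step changes the system by the same small amount.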
-
Don't. Build. Cylons.
-
The question is premature, at least for Western science, as consciousness hasn't even been figured out yet. Though an advanced Buddhist meditator who has studied the Abhidharma (Google it) might have some interesting ideas to relate. So how could you save it if you don't have a clue what it is? Mike
-
You don't need compassion to be conscious; in fact, you don't need any emotion to be conscious. Psychopaths don't have any, and not just the murderous types, yet I'm sure they are still considered conscious. I'm not keen on having a backup made of my mind that was stripped of emotion, however. But a partial backup is better than none.
-
But do you think it will ever be possible? :)
I wrote a short SF story to address that. Many stories, like James P. Hogan's Immortality Option, deal with transfer through a destructive process; others make copies; one (title forgotten) dealt with an "entangled" cloud of particles to image the brain. Mine, Ghost in the Works[^], uses a slow transfer through association. Not surprisingly, I think my method has the likeliest chance of success, as it gets past all the problems with imaging the brain to make a copy.
Psychosis at 10 Film at 11 Those who do not remember the past, are doomed to repeat it. Those who do not remember the past, cannot build upon it.
-
The very detailed book "Astral Dynamics" by Robert Bruce provides first-hand accounts of different types of consciousness in our different levels of bodies, saying that the physical brain does indeed have a type of consciousness, even though it appears to be mortal and not the eternal consciousness of the soul. My own experience tends to agree with this. So, theoretically, I would say a qualified yes, to a degree.
-
If we consider the brain as a simple computing device, then the answer is yes, but first we must understand what defines a consciousness, the minimum level of detail we need to emulate so it works, and the appropriate platform to emulate it on. Having said that, I believe the time when every one of us wears an "Intel Inside" badge on our skulls is a bit far away. ;P
CEO at: - Rafaga Systems - Para Facturas - Modern Components for the moment...
-
Lee Chetwynd wrote:
it will be possible to store a consciousness electronically
Reading out the data and storing the structure of a brain is simply an engineering problem: probably not possible today, but within our grasp should we want to develop the technology. Whether that translates to storing a consciousness is unknown. The difficult part lies in what you expect it can be used for. Emulating it electronically, à la Heechee? We're still a ways away from that. Using it as a backup implies that you can restore it, and we're a long, long way away from being able to restore the saved data. As for developing a new consciousness, well, saving, executing and restoring a consciousness probably isn't terribly relevant to developing one. Will we develop an artificial consciousness? I think we're starting to approach it in the right ways, but I'm not sure we'll recognize when we succeed. Personally, I'm beginning to suspect we've already succeeded.
We can program with only 1's, but if all you've got are zeros, you've got nothing.
-
patbob wrote:
I'm not sure we'll recognize when we succeed.
Leave it running with no input. If it gets bored and starts singing "Daisy Bell," it's conscious. And should be destroyed immediately.
-
I think that something that is done biologically or chemically cannot be reproduced electronically.
-
Will we be able to create an artificial consciousness sufficiently similar to our own that we would recognize it as conscious? Absolutely. Probably within a human lifetime. Self-awareness isn't as hard as people make it out to be. Consciousness is mostly a data storage and retrieval problem, I think. Will we be able to download a human consciousness and run it on a computer? Probably not feasible. The human brain stores its memories in the wiring of the entire brain. You'd have to be able to read out all the neural synapses. Can you do that without carefully disassembling a brain? I rather doubt it. Synaptic function is more than connectivity. The strength of signalling of any neuron is a function of the precise workings of the neuron's internal machinery. You may expect this machinery to be as variable as human appearance. Think of all the many kinds of behavioral health issues people may display, all the kinds of genius and developmental disability, and you get some idea of how variable this function is. Even if we could wave a magic wand and make the problem of downloading the synapses go away, the brain is not a single organ, but rather hundreds of related, specialized processors. DNA stores only an approximate map of these processors; they self-assemble during development, and no two are alike. Making these specialized processors run efficiently on a general-purpose computer is unlikely. You'd have to simulate them at a very low level to get high fidelity. And in the end, how valuable would this be, even if we could do it? Few people are going to be excited about having their consciousness downloaded if the process is destructive, because the consciousness in your body would then very definitely die, with only the promise that a very similar one would be created. Like life insurance, this bet doesn't benefit your original consciousness.
If the process were not destructive, then the result would be that there are two "yous", each wanting to live, each wanting to control the assets "you" own, each rapidly diverging into different identities as their experiences differed. You forgot to ask the "upload" question. Could you upload your copied consciousness into other brains? Again, given the variability of brains, you'd need a way to exactly recreate your original brain in order for the consciousness "program" to run reliably. Sorry, the human brain is the ultimate intellectual property. Its design completely frustrates copying and duplication. DRM is designed in, intelligently or not.
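The point that synaptic function is more than connectivity can be sketched with a toy (entirely hypothetical) leaky integrate-and-fire model: two neurons with identical "wiring" — the same input and the same firing threshold — behave completely differently when an internal parameter such as the leak rate differs.

```python
def spike_count(input_current, threshold, leak, steps=100):
    """Toy leaky integrate-and-fire neuron: accumulate input, leak a
    fraction of the membrane potential each step, and fire (then reset)
    when the threshold is crossed."""
    v, count = 0.0, 0
    for _ in range(steps):
        v = v * (1.0 - leak) + input_current
        if v >= threshold:
            count += 1
            v = 0.0
    return count

# Identical input and threshold (the "wiring"), different internals:
busy  = spike_count(0.3, threshold=1.0, leak=0.1)   # fires regularly
quiet = spike_count(0.3, threshold=1.0, leak=0.3)   # never reaches threshold
print(busy, quiet)
```

A connectivity map alone would record these two neurons as identical, which is the argument for why reading out the synapses is not enough.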
-
Nope.. that's what broken computers do. Not conscious ones :)
We can program with only 1's, but if all you've got are zeros, you've got nothing.
Ah, but all conscious minds are broken :)
-
I think it will be possible, but it's going to take a better understanding of the quantum state of neurons to implement. There is evidence that neurons operate using electrons in a state of quantum superposition, meaning it is entirely possible that what we think of as information in the brain only exists as it interacts with the world. Even developing a non-biological consciousness, at least as we currently understand what that means, will likely require some form of non-deterministic computation. In the nearer term, developing more and more sophisticated simulations that can ultimately fool the user/observer into thinking that they are conscious is a much more straightforward goal.
-
But Troy, you are working on the assumption that you know what consciousness is. You don't. Remember the story of the blind men and the elephant: each had a different perception depending on which part of the elephant he was touching. Try reading some early 20th- or even 19th-century thoughts on science and see how presumptuous each generation is about what it thinks it knows. We have a very, very long way to go; 1000 years from now we might be a little closer. Lay people often ask me about things like artificial intelligence, and I tell them the key word is "artificial", NOT "intelligence".
-
Exactly my point... how can you tell a broken computer from a conscious one that thinks so differently that it's completely unaware of you?
We can program with only 1's, but if all you've got are zeros, you've got nothing.
patbob wrote:
how can you tell a broken computer from a conscious one that thinks so differently that it's completely unaware of you?
You also stated that we will not be able to recognize it: we don't know it exists, and it doesn't know we exist. Only the first part matters, and from that one can conclude that the question is irrelevant. It cannot impact our lives, by definition, so it can't ever matter. One could just as easily claim that a rock is conscious, and it would mean the same thing.
-
Lee Chetwynd wrote:
believe that it will be possible to store a consciousness electronically.
There are several points. 1. Can it be recorded? 2. What happens after it is recorded? For the first, one must be able to demonstrate that consciousness itself has been recorded, and not just a fixed state. That means that for the second, it must be able to interact with the world in such a way that it is verifiable. And if either is possible, it is a long, long way off.
Lee Chetwynd wrote:
or to develop a completely new conciousness that never existed biologically.
No, that is a different problem. A rough analogy is the difference between recording an artist performing a song and then playing it back, versus creating the song in the first place. Again, if possible at all, it is a long, long way off. And given the lack of real progress in this area in the last 50 years, despite many attempts, I suspect it will never be possible.
-
patbob wrote:
how can you tell a broken computer from a conscious one that thinks so differently that it's completely unaware of you?
It wouldn't think differently if done correctly; that's the point: you wouldn't be able to tell the difference if you didn't already know it was a machine. For instance, the Turing Test. Also, if we can build machines that replicate human consciousness, those machines would be vulnerable to human neuroses too. Hence HAL. Hence broken. Throwing an exception could get a lot more dangerous. For the record, I don't think it will ever be possible to do this, and in any case it's currently nothing more than a fantasy given the current state of technology and our knowledge of the mind. But the attempt can teach us much about ourselves, I think.