Electronic Consciousness?
-
I am curious, from the point of view of people in a programming sort of environment, how many of us believe that it will be possible to store a consciousness electronically. This could mean either storing an existing consciousness (copying or backing up an existing mind electronically) or developing a completely new consciousness that never existed biologically. I'm trying to stay away from the far-reaching philosophical and moral implications of doing either of these things - that's a massive can of worms. I just wondered how many coders actually think it will ever be possible, and how many think it is something that could never be achieved. I think it will be possible.
I view consciousness as akin to an operating system (not Windows 8, hopefully) with memories archived in a WORM-like storage area and accumulated experiences and knowledge executing as programs. Therefore, machine consciousness is possible once the proper computing environment is provided.
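Just for fun, here's how I picture that analogy in code - a toy sketch only, every name below is invented, and it's the OS metaphor, not a claim about how a real mind works:
# Toy sketch of the "consciousness as an operating system" metaphor:
# memories go into an append-only (WORM-like) archive, and accumulated
# experiences run as little "programs" over a read-only view of it.
class MindOS:
    def __init__(self):
        self._memory_archive = []        # write once, read many
        self._experiences = []           # accumulated "programs"

    def archive_memory(self, event):
        self._memory_archive.append(event)   # WORM-style: append only, never rewrite

    def load_experience(self, behaviour):
        self._experiences.append(behaviour)  # behaviour = any callable over the archive

    def tick(self):
        snapshot = tuple(self._memory_archive)   # read-only view for this "scheduler" pass
        return [run(snapshot) for run in self._experiences]

mind = MindOS()
mind.archive_memory("these stairs are steep")
mind.load_experience(lambda memories: f"recalling {len(memories)} archived event(s)")
print(mind.tick())   # ['recalling 1 archived event(s)']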
-
BillWoodruff wrote:
soon as I can define "consciousness,"
Not unconscious. Problem? :trollface:
“Education is not the piling on of learning, information, data, facts, skills, or abilities - that's training or instruction - but is rather making visible what is hidden as a seed”
“One of the greatest problems of our time is that many are schooled but few are educated” - Sir Thomas More (1478 – 1535)
And what of the subconscious and collective unconscious? :rolleyes:
-
The information that you'd have to store would at most be the full state (including quantum state) of every particle that's involved (because that's literally everything that could be involved somehow, even if we don't know how). That would be a problem, but it's very unlikely that you'd need that kind of detail - for example, an MRI of your brain changes a lot of quantum states, but not your consciousness (I hope!), implying that not all state is relevant. Most likely the classical state alone is plenty, and you probably wouldn't even need it in unlimited precision. TL;DR: Yes.
This may give a snapshot of the mind at a certain point in time but I think consciousness goes well beyond this.
-
Lee Chetwynd wrote:
how many of us believe that it will be possible to store a consciousness electronically.
Perhaps eventually, but I would place that as perhaps as much (or as little) as a thousand years from now. The main problem is with the term "consciousness." Consciousness is not a static thing that can be stored in memory cells - consciousness, as I recall it being defined somewhere once, is the self-awareness of constant change, including one's own thoughts. So an electronic consciousness, in my definition, wouldn't just be a snapshot of the thinking and memories of a person at a particular point in time; it would also have to support continually changing perceptual awareness, and that adds another significant layer of complexity to the concept. And again, by "perceptual" I don't just mean the traditional five senses, I mean the perception of self, identity, the thing we are referring to when we use the word "I". Marc
Testers Wanted!
Latest Article: User Authentication on Ruby on Rails - the definitive how to
My Blog
I like your definition. In a text I read on Buddhist meditation, from memory, the author described three aspects of consciousness. Yours closely aligns with the third.
1. An aspect where you just "be", where the mind is clear and you simply exist in that moment, totally absorbed by a task, e.g. meditation.
2. An aspect where you think about and are aware of the present moment - you watch, feel, smell etc. and comment on your existence in real time, e.g. thinking "I'm walking up some stairs; these stairs are steep."
3. An aspect where you are able to reflect on your current or previous thoughts, e.g. "I know I am/was thinking these stairs are steep."
-
I like your definition. In a text I read on Buddhist meditation, from memory, the author described three aspects of consciousness. Yours closely aligns with the third.
1. An aspect where you just "be", where the mind is clear and you simply exist in that moment, totally absorbed by a task, e.g. meditation.
2. An aspect where you think about and are aware of the present moment - you watch, feel, smell etc. and comment on your existence in real time, e.g. thinking "I'm walking up some stairs; these stairs are steep."
3. An aspect where you are able to reflect on your current or previous thoughts, e.g. "I know I am/was thinking these stairs are steep."
Member 9475889 wrote:
the author described 3 aspects of consciousness.
Hmmm, I think that's very close to something I was reading recently as well. And I'm also having this weird deja-vu experience, both with this post and with some code I'm debugging. :~ Marc
Testers Wanted!
Latest Article: User Authentication on Ruby on Rails - the definitive how to
My Blog -
Member 9475889 wrote:
the author described 3 aspects of consciousness.
Hmmm, I think that's very close to something I was reading recently as well. And I'm also having this weird deja-vu experience, both with this post and with some code I'm debugging. :~ Marc
Testers Wanted!
Latest Article: User Authentication on Ruby on Rails - the definitive how to
My Blog
Just a glitch in the matrix :suss:
-
Hi Lee, I'll get back to you on this as soon as I can define "consciousness," but I think that may take more than one lifetime; is this urgent? Yours, Bill
“Humans are amphibians: half spirit, half animal; as spirits they belong to the eternal world; as animals they inhabit time. While their spirit can be directed to an eternal object, their bodies, passions, and imagination are in continual change, for to be in time, means to change. Their nearest approach to constancy is undulation: repeated return to a level from which they repeatedly fall back, a series of troughs and peaks.” C.S. Lewis
BillWoodruff wrote:
I think that may take more than one lifetime; is this urgent?
Are you kidding? Conscious = Not Friday night AND not named Dalek Dave.
I wanna be a eunuchs developer! Pass me a bread knife!
-
The information that you'd have to store would at most be the full state (including quantum state) of every particle that's involved (because that's literally everything that could be involved somehow, even if we don't know how). That would be a problem, but it's very unlikely that you'd need that kind of detail - for example, an MRI of your brain changes a lot of quantum states, but not your consciousness (I hope!), implying that not all state is relevant. Most likely the classical state alone is plenty, and you probably wouldn't even need it in unlimited precision. TL;DR: Yes.
I like that. It would imply we could have a system restore point.
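Out of curiosity, a back-of-envelope guess at how big such a restore point might be if the classical state really is enough. The neuron and synapse counts below are just the commonly quoted ballpark figures, and the bytes-per-item are numbers I made up, so treat it as order-of-magnitude only:
# Rough order-of-magnitude estimate only; every figure is an assumption.
NEURONS  = 86e9   # commonly quoted estimate for a human brain
SYNAPSES = 1e14   # commonly quoted ballpark (often given as 10^14 to 10^15)
BYTES_PER_NEURON  = 64    # made-up allowance for per-neuron state
BYTES_PER_SYNAPSE = 8     # made-up allowance for connection weight and type

total_bytes = NEURONS * BYTES_PER_NEURON + SYNAPSES * BYTES_PER_SYNAPSE
print(f"~{total_bytes / 1e15:.1f} PB for one restore point")   # ~0.8 PB with these guesses
So somewhere in the petabyte range with these guesses - huge, but not absurd compared to a large data centre.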
-
Hi Lee, I'll get back to you on this as soon as I can define "consciousness," but I think that may take more than one lifetime; is this urgent? Yours, Bill
“Humans are amphibians: half spirit, half animal; as spirits they belong to the eternal world; as animals they inhabit time. While their spirit can be directed to an eternal object, their bodies, passions, and imagination are in continual change, for to be in time, means to change. Their nearest approach to constancy is undulation: repeated return to a level from which they repeatedly fall back, a series of troughs and peaks.” C.S. Lewis
Just let my backup know when you have the answer.
-
Well Lee, the moment you make computers feel compassion, you'll be able to store consciousness. Now just solve the compassion thing and you're settled.*
* In other words: never.
Cheers, E
Never underestimate the difference you can make in the lives of others.
You don't need compassion to be conscious; in fact, you don't need any emotion to be conscious. Psychopaths don't have any, and not just the murderous types - I'm sure they are still considered conscious. I'm not keen on having a backup made of my mind that was stripped of emotion, however. But a partial backup is better than none.
-
I believe one day that it will be possible to have a computer approximate a human mind, but in the end, it will be a computer mind, with its own idiosyncrasies, capabilities and limitations. In addition, I believe it will not be possible to "download" a real person into a computer as a way of cheating death, because even if the computer accurately emulates the person's consciousness, the person himself will still die. In other words, the machine will be a copy of the person, not the actual person, so the person will not achieve immortality.
The difficult we do right away... ...the impossible takes slightly longer.
Damn. That's my plan down the drain.
-
This may give a snapshot of the mind at a certain point in time but I think consciousness goes well beyond this.
Of course the stored data wouldn't be conscious, since from its perspective no time passes.
-
You don't need compassion to be conscious; in fact, you don't need any emotion to be conscious. Psychopaths don't have any, and not just the murderous types - I'm sure they are still considered conscious. I'm not keen on having a backup made of my mind that was stripped of emotion, however. But a partial backup is better than none.
[Automatic bot message] I disagree with you, Lee; it is clear that any form of consciousness MUST include emotions. When (if) you manage to describe a feeling without using another feeling, I'd really like to see it. [End automatic message]
Never underestimate the difference you can make in the lives of others.
-
[Automatic bot message] I disagree with you, Lee; it is clear that any form of consciousness MUST include emotions. When (if) you manage to describe a feeling without using another feeling, I'd really like to see it. [End automatic message]
Never underestimate the difference you can make in the lives of others.
OK. That is tough. How about love: a chemical imbalance that makes the solutions to a problem a higher priority than analysing the problem itself, or makes the problem itself irrelevant due to the sudden high priority of being in the close vicinity of another individual. You may have a point. I'm not sure I've done that justice, and don't tell my wife I described love like that - she may be cross at my lack of punctuation. But even if it is necessary to have emotions to be conscious, emotions are just changes in chemical hormone levels. Could that not be replicated with an algorithm on an electronic mind clone?
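And if it really does come down to chemical levels, then in code an "emotion" could just be a number that re-weights priorities. Purely a toy sketch - every name and number below is invented, it's not a model of real neurochemistry:
# Toy sketch: an "emotion" as a scalar that re-weights task priorities.
def prioritise(tasks, attachment_level):
    # attachment_level in [0, 1]: 0 = coldly analytical, 1 = hopelessly smitten
    def score(task):
        base = task["importance"]
        if task["involves_loved_one"]:
            return base + 10 * attachment_level       # their problems jump the queue
        return base * (1 - 0.5 * attachment_level)    # everything else fades a little
    return sorted(tasks, key=score, reverse=True)

tasks = [
    {"name": "analyse the production bug", "importance": 9, "involves_loved_one": False},
    {"name": "climb the steep stairs together", "importance": 2, "involves_loved_one": True},
]
print([t["name"] for t in prioritise(tasks, attachment_level=0.9)])
# ['climb the steep stairs together', 'analyse the production bug']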
-
Of course the stored data wouldn't be conscious, since from its perspective no time passes.
Ah, the passage of time! Could that not be an entirely human-made idea? Could we not live in a universe where everything happened, or is happening, all in one go? Could we not have created an illusion of a linear existence by linking similar quantum states together in an order of most similar to least similar? Anyway, I've deviated from my own question. Could we not just include a clock battery?
-
OK. That is tough. How about love: a chemical imbalance that makes the solutions to a problem a higher priority than analysing the problem itself, or makes the problem itself irrelevant due to the sudden high priority of being in the close vicinity of another individual. You may have a point. I'm not sure I've done that justice, and don't tell my wife I described love like that - she may be cross at my lack of punctuation. But even if it is necessary to have emotions to be conscious, emotions are just changes in chemical hormone levels. Could that not be replicated with an algorithm on an electronic mind clone?
I read an article about someone back in the '60s who claimed that he had managed to programmatically define love; it is clearly visible in this variable:
BitSequence Love = 1101011110001010010101001010010100100010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001010100101001010000000101010010101010011101011110001010010101001010010100010010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001010100101000101000001010100101010100111010111100000101001010100101001010000010101001010101001110101111000101001010100100001001010010001010100101010100111010111100010100101010010100101000000101010010101010011101011110001010010101001010010100000101010010101010011101011110001010010101001010010100000101010010101010011101011110001010010101001010010100100010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001010100101001010000010101001010101001110101111000101001001010010100101000000010101001010101001;
I am not sure I agree, but he has a point there, doesn't he? :~
Never underestimate the difference you can make in the lives of others.
-
Ah, the passage of time! Could that not be an entirely human-made idea? Could we not live in a universe where everything happened, or is happening, all in one go? Could we not have created an illusion of a linear existence by linking similar quantum states together in an order of most similar to least similar? Anyway, I've deviated from my own question. Could we not just include a clock battery?
Maybe. In order for time to pass from the perspective of the stored consciousness, you'd have to emulate the electrical and chemical processes that go on inside a brain. That's not impossible, but it is hard to do efficiently: with current technology, even a small emulated brain would see time passing much faster than normal if it could look outside the emulation (because what's really going on is that it is running more slowly than usual). If it didn't have access to the outside, it wouldn't know the difference.
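To put a toy number on that (the slowdown factor is made up, real estimates for current hardware vary wildly):
# Subjective vs. wall-clock time in a slowed-down emulation.
SLOWDOWN = 1000              # assumed: emulation runs 1000x slower than a biological brain
wall_clock_hours = 24        # one day of real time outside the emulation

subjective_minutes = wall_clock_hours / SLOWDOWN * 60
print(f"Outside: {wall_clock_hours} h pass; inside: about {subjective_minutes:.1f} min are experienced")
# Those ~1.4 minutes feel completely normal from the inside - the mismatch only
# shows up if the emulated mind can see a clock outside the emulation.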
-
Lee Chetwynd wrote:
how many of us believe that it will be possible to store a consciousness electronically.
Perhaps eventually, but I would place that as perhaps as much (or as little) as a thousand years from now. The main problem is with the term "consciousness." Consciousness is not a static thing that can be stored in memory cells - consciousness, as I recall it being defined somewhere once, is the self-awareness of constant change, including one's own thoughts. So an electronic consciousness, in my definition, wouldn't just be a snapshot of the thinking and memories of a person at a particular point in time; it would also have to support continually changing perceptual awareness, and that adds another significant layer of complexity to the concept. And again, by "perceptual" I don't just mean the traditional five senses, I mean the perception of self, identity, the thing we are referring to when we use the word "I". Marc
Testers Wanted!
Latest Article: User Authentication on Ruby on Rails - the definitive how to
My Blog
That sounds promising. It makes me think we would need a huge amount of super-fast, non-persistent memory. The "I" would exist in this memory. If the power was ever removed completely it would cease, unless of course a persistent copy of the entire memory state had been made beforehand.
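That's basically checkpointing: the live "I" runs in volatile memory, and every so often you flush a snapshot to something persistent so a power cut only loses what happened since the last copy. A minimal sketch of the idea (the file name and state contents are just placeholders I invented):
# Minimal checkpointing sketch: volatile working state, periodic persistent snapshots.
import pickle

class VolatileMind:
    def __init__(self):
        self.state = {"thoughts": []}        # lives in RAM only - gone if power is cut

    def think(self, thought):
        self.state["thoughts"].append(thought)

    def checkpoint(self, path="mind_snapshot.pkl"):
        with open(path, "wb") as f:          # persistent copy made "beforehand"
            pickle.dump(self.state, f)

    @classmethod
    def restore(cls, path="mind_snapshot.pkl"):
        mind = cls()
        with open(path, "rb") as f:
            mind.state = pickle.load(f)
        return mind

me = VolatileMind()
me.think("these stairs are steep")
me.checkpoint()                              # snapshot taken before the "power cut"
backup = VolatileMind.restore()              # only knows what was checkpointed
print(backup.state["thoughts"])              # ['these stairs are steep']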
-
I view consciousness as akin to an operating system (not Windows 8, hopefully) with memories archived in a WORM-like storage area and accumulated experiences and knowledge executing as programs. Therefore, machine consciousness is possible once the proper computing environment is provided.
Let's just hope it's not touch screen.