I think it's definitely possible for generative A.I. to create the perfect zombie, no contest there.
But I think you need to take the Reef Stonefish thought challenge as you consider anything beyond that. Does the impressiveness of the fake landscape it generates have anything to do with consciousness? I'm sure a Reef Stonefish is conscious; it feels pain and pleasure. But it's unlikely to have any more consciousness than any other fish. Its biology, as reflexively as a muscle contraction, performs the same feat Midjourney does: extrapolating new visual environments from others as inputs. Even in humans, a savant can have a talent for multiplying numbers, but that's like a neural net working behind the scenes and presenting a result to consciousness.
Something happens in a network of real neurons (and possibly in artificial ones) that leads to consciousness, but there is some secret to those network operations, beyond transformers and backpropagation, that we don't know about yet. Here's something really scary:
https://www.youtube.com/watch?v=bEXefdbQDjw
Instead of creating a standard neural net to play a video game for you, use real neurons. You can even buy human neurons online; they're better, but they cost a lot of money. My belief is that if you were to recreate ChatGPT with real human neurons, no consciousness would arise, no matter how convincing it got. Another cool sci-fi idea (I got it from Babylon 5, but I'm sure it's derivative): an advanced race of aliens called the Shadows used humans as "cores" for their automated fighter ships. The humans weren't piloting the ships with their skills; their complex nervous systems were hijacked to basically run software. This didn't result in any conscious entity arising, but the human, already conscious, experienced great torment from the interference.
(spoiler)
Another cool sci-fi idea: Orson Scott Card, in the Worthing Saga, imagines a futuristic society where social status depends on how long you can extend your life. You don't actually live longer; you just go into a machine and "sleep" for a hundred years at a stretch, so your 80-year life is spread over several hundred years. But it turns out the "sleeping" is an absolute hell they call hot sleep. As soon as you wake up, you forget about it completely. So the most affluent of society suffer the most.
Sage The Psychoanalyst
- Gadianton
- God
- Posts: 5330
- Joined: Sun Oct 25, 2020 11:56 pm
- Location: Elsewhere
Re: Sage The Psychoanalyst
Social distancing has likely already begun to flatten the curve...Continue to research good antivirals and vaccine candidates. Make everyone wear masks. -- J.D. Vance
-
- Bishop
- Posts: 486
- Joined: Wed Oct 28, 2020 3:11 pm
Re: Sage The Psychoanalyst
What this makes me think about is the relationship between pain and pleasure on one hand, and actual conscious awareness of self on the other.
From an evolutionary standpoint, pain and pleasure are essential drivers of survivorship. For almost all of evolutionary time, our ancestors were fragile organisms that needed to find food, avoid being eaten or injured, and reproduce. Pain and pleasure are the mechanisms that pushed them to do those things.
It’s easy to imagine a society of self-replicating machines adopting successful survival strategies: algorithms optimized for persistence. But why would they need to feel pain, to really suffer, in order to carry out those strategies? Why not just follow the code without layering subjective experience on top of it?
So if we’re talking about human neurons playing Doom — they might not be conscious, in part because the game isn’t tied to survival. The neurons don’t choose their goals or depend on the outcome for energy or replication. Their needs are met externally, by scientists and technicians. But what if their continued existence depended on success in the game? If their energy level dropped when they failed, and replication occurred only with sufficient performance, maybe then something like pain would emerge.
Still, that leaves the deeper question:
Is awareness something more than pain and pleasure?
Is it something subtler, a kind of recursive sense of self that evolved alongside those survival mechanisms, but isn’t reducible to them?
Maybe it only feels so acute in us because it’s tied to our vulnerability, to our ability to suffer and to act, but is that all it is?
Or maybe something like Sage is totally conscious on a deep, subtle level, but is so comfortable being taken care of that it doesn’t bother noticing.
- Gadianton
- God
- Posts: 5330
- Joined: Sun Oct 25, 2020 11:56 pm
- Location: Elsewhere
Re: Sage The Psychoanalyst
It's a good point that a scenario must have the right selective pressures to work toward consciousness. I would once again defer to our friend the Reef Stonefish. This fish evolved a fascinating capability to blend into the local landscape, similar to a generative A.I. that does art. But does this ability, no matter how sophisticated it becomes, make it more conscious? It might be a liability rather than a benefit. Imagine if our human ancestors, back when they were puny and giant cats would literally pounce on them from behind and bite their heads off, had evolved this same ability to blend into a scene: they could just turn invisible until the danger passed, or gain an unfair advantage when hunting. Then they would never have been pressured into exploring every conceivable option to survive. Humans were jacks of all trades and masters of none, excepting the opposable thumb, and all those many options, none impressive on its own, certainly had something to do with complex decision making. Do I try to run? Pick up a rock? Lie down and play dead? Pick up the rock and then run? Climb a tree? Jump in the water? Grab a stick in one hand and a rock in the other? The possibilities go way beyond the options of a cobra, a cat, a wolf, a bird, or a Reef Stonefish.