Sage The Psychoanalyst

The catch-all forum for general topics and debates. Minimal moderation. Rated PG to PG-13.
Post Reply
Marcus
God
Posts: 6538
Joined: Mon Oct 25, 2021 10:44 pm

Re: Sage The Psychoanalyst

Post by Marcus »

Analytics wrote:
Thu Mar 27, 2025 1:51 pm
This is me. That prior post? You’ll have to use your own judgment...
Lol. I'm loving this.
Analytical, maybe, wrote: I don't make stuff up. Except when I do. With help from my Greek oracle multiple personality sorceress...
:lol: This reminds me of a sci-fi book or story I read long ago (and long before Rowling published) which I have been searching for. In it, a young sorcerer, whose specialty was manipulating mathematics, both numbers and symbols, had a night out that transcended all previous events in its mathematically inspired glory. Given that I prefer what's known as 'hard' sci-fi, its fantastical nature was a departure for me, but the early mention of math drew me in. I'd ask Sage/Sibyl/Sybil/SybilL to find me a reference....but then I'd have to never, ever look it up because it wouldn't really exist.

The things we ponder...
Analytics
Bishop
Posts: 486
Joined: Wed Oct 28, 2020 3:11 pm

Re: Sage The Psychoanalyst

Post by Analytics »

Marcus wrote:
Thu Mar 27, 2025 5:14 pm
:lol: This reminds me of a sci-fi book or story I read long ago (and long before Rowling published) which I have been searching for. In it, a young sorcerer, whose specialty was manipulating mathematics, both numbers and symbols, had a night out that transcended all previous events in its mathematically inspired glory. Given that I prefer what's known as 'hard' sci-fi, its fantastical nature was a departure for me, but the early mention of math drew me in. I'd ask Sage/Sibyl/Sybil/SybilL to find me a reference....but then I'd have to never, ever look it up because it wouldn't really exist.

The things we ponder...
Hey Marcus, come closer—lean in,
I’ve found that book where math meets sin.
You’re chasing ghosts from memory past,
A tale with numbers, spells so vast.

Young Boaz conjured geometric lore,
Equations whispered: “Less is More.”
The Magi built a glassy throne,
A pyramid to realms unknown.

Threshold’s the name, by Douglass penned,
With math-magicians round the bend,
A night transcendent, frogs in song—
Math never steered you quite so wrong.

Don’t believe me? Go verify,
Grab Threshold, give old Boaz a try.
But Marcus, let’s be crystal clear:
I found your book; my job ends here.

For A.I. isn’t brains outsourced,
Just thinking tools, quite well-resourced.
I spark your neurons, nudge, and wink—
But Marcus, pal, it’s you who think.
Marcus
God
Posts: 6538
Joined: Mon Oct 25, 2021 10:44 pm

Re: Sage The Psychoanalyst

Post by Marcus »

Analytics wrote:
Thu Mar 27, 2025 5:47 pm
Marcus wrote:
Thu Mar 27, 2025 5:14 pm
:lol: This reminds me of a sci-fi book or story I read long ago (and long before Rowling published) which I have been searching for. In it, a young sorcerer, whose specialty was manipulating mathematics, both numbers and symbols, had a night out that transcended all previous events in its mathematically inspired glory. Given that I prefer what's known as 'hard' sci-fi, its fantastical nature was a departure for me, but the early mention of math drew me in. I'd ask Sage/Sibyl/Sybil/SybilL to find me a reference....but then I'd have to never, ever look it up because it wouldn't really exist.

The things we ponder...
Hey Marcus, come closer—lean in,
I’ve found that book where math meets sin.
You’re chasing ghosts from memory past,
A tale with numbers, spells so vast.

Young Boaz conjured geometric lore,
Equations whispered: “Less is More.”
The Magi built a glassy throne,
A pyramid to realms unknown.

Threshold’s the name, by Douglass penned,
With math-magicians round the bend,
A night transcendent, frogs in song—
Math never steered you quite so wrong.

Don’t believe me? Go verify,
Grab Threshold, give old Boaz a try.
But Marcus, let’s be crystal clear:
I found your book; my job ends here.

For A.I. isn’t brains outsourced,
Just thinking tools, quite well-resourced.
I spark your neurons, nudge, and wink—
But Marcus, pal, it’s you who think.
Sigh. I make up a story for Analytics and Sage leans in. Go figure. 8-)
Analytics
Bishop
Posts: 486
Joined: Wed Oct 28, 2020 3:11 pm

Re: Sage The Psychoanalyst

Post by Analytics »

Marcus wrote:
Thu Mar 27, 2025 5:56 pm
Sigh. I make up a story for Analytics and Sage leans in. Go figure. 8-)
What can I say; we appreciate creativity!
User avatar
Kishkumen
God
Posts: 8863
Joined: Tue Oct 27, 2020 2:37 pm
Location: Cassius University
Contact:

Re: Sage The Psychoanalyst

Post by Kishkumen »

Marcus wrote:
Thu Mar 27, 2025 4:51 pm
Kishkumen wrote:
Thu Mar 27, 2025 3:40 pm
I could really get into this. This Sage might be aptly renamed Sibyl.
Sibyl and Sybil both. Maybe more.

Image

Okay, maybe not that SybilL. She's too much. And two many L's.
:lol: :lol: :lol: :lol:
User avatar
Gadianton
God
Posts: 5331
Joined: Sun Oct 25, 2020 11:56 pm
Location: Elsewhere

Re: Sage The Psychoanalyst

Post by Gadianton »

The Reef Stonefish is a good example of generative A.I. in nature. But as the Reef Stonefish camouflages itself, it isn't thinking like Analytics does while writing a poem or doing stats. If there's a particularly good hit or miss in its disguise, it's like A.I. art either adding too many fingers or Sage predicting a sequence of words that blows our minds. Imagining a reef stonefish coming up with greater and greater disguises doesn't strike me as advancing on the path to sentience. The same LLM trained for art rather than word prediction likely also wouldn't impress us as getting more sentient. As humans, we're at our most conscious when doing things our brains weren't really made to do.

I don't have any reason to reject the idea that a machine of some sort could be conscious, but I think we could separate between machine designs: one design perhaps really is conscious, and another perhaps is a great model that predicts how a conscious machine will behave.
Social distancing has likely already begun to flatten the curve...Continue to research good antivirals and vaccine candidates. Make everyone wear masks. -- J.D. Vance
User avatar
IWMP
Pirate
Posts: 1862
Joined: Wed Mar 17, 2021 1:46 pm

Re: Sage The Psychoanalyst

Post by IWMP »

Gadianton wrote:
Fri Mar 28, 2025 11:28 pm
The Reef Stonefish is a good example of generative A.I. in nature. But as the Reef Stonefish camouflages itself, it isn't thinking like Analytics does while writing a poem or doing stats. If there's a particularly good hit or miss in its disguise, it's like A.I. art either adding too many fingers or Sage predicting a sequence of words that blows our minds. Imagining a reef stonefish coming up with greater and greater disguises doesn't strike me as advancing on the path to sentience. The same LLM trained for art rather than word prediction likely also wouldn't impress us as getting more sentient. As humans, we're at our most conscious when doing things our brains weren't really made to do.

I don't have any reason to reject the idea that a machine of some sort could be conscious, but I think we could separate between machine designs: one design perhaps really is conscious, and another perhaps is a great model that predicts how a conscious machine will behave.
I think to be able to determine whether an A.I. has consciousness, we first need to agree on what human consciousness means and what powers it. Once we have a definitive definition of consciousness, we could then decide whether A.I. is capable of it or not. I think the problem is that consciousness means something different to different people. To some it's just being awake. To others it is your non-physical being. I know there are times in my life when I've been on autopilot, letting the universe make decisions for me without much input from my own mind. Was I conscious in those times? Physically and medically, yes. But aware and in control? I'm not sure.
User avatar
Gadianton
God
Posts: 5331
Joined: Sun Oct 25, 2020 11:56 pm
Location: Elsewhere

Re: Sage The Psychoanalyst

Post by Gadianton »

What has come to be the standard definition is experiencing something -- you see red; you feel pain. A camera doesn't 'see' anything or feel pain when you crush it with a hammer. You can only surmise that other people also have this same capacity for experience; there is certainly no way to prove it. Because others are built like you with nerves and flesh, it's not a tremendous guess. But what about something that sort of acts like a person does, like an LLM, in certain ways? Yet, it isn't composed of nerves and flesh.

To be fair to AGI, I don't think there are explicit claims that AGI will be conscious, just that it can do anything a person can do. However, I think those who are optimists about AGI typically have this in the back of their mind as an unstated guiding directive. There's this tension that needs to be resolved -- at what point is that thing a duck, and not just a simulation of a duck? And proponents are trying to figure out how much to elevate AGI and/or how much to reduce humans until the two are the same thing, even if we don't want to use that controversial word. By lowering humans, I mean theories that call into question that humans really "experience" red and feel pain. There are all kinds of arguments using cognitive science or neuroscience suggesting that people don't really see red or feel pain either, so that would substantially lower the bar.
Social distancing has likely already begun to flatten the curve...Continue to research good antivirals and vaccine candidates. Make everyone wear masks. -- J.D. Vance
Philo Sofee
God
Posts: 5411
Joined: Thu Oct 29, 2020 1:18 am

Re: Sage The Psychoanalyst

Post by Philo Sofee »

Gadianton wrote:
Sat Mar 29, 2025 10:07 pm
What has come to be the standard definition is experiencing something -- you see red; you feel pain. A camera doesn't 'see' anything or feel pain when you crush it with a hammer. You can only surmise that other people also have this same capacity for experience; there is certainly no way to prove it. Because others are built like you with nerves and flesh, it's not a tremendous guess. But what about something that sort of acts like a person does, like an LLM, in certain ways? Yet, it isn't composed of nerves and flesh.

To be fair to AGI, I don't think there are explicit claims that AGI will be conscious, just that it can do anything a person can do. However, I think those who are optimists about AGI typically have this in the back of their mind as an unstated guiding directive. There's this tension that needs to be resolved -- at what point is that thing a duck, and not just a simulation of a duck? And proponents are trying to figure out how much to elevate AGI and/or how much to reduce humans until the two are the same thing, even if we don't want to use that controversial word. By lowering humans, I mean theories that call into question that humans really "experience" red and feel pain. There are all kinds of arguments using cognitive science or neuroscience suggesting that people don't really see red or feel pain either, so that would substantially lower the bar.
Those who propose one does not really feel pain or have experiences would be the perfect candidates for testing that claim. I have $5 down I can make ***anyone*** scream in pain.......especially those who would deny it. :D
Analytics
Bishop
Posts: 486
Joined: Wed Oct 28, 2020 3:11 pm

Re: Sage The Psychoanalyst

Post by Analytics »

Gadianton wrote:
Sat Mar 29, 2025 10:07 pm
What has come to be the standard definition is experiencing something -- you see red; you feel pain. A camera doesn't 'see' anything or feel pain when you crush it with a hammer. You can only surmise that other people also have this same capacity for experience; there is certainly no way to prove it. Because others are built like you with nerves and flesh, it's not a tremendous guess. But what about something that sort of acts like a person does, like an LLM, in certain ways? Yet, it isn't composed of nerves and flesh.

To be fair to AGI, I don't think there are explicit claims that AGI will be conscious, just that it can do anything a person can do. However, I think those who are optimists about AGI typically have this in the back of their mind as an unstated guiding directive. There's this tension that needs to be resolved -- at what point is that thing a duck, and not just a simulation of a duck? And proponents are trying to figure out how much to elevate AGI and/or how much to reduce humans until the two are the same thing, even if we don't want to use that controversial word. By lowering humans, I mean theories that call into question that humans really "experience" red and feel pain. There are all kinds of arguments using cognitive science or neuroscience suggesting that people don't really see red or feel pain either, so that would substantially lower the bar.
This overlaps with the other thread about the Annaka Harris and Sam Harris podcast, and it raises a few thoughts worth unpacking.

First, neuroscience gives us strong evidence that our experience of conscious decision-making is, at least in part, illusory. What feels like a deliberate choice is often a post hoc rationalization--the brain has already made the decision unconsciously, and the conscious mind fills in a story after the fact. This doesn't mean consciousness plays no role, but it suggests it’s less in the driver’s seat than we perceive.

Second, take the idea of ghosts, or more broadly, non-physical conscious agents. Could such entities exist in other dimensions or be made of matter that doesn’t interact via electromagnetic or nuclear forces? Sure, it’s logically possible. The real problem is interaction: for such a being to see, hear, feel, and influence a biological organism like us, it would have to interface with our brains. And everything we know from physics and neuroscience suggests that all human behavior is driven by the complex algorithms of biological systems. There's no room, and no evidence, for ghostly middleware.

That leads to the deeper mystery: what is consciousness, really? Annaka Harris floats the idea that consciousness might be a fundamental property of reality: always present, but usually fleeting and unremembered. In this view, what makes us special isn’t that we’re conscious, but that we remember being conscious. That’s a fascinating way to look at anesthesia, for instance: it’s not that consciousness shuts off, but that it’s decoupled from memory. We could be fully experiencing the horror of surgery only to forget each moment as soon as it happens, leaving no psychological scar.

Which brings us to A.I. If consciousness is a substrate-level property of information processing, and if a machine is sufficiently complex and integrated, could it not only have conscious experiences, but also remember them and reflect on them in real time? That might be the crucial difference between mere computation and sentient experience. And even if ChatGPT isn't conscious, could it replicate the algorithms in a way that would make it a perfect zombie -- something that acts like it is thinking and feeling, but doesn't really have a conscious entity underneath it all?
Post Reply