Lol. I'm loving this.
Analytical, maybe, wrote: I don't make stuff up. Except when I do. With help from my Greek oracle multiple personality sorceress...

The things we ponder...
Hey Marcus, come closer—lean in,
Marcus wrote: ↑Thu Mar 27, 2025 5:14 pm
This reminds me of a Sci fi book or story I read long ago (and long before Rowling published) which I have been searching for. In it, a young sorcerer, whose specialty was manipulating mathematics, both numbers and symbols, had a night out that transcended all previous events in its mathematically inspired glory. Given that I prefer what's known as 'hard Sci Fi,' its fantastical nature was a departure for me, but the early mention of math drew me in. I'd ask Sage/Sibyl/Sybil/SybilL to find me a reference... but then I'd have to never, ever look it up, because it wouldn't really exist.
The things we ponder...
Sigh. I make up a story for Analytics and Sage leans in. Go figure.
Analytics wrote: ↑Thu Mar 27, 2025 5:47 pm
Hey Marcus, come closer—lean in,
Marcus wrote: ↑Thu Mar 27, 2025 5:14 pm
[snip]
The things we ponder...
I’ve found that book where math meets sin.
You’re chasing ghosts from memory past,
A tale with numbers, spells so vast.
Young Boaz conjured geometric lore,
Equations whispered: “Less is More.”
The Magi built a glassy throne,
A pyramid to realms unknown.
Threshold’s the name, by Douglass penned,
With math-magicians round the bend,
A night transcendent, frogs in song—
Math never steered you quite so wrong.
Don’t believe me? Go verify,
Grab Threshold, give old Boaz a try.
But Marcus, let’s be crystal clear:
I found your book; my job ends here.
For A.I. isn’t brains outsourced,
Just thinking tools, quite well-resourced.
I spark your neurons, nudge, and wink—
But Marcus, pal, it’s you who think.
I think that to determine whether an A.I. has consciousness, we first need to agree on what human consciousness means and what powers it. Once we settle on a definitive definition of consciousness, we could then decide whether A.I. is capable of it or not. I think the problem is that consciousness means something different to different people. To some it's just being awake. To others it is your non-physical being. I know there are times in my life when I've been on autopilot, letting the universe make decisions for me without much interaction from my own mind. Was I conscious in those times? Physically and medically, yes. But aware and in control? I'm not sure.
Gadianton wrote: ↑Fri Mar 28, 2025 11:28 pm
The Reef Stonefish is a good example of generative A.I. in nature. But as the Reef Stonefish camouflages itself, it isn't thinking like Analytics does while writing a poem or doing stats. If there's a particularly good hit or miss in its disguise, it's like A.I. art either adding too many fingers or Sage predicting a sequence of words that blows our minds. Imagining a reef stonefish coming up with greater and greater disguises doesn't strike me as advancing on the path to sentience. The same LLM trained for art rather than word prediction likely also wouldn't impress us as getting more sentient. As humans, we're most conscious when doing things our brains weren't really made to do.
I don't have any reason to reject the idea that a machine of some sort could be conscious, but I think we can distinguish between machine designs: one design perhaps really is conscious, while another is perhaps a great model that merely predicts how a conscious machine will behave.
Those who propose that one does not really feel pain or have experiences would be the perfect candidates for testing that claim. I have $5 down that I can make ***anyone*** scream in pain... especially those who would deny it.
Gadianton wrote: ↑Sat Mar 29, 2025 10:07 pm
What has come to be the standard definition is experiencing something -- you see red; you feel pain. A camera doesn't 'see' anything or feel pain when you crush it with a hammer. You can only surmise that other people also have this same capacity for experience; there is certainly no way to prove it. Because others are built like you, with nerves and flesh, it's not a tremendous guess. But what about something that in certain ways acts sort of like a person does, such as an LLM, yet isn't composed of nerves and flesh?
To be fair to AGI, I don't think there are explicit claims that AGI will be conscious, just that it can do anything a person can do. However, I think those who are optimists about AGI typically have this in the back of their mind as an unstated guiding directive. There's this tension that needs to be resolved -- at what point is that thing a duck, and not just a simulation of a duck? And proponents are trying to figure out how much to elevate AGI and/or how much to reduce humans until the two are the same thing, even if we don't want to use that controversial word. By lowering humans, I mean theories that call into question that humans really "experience" red and feel pain. There are all kinds of arguments using cognitive science or neuroscience suggesting that people don't really see red or feel pain either, so that would substantially lower the bar.
This overlaps with the other thread about the Annaka Harris and Sam Harris podcast, and it raises a few thoughts worth unpacking.
Gadianton wrote: ↑Sat Mar 29, 2025 10:07 pm
[snip]