honor is flooding a certain thread with information. PLEASE have a look. I think he's providing an invaluable service to all of us. At least he is for me. It starts on page 5 of that thread, all in blue font so far. I wanted to bring attention to his efforts without disrupting what he's doing there.
viewtopic.php?f=7&t=159881
Please view honor's flood
- Jersey Girl
- God
- Posts: 8337
- Joined: Mon Oct 26, 2020 3:51 am
- Location: In my head
LIGHT HAS A NAME
We only get stronger when we are lifting something that is heavier than what we are used to. ~ KF
Slava Ukraini!
-
- God
- Posts: 1803
- Joined: Mon Mar 14, 2022 1:49 am
Re: Please view honor's flood
Did you vet it? Did you even read it all? Who wrote it?
-
- God
- Posts: 4358
- Joined: Mon Nov 23, 2020 2:15 am
Re: Please view honor's flood
Says the guy who used a Joe Rogan rant about tribalism as evidence for an argument.
Cool cool.
https://ethanmilne.medium.com/a-differe ... a39d0cd2c5
A Different Way to Think About Confirmation Bias
Ethan Milne
In his book, The Righteous Mind, Jonathan Haidt describes an odd dynamic that occurs when humans think about new evidence:
“… When we want to believe something, we ask ourselves, “Can I believe it?” Then (as Kuhn and Perkins found), we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks. In contrast, when we don’t want to believe something, we ask ourselves, “Must I believe it?” Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it.”
This happens all the time. You’ve done it, I’ve done it, everyone does it. It’s just how our minds work. But the world has changed faster than our minds can evolve, so sometimes this tendency leads us astray.
Can I Believe It?
This is what most people think of when they say “confirmation bias”. We look for any evidence — no matter how weak — that confirms our prior intuitions. Tim Urban from the Wait But Why blog describes this as thinking like a sports fan:

Where a person thinking like a scientist wants to find the truth even if it contradicts their preferred belief, a sports fan values truth a little bit less and confirmation a little bit more. Note that these are stereotypes: lots of scientists are motivated reasoners, and lots of sports fans, I assume, are rational people.
Must I Believe It?
This is another component of confirmation bias, albeit a less direct part. While we may leap to support evidence confirming our beliefs, we also discount evidence we don’t like.
An example Haidt uses frequently is a study that asks participants to read a study on coffee drinking’s link to bad outcomes. Participants who drank coffee regularly were, unsurprisingly, extremely good at identifying flaws in experimental design relative to their decaffeinated counterparts.
“Must I Believe It” is a mindset that’s very hard to identify as confirmation bias at the beginning. This mindset often comes in the form of what Scott Alexander calls Isolated Demands for Rigour: it’s all well and good to criticize a study for bad methodology, or to point out the flaws of relying on single studies as proof, but if this rigour is only applied to positions you disagree with, it’s probably a case of “Must I Believe It” applied in a biased manner.