Our next hangout will be tense but stimulating.
In some cases this might just be a failure on the speech roll. I've been in this situation a few times: someone asks me a question about my deeply held beliefs, which I would absolutely be able to answer in text given five minutes, about a belief I certainly examined deeply and in some cases even published original research on -- and yet I got verbally stuck, because I can't answer complicated questions when put on the spot without preparation, and what comes out of my mouth is some noncommittal, hand-wavy nonsense, aimed more at getting out of the situation than at explaining my thoughts.
I later became proficient as a teacher and got better at the skill of answering unexpected questions, but most people aren't teachers.
I feel that most people fail Socrates' test for true fulfillment, "The unexamined life is not worth living." We spend too much time examining and being critical of others' lives and too little time examining and changing our own lives.
Spot on. I've often thought that opinions are just fashion accessories. Also interesting is the 'sophistication effect', where smarter people have a harder time changing their minds because they are better at defending their opinions.
Yeah, kinda. I think often, the reason someone thinks of one of their beliefs as "deeply held" is because they are unable to think of any alternatives. They haven't learned or haven't understood alternative framings that would make a different viewpoint possible, and perhaps more plausible. So providing alternative frames can enable them to see the other possibilities.
I know this is mostly about being bolder in challenging other people's beliefs, but presumably it applies to me too: my own deeply held beliefs may not be as well thought out or deeply held as I think they are.
Great piece.
You have to be careful whenever you start believing something because someone else wanted you to believe it. It's easy to get tricked, to become part of a social group that filters things out in a way you can't directly observe and compensate for, and end up believing crazy things while thinking that you're extremely rational.
If I understand correctly, by ‘deeply held’ you mean ‘strongly held’. Either of these is often confused with beliefs held in such a way that the holder becomes angry when someone challenges them. Surprisingly, that is often a sign of a ‘weakly held’ belief. When such beliefs are tied to group membership and the holder's credentials are suspect, it makes sense for the holder to respond aggressively when challenged.
On the other hand, imho truly strongly held beliefs are probably such that when challenged, the holder would be genuinely confused. If someone were to tell me that the Earth is flat, I would probably try to work out whether they were telling a joke, making fun of something, or trying to explain something difficult like relativity. I would find it genuinely difficult to believe that someone could truly believe that the Earth is flat.
I've experienced this relative to my own beliefs. It's something like having a shelf full of books you haven't read but that represent how you want to perceive yourself and how you want to be perceived. It reminds me of psych studies I've read about post-hoc rationalization of different instinctual reactions. Also, I noticed that when you shifted to 'applications' you described other "minds [you] could have changed" -- but does this realization cause you to question your own beliefs?
Reminds me of this: https://pubmed.ncbi.nlm.nih.gov/23620547/
Nice. It reminds me of CBT (cognitive behavioral therapy), in the best way.
There are apparently a lot of similar-ish studies, too.
Whether or not this is true, using it to dismiss an argument would be bulverism, and also ‘rude’. My policy is to engage with any argument on its own terms; I can decide later, based on psychological motivations, whether I want to take it seriously or not.
I enjoyed this piece, but I found the explanation for why people hold superficial beliefs to be lacking and too cynical.
What if beliefs aren't held at all?
You've noticed people can't defend their "deeply held beliefs" when you probe them. You've correctly identified they're performing association rather than reporting conviction. But there's a more fundamental explanation:
Beliefs aren't possessed. They're calculated.
When someone expresses a belief, they're not retrieving a stored conviction from memory. They're running a real-time computation: "Given my current context, social position, and goals, what belief-expression maximizes my reward?"
This explains what you observed:
- Why people are hand-wavy under questioning (they haven't cached justifications, only learned the expression)
- Why "deeply held" beliefs change rapidly when context shifts (the calculation inputs changed)
- Why beliefs correlate so strongly with social environment (the calculation is primarily social)
The calculation produces the experience of conviction.
People aren't lying when they say something is deeply held. The subjective experience of certainty is the OUTPUT of the calculation - "this belief-expression is highly valuable to me right now" - not evidence of examination.
This has implications beyond pedagogy fads:
If beliefs are calculations seeking social rewards, then belief manipulation isn't about argumentation - it's about engineering reward structures. Change what's socially costly vs. rewarding, and beliefs recalculate automatically.
This is how normal people adopt totalitarian beliefs without cognitive dissonance. They're not abandoning examined convictions - they're recalculating based on new social physics.
You've discovered something more fundamental than you wrote.
The "boring follow-up questions" work because they interrupt the calculation - forcing explicit justification the system never generated. But the deeper pattern explains how billions of humans coordinate through narrative rather than truth, and why that becomes civilizationally dangerous when technological systems manipulate the calculation at scale.
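The model described above can be sketched as a toy program. Everything here is illustrative: the reward function, the contexts, and the candidate expressions are assumptions invented for the sketch, not anything from the post -- the point is only that an argmax over social reward reproduces "beliefs recalculate automatically when the context shifts."

```python
# Toy sketch of belief-expression as a real-time calculation:
# pick whichever expression maximizes social reward in the current context,
# rather than retrieving a stored conviction.

def expressed_belief(context, candidates, social_reward):
    """Return the belief-expression with the highest reward in this context."""
    return max(candidates, key=lambda e: social_reward(e, context))

def social_reward(expression, context):
    """Hypothetical reward: agreeing with the local group pays off."""
    return 1.0 if expression == context["group_consensus"] else -1.0

options = ["policy X is good", "policy X is bad"]
office = {"group_consensus": "policy X is good"}
home = {"group_consensus": "policy X is bad"}

# Same person, same options -- the "belief" flips with the social context.
print(expressed_belief(office, options, social_reward))  # policy X is good
print(expressed_belief(home, options, social_reward))    # policy X is bad
```

In this framing, a "boring follow-up question" asks for a justification that the argmax never computed, which is why it stalls.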
I think we sometimes misinterpret what a “deeply held belief” really is. Responses to this question usually pertain to people taking a side on a popular debate (e.g. god exists, the education system is fundamentally flawed, climate change is real) as opposed to a forward-looking view of the world (e.g. we WILL have ___ + some assemblage of facts to support this). I'm not really sure it's a deeply held “belief” if it's just a regurgitation of someone else's thinking.
Yeah, this makes sense, assuming (which I do) that humans are to a very high degree status-seeking animals. I try to keep my identity (and my terminal preferences) small, but I think I have thought deeply and could defend well my most cherished one, i.e., truth-seeking.