Everyone likes to believe they’re thinking independently. That they’ve arrived at their beliefs through logic, self-honesty, and some kind of epistemic discipline. But here’s the problem - that belief itself is suspiciously comforting. So how can you tell it’s true?
What if your worldview just happens to align neatly with your temperament, your social environment, or whatever gives you emotional relief? What if your reasoning is just post-hoc justification for instincts you already wanted to follow? That’s what scares me - not being wrong, but being convinced I’m right for reasons that are more about mood than method.
It reminds me of how people think they’d intervene in a violent situation - noble in theory, but until it happens, it’s all just talk. So I’m asking: what’s your actual evidence that you think the way you think you do? Not in terms of the content of your beliefs, but the process behind them. What makes you confident you’re reasoning - not just rationalizing?
Everyone likes to believe they’re thinking independently.
Can you elaborate on that claim?
I exercise some critical analysis, but for the most part I just have trust in human ambition. For example: the reason I believe human CO2 emissions are driving climate change is not because I’ve looked at the evidence and evaluated it for myself.
The reason I believe that human CO2 emissions are driving climate change is that it seems to be the consensus of people who have worked hard to impartially develop expertise and gather data to understand climate science.
There are two important systems at play:
1: Scientific research, which harnesses human ambition by rewarding impartial research and discoveries which overturn old assumptions/paradigms.
2: Journalism, which harnesses human ambition by rewarding impartial reporting on various fields of human interest. (Reporting is why it seems to be the consensus of the scientific community)
The impartiality of these systems is (and has always been) under assault by capitalism (which also derives its power from harnessing human ambition), so one must, to an increasing degree, judge how much personal mental effort to allocate to identifying biases in the reporting.
Pick something fringe that you have a belief about, like Bigfoot. Look at the evidence; really look at first-hand interviews. How many witnesses before it becomes plausible? Do those hundreds of people really get off on making things up?
You could do this with anything, even western medicine. It takes practice to figure out which part of your understanding was just accepted as truth, and which part you have evidence for.
The biggest bias that everybody has is thinking that a million people can’t be wrong. That surely some other expert would have discovered it if there was anything there.
I recently heard of some physics-breaking experiment that had been repeated by a few YouTubers: two large torus magnets with opposite poles clamped close together. The assembly supposedly falls slower than an object of the same mass and size.
Self-examination. Question and challenge your beliefs constantly. Especially dig into anything that makes you uncomfortable.
If you’re not growing, your views are not getting closer to the truth. There is no end point, only growth.
It’s funny. I’ve seen research about LLMs “reasoning” and “introspecting” showing that, when you ask them why they answered a question in a certain way, they make up stories that don’t match how their neurons actually fired. A common response in the comments is to triumphantly crow about how this shows they’re not “self aware” or “actually thinking” or whatever.
But it may be the same with humans. There have been fun experiments where people had neurons in their brains artificially stimulated to cause some action, such as reaching out with a hand, and when asked why they did that, they’ll say - and believe - that it was for some made-up reason, like they were just stretching or wanted to pick something up. This happens even when they know full well they’re in an experiment that’s going to use artificial stimulation to make them do it.
I suspect that much of what we call “consciousness” is just made up after-the-fact to explain to ourselves why we do the things that we do. Maybe even all of it, for all we currently know. It’s a fun shower thought to ponder, if nothing else. And perhaps now that we’ve got AI to experiment with in addition to just our messy organic brains we’ll be able to figure it all out with more rigor. Interesting times ahead.
I’m not terribly concerned about it, though. If it turns out that this is how we’ve been operating all along, well, it’s how we’ve been operating all along. I’ve liked being me so far, why should that change when the curtain’s pulled back and I can see the hamster in the wheel that’s been making me work like that all along? It doesn’t really change anything, and I’d like to know.
You might be referring to the split-brain experiments, where researchers studied patients who had their brain hemispheres separated by cutting the corpus callosum – the “bridge” between the two sides.
In these experiments, text can be shown in only one visual field, allowing researchers to communicate with just one hemisphere without the other knowing. The results are fascinating for several reasons, especially because each hemisphere demonstrates different preferences and gives different answers to the same questions. This naturally raises the question: “Which one is you?”
Another striking finding, similar to what you were referring to, is that researchers can give instructions to the non-verbal hemisphere and then ask the verbal one to explain why it just performed a certain action. Since it doesn’t know the real reason, it immediately starts inventing excuses – ones the researchers know to be false. Yet the participant isn’t lying. They genuinely believe the made-up explanation.
As for consciousness, I think you might be using the term a bit differently from how it’s typically used in philosophical discussions. The gold standard definition comes from Thomas Nagel’s essay What Is It Like to Be a Bat?, where he defines consciousness as the fact of subjective experience – that it feels like something to be. That existence has qualia. This, I (and many others) would argue, is the only thing in the entire universe that cannot be an illusion.
You might be referring to the split-brain experiments, where researchers studied patients who had their brain hemispheres separated by cutting the corpus callosum – the “bridge” between the two sides.
Nope, I would have described the split-brain experiments if that’s what I was referring to. I dug around a bit to find a direct reference and I think it was Movement Intention After Parietal Cortex Stimulation in Humans by Desmurget et al. In particular:
the fact that patients experienced a conscious desire to move indicates that stimulation did not merely evoke a mental image of a movement but also the intention to produce a movement, an internal state that resembles what Searle called “intention in action”
I did misremember the fact that they only felt the intention to move, they didn’t actually move their limbs when those brain regions were stimulated.
A related bit of research I dug up on this reference hunt, which I’d forgotten about but is also neat: Libet’s work in the 1980s, which used the timing of brain activity to measure when a person formed an intention to do something versus when they became consciously aware of having formed it. There was a significant delay between those two events, with the intention coming first and the conscious mind only later “catching up” and deciding that it was going to do the thing the brain was already in the process of doing.
As for consciousness, I think you might be using the term a bit differently from how it’s typically used in philosophical discussions.
Probably, I’m less interested in philosophy than I am in actual measurable neurology. The whole point of all this is that human introspection appears to be flawed, and a lot of philosophy relies heavily on introspection. So I’d rather read about people measuring brain activity than about people merely thinking about brain activity.
This, I (and many others) would argue, is the only thing in the entire universe that cannot be an illusion.
You can argue it all you like, but in the end science requires evidence to back it up.
Then what do you mean when you’re using the word “consciousness”? Whose definition are you going by?
Loosely, the awareness of our own actions and the reasons why we do them. The introspective stuff that the research I linked to is about.
The specific word doesn’t really matter to me much. Substitute a different one if you prefer. Semantic quibbling is more of what I leave to the philosophers.
You’re calling it “semantic quibbling,” but defining terms isn’t a sideshow - it’s the foundation of a meaningful conversation. If two people are using the same word to mean different things, then there’s no actual disagreement to resolve, just a tangle of miscommunication. It’s not about clinging to labels – it’s about making sure we’re not just talking past each other.
And on the claim that consciousness – in the Nagel sense – is the one thing that can’t be an illusion: I don’t think you’ve fully appreciated the argument if your first response is to ask for scientific evidence. The entire point is that consciousness is the thing that makes evidence possible in the first place. It’s the medium in which anything at all can be observed or known. You can doubt every perception, every belief, every model of the universe - but not the fact that you are experiencing something right now. Even if that experience is a hallucination or a dream, it’s still being had by someone. That’s the baseline from which everything else follows. Without that, even neuroscience is just lines on a chart with nobody home to read them.
You asked:
Everyone likes to believe they’re thinking independently. That they’ve arrived at their beliefs through logic, self-honesty, and some kind of epistemic discipline. But here’s the problem - that belief itself is suspiciously comforting. So how can you tell it’s true? […] I’m asking: what’s your actual evidence that you think the way you think you do? Not in terms of the content of your beliefs, but the process behind them. What makes you confident you’re reasoning - not just rationalizing?
And I’m answering that. You literally asked for “actual evidence,” and I gave links to the specific research I’m referencing.
I’m not here to argue with you over the meaning of the word “consciousness” when you didn’t even ask about that in your question in the first place. If you think I’m talking about something other than consciousness go ahead and tell me what other word for it suits you.
Introspective narration or metacognitive awareness seems to better describe what you’re talking about rather than consciousness.
The scientific method. Try to falsify your own beliefs.
Having good adversaries helps considerably, but they also have a bad habit of slowly turning into friends.
The best friends I’ve ever had were people I could argue with about anything. Win or lose, we were both better for it.
I’m convinced I just post-rationalize my instinctive behaviour. But my instincts react not only to the external world, but also to my previous thoughts. So my reasoning influences my behaviour somewhat; it doesn’t dictate my behaviour, but it does influence it.
By remaining self-critical. Listening to new ideas. Trying not to hold it against myself when I’m wrong. Also reading, lots and lots of reading. I’ve been reading at a college level since the middle of elementary school, and I’ve used that time to read a ridiculous corpus of books, tomes, treatises and manuscripts. I can even speak and write a little Old Babylonian.
Come back at it in a different mood, time, setting, etc. and run it against multiple people who you know won’t just coddle you about it if you’re really concerned.
Part of human psychology is coping with the world around us, not just learning about it. Our individual natures, moods and pasts will always color our experiences and that’s not necessarily a bad thing; the sooner you accept it’s okay to be human, the more authentically you can work with the quirks that come with it.
I do know personally I can be prone to overthinking and getting fuzzier in my logic when I’m tired, hungry, or overwhelmed. I also know it’s fixable.
On the flip side, I also have a history of trusting/getting anxious about bigger picture things based on precedent rather than imagined ideas (new things don’t bother me, but having to face certain known patterns stresses me to a very bad point).
I factor both of these things in when I get to an almost-crisis point. But at the end of the day, the human perspective is like a prism in water: multifaceted, fluid, and dependent on environment but also sometimes beautiful. :)
I’m constantly in conflict with myself. I always seek out perspectives that challenge my own, not just to understand them, but to see if they break mine. That inner war keeps me from trusting my first instincts too easily. It doesn’t guarantee I’m reasoning, but it forces me to notice when I’m just agreeing with what feels good.
What if your worldview just happens to align neatly with your temperament
I feel like you’re putting the cart before the horse. I chose this worldview because of my temperament, not the other way around. I’ve also revised many of my views because of who I am, and that’s why I’m thankful for being queer, for example, which has opened my eyes to many other experiences.
What if your reasoning is just post-hoc justification for instincts
I’ve been wrong before and I’ve changed those views, and some others are under review. My views aren’t static. I also do many things that go against my instinct all the time. And who’s to say that your instincts are static anyway? Your gene expression changes based on your environment, and by extension, so does your behavior.
I think you’re putting way too much of an existential dread on something dynamic and malleable, and it doesn’t need to be that way.
How do you tell stories without reasoning in the first place?
You don’t.
It’s just not possible.
Everything else just gets layered on over the years.
For me, one of the biggest indicators is that I’ve actually changed my mind on several issues. I even keep a list of things I’ve changed my mind about or been proven wrong on. I don’t resist being wrong – I take pride in it.
Similarly, there are things I’ve changed my mind about and then later changed back to my original position. To me, that signals a certain mental flexibility and openness to new views, which I see as crucial for error correction.
Another thing that comes to mind is that there are topics where my opinions fundamentally differ from those of my peers. That alone isn’t concrete evidence of independent thinking, but at the very least, it shows a willingness to resist conforming under peer pressure.
What if your reasoning is just post-hoc justification
Relax, part of it is :)
The brain is able to do many different kinds of thinking. Reasoning is one of them. Dreaming is another, etc.
When you are thinking logically, then it is reproducible, and it is understandable and verifiable by other people.
When you do scientific research, such verification by others is even required before your results are considered meaningful and true.
That’s what scares me - not being wrong, but being convinced I’m right for reasons that are more about mood than method.
I think that’ll be the case most of the time. The only known way to get a grasp of reality is cooperation, and the agreement to let experiment be the arbiter of truth. Which is difficult, as many systems can’t easily be experimented on in isolation (and you’re left with counterfactuals only).
History’s greatest thinkers struggled with this too. Descartes changed his famous postulate “je pense donc je suis” to “je doute donc je suis” later in life (“I think therefore I am” became “I doubt therefore I am”). I read that as trading reason for emotion as core to his being.