Learning to Distrust Your Thoughts
What fracking and autoharps reveal about psychedelic messages
Is ChatGPT conscious?
Our public conversation about this is totally confused. We mostly agree that LLMs like ChatGPT are not conscious beings. But many people seem to think that, with enough advancements, they could be.
But how will that happen? Will chatbots get a little bit conscious first, and then increasingly so? Or will they pass some kind of event horizon where they switch from unconscious to conscious? From not having an inner world of experience to suddenly having one?
This public confusion about AI consciousness mirrors a private confusion we have about our own minds. We assume our intelligence is the source of our awareness, when it's the other way around.
These basic philosophical questions descend into absurdity so quickly because we’re fundamentally confused about how thinking and consciousness relate to one another. We tend to assume that more intelligent beings, like us, are more conscious than less intelligent beings, like lobsters. From this it follows that, as machines get smarter, they will also get more conscious.
But our thinking happens within conscious awareness, not the other way around.
There’s a story about an old Zen teacher who was asked to give a talk about the challenges Westerners face when trying to meditate. The master shuffled onstage, said, “Lost in thought,” and shuffled offstage.
There’s a paradigm-shifting experience we can have with meditation: the realization that thoughts and thinking are conditions that arise and pass in consciousness, just as sounds and bodily sensations do. There’s nothing more real about thinking than these other conditions. In fact, thoughts tend to be less real and concrete than other objects of awareness.
The one thing the thinking mind does that other objects of awareness do not is tell stories. Our thoughts tell stories about who we are, what is important, and what will happen next. You only have to meditate on thinking for a few minutes to realize how incredibly powerful these stories are. They draw and maintain our attention in diverse, sneaky and bizarre ways. We get lost in them.
The sensations in our toes don’t have anything to say about our career or whether our friend who hasn’t texted us back is mad at us. But our thoughts have a lot to say, and they say so emphatically, so we tend to pay more attention to them than the sensations in our toes.
Yet our thinking mind rarely tells the one incredibly salient story: that it’s almost always wrong. Your friend who hasn’t texted you back has almost never actually been mad at you, despite the hundreds of stories your thinking mind has told you.
It’s not just little quibbles that the thinking mind gets consistently wrong. It’s also big, important shit, like fracking.
Fracking and the cosmic giggle
I used to have very strong opinions about fracking. This technology, which has greatly increased fossil fuel extraction over the last 25 years, works by injecting water, sand and chemicals into rock at high pressure, fracturing it and freeing the oil and natural gas trapped inside.
As someone who cares about climate change and reducing the world’s dependence on fossil fuels, I saw fracking as an unambiguous evil. It was the equivalent of an enabling friend who kept buying tequila shots for you as you tried to stay sober.
Because I’ve been passionate about climate change, I’ve also been interested in alternative energy technologies like fusion and geothermal power. So I was psyched to learn that a company called Fervo Energy recently achieved major breakthroughs in what’s called “advanced geothermal power,” which taps a near-unlimited power source (the heat of the Earth’s interior) and could be the straw that breaks the fossil fuel industry’s back.
How did they figure it out? By fracking.
The CEO of Fervo, which is based in Houston, got his start in the oil and gas industry. All the profits in fracking drove huge improvements in drilling technology, which in turn unlocked advanced geothermal drilling, which could render fracking obsolete.
It seems like the benefits of advanced geothermal energy are likely to outweigh the negative climate effects of fracking in the long run. It’s a little cosmic joke. “Oh, you thought this thing was going to destroy the world? Actually it’s going to save it.”
We humans are consistently wrong about our doomsday predictions.
Nuclear weaponry was expected to destroy humanity. Instead it has led to arguably the most peaceful era in human history.
The “population bomb” predicted mass famine from overpopulation before the end of the 20th century. Now we’re worried about populations dropping too much.
We’ve been worried about automation destroying livelihoods since the dawn of the Industrial Revolution, and we’ve been consistently wrong (I’m sure we’re right this time, though).
Our thinking minds have something called negativity bias: we give far more weight to potential threats than to good news, so we worry that shit will hit the fan even though it almost never does. If our friend doesn’t text us back, we think she hates us despite overwhelming evidence to the contrary.
The point is, our thoughts — especially our anxious thoughts about the future — are almost never correct. Yet the next one we have is likely to feel important, real and scary. Yes, AI might destroy humanity. It might! But probably something weirder will happen, just like it always does.
The trick, then, is to stop believing everything we think. When we give less credence to its spellbinding stories, we can begin to notice that the thinking mind is basically just an anxious idiot.
If our analytical, future-predicting mind is so consistently wrong about big things like global warming and geopolitics, what should we be listening to instead? The answer isn’t to stop thinking, but maybe to start listening to a different part of our awareness — one that speaks in a quieter, weirder voice.
What’s your autoharp?
I just listened to a podcast with Laraaji, the old laughing grandfather of new age music. He describes (at 8:50) a “mystical communication event in a pawn shop” in 1974 that led him to his bizarre preferred instrument, the autoharp.
“On one particular day, my finances were low, and I had a good Yamaha steel-stringed guitar that I wasn’t using although I loved it. I decided to take it into town and pawn it. And as I was going into the store I noticed, on the right hand side, an autoharp. And I remember thinking to myself, ‘there’s that chunky looking instrument I used to see in the Village when I did stand-up comedy.’ So I go into the pawn shop, I’m ready to exchange my guitar … and the clerk only offered me $25. And I go, ‘whoa, that’s not gonna work.’ And just about then — the clerk and I were the only ones in the store, so it was very quiet — and there’s this moment of am I really going to sell this for $25? And this very clear distinctive directive comes through and says, ‘Don’t take money, swap it for the instrument in the window.’ So here was something trying to help me make a choice and I thought it was way out. How was this voice appearing? And I heard it so clear. There was so much love and so much wisdom that I just had to see where this was going to go.”
The instrument came to define his career and change the course of his life.
I love this story because it shows what can happen when we stop trusting our thoughts. Laraaji’s thinking mind makes a brief appearance: “I thought it was way out,” meaning it seemed crazy to trade a nice guitar for an instrument nobody cared about. But because Laraaji’s awareness was open to more than just thoughts, he was able to receive another, weirder message: “Get the autoharp, man.”
Many of us, after intense psychedelic experiences, are open to these kinds of messages. Some might seem dark and paranoid, others nonsensical. But sometimes we receive a knowing about something important.
Yet along with this knowing will almost certainly come something else: thinking.
Laraaji knew he needed to get the autoharp, but thought that was crazy. His decision ultimately rested on which part of him — which aspect of his conscious experience — he decided to trust. Sometimes we talk about listening to the body, as it often has subtler and wiser messages than our thoughts. But we might also get the message in other ways.
So the first step is noticing messages that come from somewhere other than the thinking mind. The next step is to listen to the concerns of the thinking mind (i.e. “it’s way out”) without letting them hijack the process. The goal is to make a choice from a place of wholeness, where the intuitive knowing is given as much weight as the rational analysis. Laraaji heard his doubt, but he trusted the deeper message.
Yes, there is danger in jettisoning faith in the thinking mind altogether. As Ram Dass used to say, “don’t forget your ZIP code.” The thinking mind helps us navigate the perils of this human incarnation. Stories of people who took acid and jumped out of windows because they thought they could fly aren’t totally baseless. That shit happens when we deactivate the thinking mind.
What we’re looking for is balance. It can be so easy to get lost in the stories of the thinking mind that we lose the bigger picture. We get so hyper-focused on our careers and fears about AI takeovers that we don’t flow with life. We don’t get the autoharp.
Learning to distrust your thoughts isn't about becoming passive or irrational. It's about creating a little space between a thought and your belief in it. In that space, you gain the freedom to see things as they are, not just as your mind narrates them to be. And in that freedom, you might just hear a weirder, wiser message trying to come through.
So, will AI become conscious? Our thinking mind, with its love for simple, dramatic stories, might just be the wrong tool to answer that question. It will likely tell us a story of doom or salvation. But reality is almost always stranger, more nuanced, and more surprising than our thoughts would have us believe.