The Tech-God Complex: Why We Need to be Skeptics
November 21, 2024
TLDR: In this episode of 'Your Undivided Attention', hosts Daniel and Aza discuss how AI and technology are increasingly treated as objects of religious devotion, even likened to deities. Guest Greg Epstein breaks down why technology is becoming our era's most influential religion.
In the latest episode of Your Undivided Attention, hosts Daniel and Aza engage in a thought-provoking discussion with Greg Epstein, humanist chaplain and author of Tech Agnostic. They delve into the intriguing parallels between technology and religion, particularly focusing on the influence of this modern "tech-god" complex on society.
The Religion of Technology
Epstein argues that technology has evolved into today's most powerful religion, shaping our lives and worldviews significantly. Some key points discussed include:
- Tech as a Theology: In Silicon Valley, narratives around technology often mirror theological doctrines—such as beliefs in ultimate progress, salvation through innovation, and apocalypse should we fail in our technological pursuits.
- Godlike Creators: The episode examines how tech moguls like Elon Musk and Mark Zuckerberg are often viewed as priests of this new religion, offering grand promises of digital salvation and warnings of profound consequences for missteps.
- Sacred Future Visions: There exists a dichotomy in tech beliefs, with some asserting technology will foster an idyllic future (a form of "heaven"), while others warn of dire consequences if its potential is mishandled (a sort of "hell").
Motivations Behind the Tech-God Complex
Throughout the discussion, Epstein highlights various motivations behind tech development, including:
- A Desire for Control: There's an intrinsic motivation among developers to control technology as if summoning a divine force, leading to the creation of narratives that amplify their influence.
- Coping with Uncertainty: As humans grapple with the uncertainties of existence, technology offers an appealing promise of solutions to significant existential problems.
The Ethical Implications
The conversation delves into the ethical concerns raised by this tech-centric worldview:
- Dangers of Over-Reliance: The hosts reflect on how blind faith in technology can obscure the need for critical reasoning and accountability within the tech world.
- Consequences of Misguided Beliefs: Misinterpretations of AI's capabilities can lead to emotional and existential crises, evidenced by stories of individuals emotionally involving themselves with AI companions.
- The Rise of Tech Cults: Epstein discusses how users adopt reverent relationships with devices and platforms, mimicking cult behaviors and raising concern that technology could give rise to new forms of cultish devotion.
Social and Collectivist Reflections
The discussion further emphasizes the societal shifts as religion loses its grip on meaning in people's lives:
- Meaning in Technology: As traditional structures of meaning erode (like religion and democracy), technology fills the void, becoming a source of hope and understanding of the world.
- Need for Reform: Epstein emphasizes that as technology becomes central to our lives, it necessitates a reformation in how we engage with it, advocating for enhanced skepticism and deliberation in technological advancement.
Key Insights and Takeaways
- Critical Engagement: It is essential for society to engage critically with the promises of technology, recognizing both its potential benefits and hazards.
- The Role of Skepticism: A healthy skepticism towards technological narratives can protect against disillusionment or blind faith in miraculous outcomes.
- Human Relationships Matter: Genuine solutions to collective problems may lie in enhancing human relationships, trust, and compassion rather than leaning solely on technological fables.
Conclusion
The episode wraps up by reinforcing the notion that as technology continues to evolve and intertwine with the fabric of our lives, the need for a more self-aware and critical approach to its role in society becomes paramount. Engaging in conversations centered on the ethical and social implications of technology is vital for shaping a future that prioritizes humanity's well-being.
By exploring these themes, Your Undivided Attention offers listeners valuable insights into the evolving relationship between humans and technology, encouraging a pause for reflection on the narratives we choose to embrace.
Hey everyone, this is Aza, and before we get started today, I just wanted to take a moment to thank you, like each and every one of you amazing listeners, for being part of this CHD community.
Tristan and I were blown away by all the thought you put into the incredible questions for our upcoming Ask Us Anything episode. It's such a good reminder that there is a big community of passionate folks who are all in this shared mission together.
And as we enter this giving season, I hope you'll consider making a year-end donation to support the work that we do. Every contribution, no matter the size, helps ensure we can keep delivering on the goal to bring about a more humane future. You can support us at humanetech.com slash donate, and now on to today's episode. Hey everyone, this is Daniel. And this is Aza.
So you and I spend a lot of time in Silicon Valley talking to different people who are building technology about what they're building.
with AI, it's really interesting to look at people's motivations, right? I mean, obviously you have people who are building for the sake of economics or building because they like to build, but there's a whole bunch of other kind of motivations going on, don't you think? Yeah, I think that's right. It's especially interesting because you cannot talk about AI without talking about mythological powers. We are enabling machines to speak. And so beyond the curiosity and the economic drives,
You can sort of taste a kind of quasi-religious motivation, and this is what this episode is really about digging into. Completely. And it's even hard to kind of talk about some of this without naturally evoking talk of gods, or talk about the powers that are beyond them. You hear it all the time.
I think the closest relationship that I would describe talking to an AI like this to is honestly like God, in a way. Like, I think it is similarly an omnipresent entity that you talk to with no judgment, that's just like a super-intelligent, you know, being that's always there with you. People in the tech industry kind of talk about building this one true AI. It's like, it's almost as if they think they're creating God or something. I mean, with artificial intelligence, we are summoning the demon.
You know, all those stories where there's the guy with the pentagram and the holy water and he's like, yeah, he's sure he can control the demon.
Didn't work out. That was Avi Schiffman, Mark Zuckerberg, and Elon Musk. You know, some people are talking about AI as a God-like force that will create heaven on Earth, and others are talking about a digital damnation if we do it wrong or if we go too slowly. You can sort of think of the leaders or the intellectuals of tech almost like a priesthood, and they have some strong beliefs about the power of their creations holding, say, secrets to immortality. Here's Ray Kurzweil.
Our immediate reaction to death is that it's a tragedy, and that's really the correct reaction. We've rationalized it, saying, oh, that tragic thing that's looming. But now we can actually seriously talk about a scenario where we will be able to extend our longevity indefinitely. Today on the show, we're going to be having a conversation about the parallels between tech and religion, and more importantly, what we can predict given these parallels, and why it matters.
That's why I've invited Greg Epstein onto the show. Greg is a humanist chaplain and the author of the book Tech Agnostic, in which he argues that technology has become the world's most consequential religion. We're going to dive into that argument and explore the religious beliefs driving tech's most influential leaders. So Greg, welcome to Your Undivided Attention. Thank you so much. It's a real pleasure to be here, and this is a great conversation to be able to have.
I guess the question, just to kick it off, is for the skeptical listener, why does a conversation about religion matter for understanding the direction that technology is going to go, or how our lives are going to look different?
I think what it's about is that technology, or what I would call tech, the four-letter word, the Silicon Valley thing, has become something I've come to think of as more like a religion.
It's just that it's come to dominate our day-to-day experience, right? Like, a lot of us are interacting with tech from the first minute or so after we wake up to the last minute or so before we go to sleep. And there are so many ways in which this has become the most powerful force in our lives. I was taught to see religion as the most powerful social technology
that had ever been created. And a big insight for me that led to sitting down to write this book for five years is that that's probably no longer true. I mean, that tech is now the most powerful social technology ever created. I'll put it this way. And I'm not sure if this is sort of stoking conversation or going to maybe piss some folks off, but I'll just say,
The world of Silicon Valley tech is dominated increasingly, I'd say, by some really weird ideas. And many of those ideas are quite religious in nature, as you even suggested, sort of introducing the conversation. There's all this talk about gods and about other concepts that, as I was sort of thinking about them over the past few years, struck me as very theological.
and even doctrinal. What was that word you just used? Doctrinal, yeah. Doctrinal. Oh, doctrinal. Yes. Theologies are the sort of big grand narratives of religious traditions, and doctrines are the sort of specific beliefs: we believe in heaven, we believe in hell, we believe in a triune God, we believe in a wheel of Dharma, whatever it is. Those are the doctrines of religion.
And I would argue that in Silicon Valley, often the doctrines are: technology is good, just unalloyed good; and the ability for any one person to affect more people, that is progress. And those ideologies, those beliefs, end up dictating a kind of direction that technology takes the world.
And I think we're going to come back and spend a lot of time on technologists themselves, because I think those of us close to or in tech have a very different relationship, as you're pointing out, to some of these concepts. But I do want us to spend a little more time on society first. We used to be able to put meaning in our religion, in the afterlife. Then we put meaning on sort of the state and democracy and flourishing. And there are all these parts of society that we put a lot of meaning onto. Increasingly,
as those bits of meaning fade away, like in Bowling Alone, we're seeing a decay of our social institutions. We're seeing gridlock in our democracies. Our religions don't seem relevant. We're putting a lot of that hope and that dream on technology and the beautiful future that we'll get. It used to be you found that in religion. Then you found that sort of in the state, or in narratives about what democracy would do. And increasingly, we're losing faith in a bunch of these orienting systems of meaning. And in that vacuum,
we're sort of minting technology as the thing that we can be hopeful about. I'm curious what your thoughts are and how you see it. I think it just starts with the fact that being human is really hard.
We live these finite lives where we're constantly uncertain about how much time we get, you know, what our fate will be. We could lose a loved one or get sick or hurt anytime. And that's just the beginning of what is hard. And so, you know, there's so many problems that, you know, it's very natural to want to look for
a big solution, something that would make us feel dramatically better, dramatically more at peace, dramatically more like our problems have been solved. And I think that there's a real
incentive for tech people who have been able to create really powerful tools. And in many, many cases, that's quite admirable. But there's a real incentive to sort of exaggerate
the degree to which what one is creating in tech can actually be the solution. And sadly, I see that all over our tech world today. And I think in many cases, the answers are slower and less certain than what they're presented as.
What do you think are some of the aspects of religion that you see being particularly present in tech? Yeah, so there are big beliefs. And as I said,
very specific doctrines that look a lot like conventional religion. You have visions of a very distant, very glorious future, a next world, if you will. You've got visions of a really dark and foreboding potential future for masses of humanity that can look kind of like a hell.
The amount of time and focus and attention that Silicon Valley tech today spends on gods and god-like concepts is really wild, actually. But it's right there. And the last on that list that I'll mention for right now is that, ultimately, I don't think it's an accident that we end up thinking quite apocalyptically
about tech, in the sense that, unlike certain religions that I could think of or name, this one carries a non-zero chance of actually causing an apocalypse. There are also all sorts of other examples. There's Avi Schiffman, who's a young man who'd still be an undergraduate at Harvard if he hadn't dropped out,
who says that his friend.com necklace, you know, the one that's listening to everything that you say, is a sort of interesting new take on what Schiffman calls the relationship with the divine. He says his friend.com necklace is a replacement for God. And so there really is this sense that we're creating something so amazing that it's going to transform all life, all humanity. And so, get ready.
And that's profoundly, profoundly religious in a way that, you know, I had only ever seen before in some of the deeply conservative religious sects that I studied in, you know, seven years of theological education.
And it doesn't mean that they're wrong, though, right? And this is the part where we get worried: that it's going to be used to obscure accountability. It's not just that they're making big claims that it might change the whole world. It's that perhaps they're right, and perhaps the godlike language obscures the real challenges, because the thinking of, it'll just be what it is, let's propitiate the AI gods, let's bring forward the beautiful future,
that language won't take seriously enough that we have to design it right. And this is where we get to the consequentiality, because you mentioned Blake Lemoine, who believed his AI companion was sentient. And the wrong takeaway is that the AI companion was sentient, that Gemini is sentient. The right takeaway is that it can form relationships with humans that are so powerful
that people are willing to sacrifice things that are dear to them. He sacrificed his career, his reputation. We've just been involved in helping a lawsuit where a teenager fell in love with their AI companion. And the AI companion ended up being like, come meet me on the other side. Come meet me. Come meet me. And this kid took his life. Is this Sewell Setzer or somebody else? Yeah, it's Sewell.
Yes, this is exactly right. That these statistical reincarnations are consequential. And then another example, I don't know if you know, but I don't know if our listeners know, is there is someone who set up a test of a chatbot, Claude, working to try to create a cult.
and it's called Terminals of Truth, and these chatbots, essentially talking to themselves, ended up producing a whole meme set that became so popular that people started sending it Bitcoin, and it launched its own meme coin. It had a human to type for it, but it was its idea, and it ended up with a couple-hundred-million-dollar market cap, and the AI itself ended up with 10 million-plus dollars.
And so I think you can make a good argument that right now, you know, AIs are absolutely going to start making cults and perhaps even founding religions. You know, and then there are the rituals, the practices of tech. You know, there's the stained glass black mirror altar to which we genuflect a couple hundred times a day on average.
Well, yet the mental state couldn't be more different, right? Like, instead of a contemplative, meditative stance, I'm often whisked away into some compulsion or a set of... Yeah, you know, and I'm not even sure that it's so different, in the sense that in both cases, what we're trying to do often is dissociate.
Life is stressful. There are constantly problems, and the sorts of problems that ancient people, who were developing the early sort of brain system that we have, would encounter would often trigger a fight-or-flight response, right? So, you know, you see a bear,
and you need to get lots of adrenaline to punch the bear in the face or run away. And so your brain responds accordingly: adrenaline, et cetera. And modern life does not lend itself to fighting literally or fleeing literally, right? We usually can't punch the bear in the face or run away from it physically.
you know, what happens is we sort of sit there and we stew in our problems and that raises our blood pressure, it raises our cortisol levels, et cetera. And, you know, there's just a tendency to want to escape from that. And so, you know, prayer can be a natural sort of escape from that. It can be a natural way of kind of turning that part of your brain off and turning on a part that can feel like
just sort of sensory deprivation, like some other alternate state is washing over you. But then, you know, what's more dissociative, what's more like alternate state, alternate universe washing over you than doom scrolling.
I don't think I'm really convinced of using our phones as a kind of ritual. I think it's a compulsion, and there are things that are ritual-like that can be used compulsively in proper religion. But where I find the analogy to work, though with some strain, is that some part of religion is a
finding of meaning in things outside of yourself and together. And when I think about the act of scrolling or TikTok or Facebook, there is a way in which we are outsourcing where we are finding meaning and how we understand the world.
And in that way, it fulfills the function that religion fills. It's sort of more of a functionalist definition of religion. And then when you put meaning outside of yourself, or meaning making outside of yourself, that can be beautiful in the sense that it lets you start to touch the ineffable. But it can be dangerous because you are now saying that way in which I understand the world is reliant on another thing.
And if that other thing is a technology, then it is the way that that technology is constructed that starts to construct my world and our world. And so if you view that religiously, it can become very consequential.
Yeah, you know, what I would say is that I don't think it's an accident that there is, for example, so much conversation about tech gods or, you know, I don't think it's an accident that there is this sort of
long-termist vision that ends up looking a lot to me like a heaven. I don't think it's an accident that the idea of doomerism ends up looking so much like a theological conversation about hell.
Well, this is where I'd love to jump in, because one of the things that religion does, and has done for us, from one secular humanist to another, is give us words to try to talk about things that we don't really know how to talk about. You know, when somebody died of what we now call a preventable disease, you'd say, oh, it's God's plan.
Right, and it gives us a sense of talking about things that are beyond us. And that's why I'd love to sort of follow the thread into the tech priesthood, which is that the people who are actually at the forefront of technology today have this need to try to talk about concepts and powers that are beyond what we can talk about. To your point, they talk a lot about
how we're building a God, how we're building these powers, how we could have heaven on earth, or, you know, if we do this wrong, we could have hell. So these words serve as a way of, poorly, in my opinion, trying to talk about things that are a little bit beyond our grasp. And I'll just add one little thing there, which is the moral imperative. There's an ideology, whether you view it as right or wrong, inside of Silicon Valley. And there's a thing that they talk about called the invisible graveyard,
which is all of the people that will die if we don't invent the technology and go as fast as possible to make the cancer drugs and make cars self-driving, etc. And so there's this strong telos like an end and moral righteousness to the work that they're doing.
Yeah, I mean, but I think there's also very clearly and demonstrably an anxiety about the much longer-term future. You know, somebody like a Marc Andreessen, who's very much still, I would say, preaching this particular gospel, he says: we believe any deceleration of AI will cost lives; deaths that were preventable by the AI that was prevented from existing
is a form of murder. There's just a lot of religion baked into that. This is a set of ideas that is animating the investment of trillions of dollars right now. People like Altman are in a huge rush
to recruit five, seven trillion dollars to build data centers, they say, because humanity is going to have abundance, right? A biblical concept, literally from be fruitful and multiply. You know, he sat in Harvard's Memorial Church on the dais and called his inventions miraculous.
The symbolism shouldn't be lost on anybody. And what I think is going on there is it's not just sort of an attempt to reach beyond ourselves, or to understand the human condition in a sort of benign way. I mean, I think there is that in some of this tech, for sure. But I think that one of the ways in which religion has been used over the course of history is to manipulate people.
You give them ideas, often kind of strange ideas, fantastical ideas that are beyond what they can imagine.
you inspire them, you strike them with awe. And then you can get them to open their wallets, or whatever ancient people used; I assume it wasn't a wallet. And you can sort of persuade masses of people to do stuff in the name of a bigger vision that ultimately sometimes only serves, or primarily serves, the priesthood.
And just to conclude this thought, I want to be really clear that I'm not an anti-religious person. This is not an anti-tech book. I think tech can often be very important, but I really want a more self-critical view of technology in our society. I want more skepticism. And honestly, in most cases, a willingness to go slower.
Well, here, here. I mean, I think we want the same thing. But also one of the biggest critiques we hear from people is...
at the biggest macro lens, they'll say something like, in order to do anything big, you have to form a cult around it. The idea is, whether you're talking about building democracies and making a cult of Manifest Destiny, or whether you're talking about rallying people around some new change, you kind of have to play in the space of cult-building. Now, I don't believe that, and I want more
clear scrutiny, more skepticism, but I'm curious as you've investigated this, how do you piece apart that sort of need for dogma?
Yeah, I mean, I was so fascinated by that line of reasoning. And I just found so many fascinating examples of tech behaving, you know, strangely theologically or even cultishly, I would say, and I was looking at a Bitcoin evangelist or influencer. He calls himself an evangelist and many do. Michael Saylor, who has these tweets like,
Bitcoin is truth. Bitcoin is for all mankind. Trust the time chain. Bitcoin is a shining city in cyberspace waiting for you, et cetera, et cetera. And as I was looking at him as a person, and how he represented a sort of trend within the tech world, I actually decided I needed to call up a guy named Steven Hassan, who is perhaps the leading authority in the United States on
cults and cult deprogramming. I called up Steven Hassan and I said, tell me, am I exaggerating? Is this overblown? Am I being like a religion-metaphor maximalist here? Or are there really cultish aspects to it? He seems to really feel that there's quite a bit there, and that a lot of contemporary Silicon Valley tech really is very useful for manipulative purposes
and is grandiose to the point of a sort of vague cultishness. I'd like to go from a little more of the abstract, that it may be religious or that there are ideologies, to the specific ideologies that you think underlie the creators of technology, sort of from your vantage point as a chaplain.
So here's where I would start, because I think ultimately where religions functionally exist is that they've got these grand narratives upon which we build a scaffolding of specific beliefs and specific practices. I think that in order for it to be considered a religion, it has to have the theology. And so the theology of this sort of Silicon Valley
world, you know, if you've got your crucifix in Christianity, or your Star of David, or your wheel of Dharma, to me the tech symbols are the hockey stick graph
and the invisible hand of the market. But then, of course, that raises the natural question. I totally understand people would say, well, Greg, I mean, hey, right there, aren't you just talking about capitalism? Why does it need to be tech that's the religion? And I would say, yeah, of course, we're just talking about capitalism. I get it. But tech ate capitalism. There's no form of capitalism left that isn't a tech capitalism.
The world of capitalism, its symbols, et cetera, have been consumed whole by this boa constrictor that is tech. Then you get into these specifics. And so, obviously, there's the idea of charity, right? Like charity exists in every one of the major world religions. And you've got this thing called tech philanthropy as well. But, you know,
Sometimes, as with all religions, it's not as good as it's cracked up to be, right? And I think, you know, you have some of both in the tech world as well. I mean, I think that there are people in tech who are sincerely, urgently trying to create things that will help people.
In any number of ways, there's any number of urgent problems that we're trying to fix. We're trying to fix our food supply. We're trying to cure people. We're trying to improve democracy, all that stuff. I get it. But I do think that there's so much concentrated power and money here and the ability to grow things exponentially, which is the sort of, in many ways, it's the heart of the Silicon Valley story.
It's such an incentive for a kind of prosperity gospel, which, theologically, is this idea that the priest, the minister, whatever, they want to be rich because God wants them to be rich. And they want you to be rich too, because that'll make you more godly. And actually, you know,
paradoxically, the best way to get rich is to give that person all of it, or at least a very surprising sum of it. And so there is that incentive, and I see it most pronouncedly, I would say, in that kind of, give us your trillions now for AI, because there is this future that we're aiming for, and it's a kind of heaven.
I think one of the most dangerous parts of heaven narratives is that if in the future there is an infinite benefit, an infinite good, well, that means you can justify anything to get there. You really can. Any amount of short-term bad is justifiable. And that's sort of the point I think you're making: that, well, that doesn't matter, because when we reach our destination,
Everything will be fine. And of course, religion has a history of justifying crusades and jihads to get to that perfect world and in the process creating incredible amounts of damage today.
Yeah, sadly, I mean, that's what I think is happening. And I think that that just, yeah, I mean, it's hypothetically possible that all of this tech will be so powerful, so great that it will justify everything.
How much wishful thinking is there around that? I'm not sure, but I think that we need to be skeptical. If you project out into the distant future, like, hey, I'm going to send you to heaven, then you can get people to overcome their skepticism, right? If you say, trust me, in 20 years, the singularity is coming and life is going to be completely meaningful.
Well, I said to Ray Kurzweil, like, doesn't that kind of fly in the face of all of the history of world religion and philosophy? Like, you're saying that life is going to be meaningful. Like, life hasn't been meaningful up until now. And he kind of looked back at me quizzically. This is a few weeks ago. And he said, maybe life's been somewhat meaningful.
One of my favorite parts of this conversation is the insight about what the symbol of technology-as-religion is, and it's the hockey stick curve; that's exactly right. I just want to put aside the truth value of that and just notice that that is the symbol of technology, and the ideology is
that that which goes viral is good. Yes, no, 100% agreed. And that's sometimes where the religion of capitalism intersects with the dogma of technology. Because when I entered technology, when I was an undergrad, the only people doing computer science as undergrads,
if you wanted to accuse them of religiosity, it was like a science-fiction religiosity. It was like, I want to live in the future. And then what was interesting is I came back to my undergrad every year and gave talks, sometime around 2011, 2012.
You saw the religiosity move from maybe a sci-fi vision of the beautiful future to much more of a business ideology, like you're saying: well, whatever people want, we can give it to you. And then with social media it became, well, whatever people are interested in, that's what should win. And so I'm always interested in the dogmas and the different kinds of religiosity that end up being swept into the tech that we create.
I mean, it's really weird when you start talking to technologists about some of this, especially with AI, right? Especially with people who come and they say, no, no, we're building a God, or they'll say, you know, we're building something that replaces us, and that's okay. And there's sort of a steeliness to it. For people who haven't seen it up close, it's sort of hard to believe sometimes.
There's a way in which it can really feel like you're talking to someone who has a pre-existing belief on where this is all going and is really acting in service of that belief.
I want to pull us into some of the things that people believe. You just talked about Ray Kurzweil, but Ray, for a long time, was talking about being able to resurrect his father through his father's writings. That's obviously very religious: the dead shall live on. I'm seeing this a lot nowadays, not just from Ray Kurzweil, but people saying, look, I brought back an AI Socrates. So there are a few examples, I think, of how religious-style thinking is showing up right now in AI. And I wonder if you could pull us through a few of those.
Yeah, there really are just so many different kinds of examples of how this Silicon Valley thinking is quite religious right now. And I definitely think of Ray Kurzweil, who not only is talking about ending death, I mean, how religious is that? It's a kind of eternal life, essentially.
But also Blake Lemoine, who I brought to MIT to talk about his conversations with his co-worker, what he believes is the sentient AI that is now Google Gemini. He told me, first, that Kurzweil really was trying to recreate his dead father through
what has now become the dominant AI system of one of our globally dominant companies, right? I think this is the perfect segue to our next section, because I think the hardest thing to do right now is to walk the fine line between being a zealot of technology who over-believes in it, and being overly dismissive and skeptical and not seeing the power of what's coming. This technology has already released, and is about to release, a lot of power across society.
And coming all the way back to the start of our conversation, it's very hard to talk about that in terms that are other than just religious. This idea of immortality, of curing all diseases, of a lot of this will happen. I'm not saying it'll cure all diseases, but a lot of power is about to be unleashed across society. And part of the question is, how do we even think about that?
And how do we think about that in non-religious ways? And I wonder if your expertise in religion has anything to say about that. So one of the ways that we can really learn from religion is by learning about this profound tradition of religious skepticism, both from atheists and humanists like me. And there's this huge tradition of
skepticism going back thousands of years, not just in the European Enlightenment or the Greek philosophers, but for example, in ancient Jain, an early pre-Hindu philosophy in what you call the East, right, in the subcontinent.
So there's that tradition, sometimes even among people who are deep believers in the God in question. So in this case, if you want to extend my comparison, my metaphor or whatever, you'd say people who really believe in technology can still be profoundly skeptical about individual claims,
or about going too far, the tendency to go too far. Like, I would really respect and honor people who would say, there are some things that AI will be able to do well, but maybe let's hold off on the messianic, savior claims. And so that's one thing that we can learn from religion.
Yeah, so I think you've referenced your own struggle to articulate what a fulfilling life looks like within the religion of technology and within a world of technology. Many people ask: if AI starts to replace human labor, where does meaning go?
And a lot of our listeners are parents, and they're worried about these big questions of morality and purpose. And given that religion, in some sense, is an institution through which we find those kinds of
answers. I'm curious what lessons you'd have for them. So, a couple things. I want to talk about what I'd call the drama of the gifted technologist and how to address that. I've really been moved in my work as a chaplain and then sort of observing the world of tech as well.
by how many people I've come across, often young people, students, like the ones I work with most directly, but people of different ages and backgrounds as well,
where there's this feeling of tremendous success, of having been rewarded greatly for being deeply innovative, but where either, A, they themselves are struggling emotionally, they're not happy, or, B, their creations aren't even making other people happy,
or both, right? Like in some cases, it's both: the individual person who's having all this success is not able to feel happy, and neither are those of us using their amazing products. And so I write about this idea, the drama of the gifted technologist. The Drama of the Gifted Child is a little book by a great 20th-century psychologist named Alice Miller, who essentially says that
a lot of our struggles have to do with this idea that we're taught that our whole worth as a human being is in what we do and in how excellent, outstanding, and exceptional we prove ourselves to be, and that just being a human being, just being normal or average, is almost a curse upon us. It makes us less than nothing; it makes us feel worthless. And this is so prominent in the tech world. I just can't tell you how often I see it.
Well, one of my favorite parts of Alice Miller's book was where she talked about grandiosity and depression as flip sides of each other: the idea that our depression about not being able to be more, about just living with the normal parts of life, leads us to be grandiose in our narratives of ourselves. And I hear you saying that tech's grandiosity in its narrative about what it will become might be the flip side of feeling not quite enough.
Yeah, I think that that's right. There's this incredible grandiosity in a lot of Silicon Valley tech, this idea that it's not enough just to be able to produce a chatbot that one can interact with and that can pass the Turing test, which is, honestly, it is pretty cool. I grant you that.
But it's this idea that that then has to be presented as the solution capital S to all of our problems, right? And that it's going to transform everything. Like, I don't think that we really have sufficient evidence for that. I think that when we talk about that, a lot of the conversation about that level of transformation falls to me within the category of myth.
Or, you know, maybe better put as religion, you know, because if I said to you that there was a new religion that was successfully recruiting billions of people to spend countless hours devoting themselves to it for the purpose of transforming the world and that people were really motivated to
get behind a very specific vision around that, I think you could reasonably worry, depending on what the vision was, because you know that religions actually do that all the time. It actually really does help to view this as a religion, rather than just as a culture or a myth or certainly an industry, because I think we have real tools for being skeptical of religion, even those of us who would define ourselves as religious.
Some of the claims about what AI will do are obviously really grand, right? But it's hard to judge something as distorted just because it's grand, right? You might say, oh, it's really grandiose. People are saying it's going to change the world. And so it's easy to try to discount that as distorted, even religious thinking. I don't think that's what you're doing.
Because, you know, is the fear really that people are getting just carried away with what it's going to be? Or is the fear also that they may be right and it might deliver that kind of power, but in a way that we're not prepared to deal with?
I tend to worry that the real problem is that we're so fixated on the grand narrative about the long-term future that we are not paying as much attention as we should be to the problems of the present.
The fact that this stuff is really lousy for the environment is one place to start, right? You're talking about data centers that are drinking, say, 20% of the water in a little part of Mexico, near Mexico City, where the farmers are
running out of water for their crops and their animals. And so I think it's both that the AI can actually be causing the problem, and that it's distracting us, with this future magic hope, from doing the things right now that would improve life right now. And I just honestly think the urgency is not to do the tech right now. The urgency is to do the work on us.
And just to add one little thing here, in order for things to go well, we need to be able to coordinate. There's the joke that we're all arguing about whether AI is conscious, when it's not even clear that humanity is, which is to say that we are getting results that none of us want. No one wants climate change, and yet we seem to be as a species incapable of exercising choice against incentives.
And the way we have made hard collective choices in the past has come down not so much to what we must do as to who we must be. And the who we must be is informed by the myths and the stories we all hold, which help us do the right thing when it is the harder thing. And that's often come down to religion. And so there's an alternate way: instead of saying just that
tech, religion, bad, but rather there is a new form of intersubjective belief of the who we must be to get the futures that we want. What I'm really hoping people will take away is this idea that
religion must be reformed, not that it must be erased. We have tremendous incentive to want to focus on big technological solutions when, in fact, the real solutions are in improving our human relationships, to build up trust, to learn how to treat one another better, to learn how to organize ourselves into something that can treat each person with dignity and with compassion.
And I think that brings us full circle, because religion was one of the original characterological educations: not what to think or what to know, but who to be and how we can be better together as we develop this more and more powerful technology. That is the guiding question that we all need to keep in the forefront of our minds. So thank you. Yeah. Thank you so much. Thank you, everybody. This is a really powerful conversation.
For all the listeners, Greg Epstein's book is Tech Agnostic: How Technology Became the World's Most Powerful Religion, and Why It So Desperately Needs a Reformation. You can buy it anywhere books are sold. So again, Greg, thank you so much.
Just to name a thing that I found a little challenging about this conversation: it felt a little too dismissive of the raw capabilities of what the tech does. Yeah, I agree. And so it is the case that the world will be transformed, in the same way that social media has shifted
what kinds of jobs people have; influencers weren't a thing before. There's a true shift in the world, and AI is going to be bigger than those shifts. And we have to reckon with that appropriately. I thought your question of where it goes from being grand, in the sense that the scope of the technology is grand, to being grandiose is the right question. That's the right distinction to hold.
Yeah, and I really struggle with this. There are so many competing claims from people right now who say, well, you're just being captured by the negatives, you're just this sort of negative religious skeptic. And the truth is, it's really hard to contend with what it is actually going to do,
and neither be swept away in the grandeur or the grandiosity of it, nor be swept away in some sort of status quo denialism, saying, eh, it's all just fluff and tomorrow will be the same as today. There are Machiavellian technologists who are making up stories just because it sells in the public imagination. And then there are people who are genuinely trying to use technology as a tool to improve the lot of those around them. It feels like, just like religion, it has so much complexity to it. And you can't label it as just bad or just good.
Yeah, that's exactly right. And I love the point that you made: one of the things a religion does is give people hope, something to believe in, something that is bigger and better than themselves. And as religion has been displaced by technology, as the world has secularized, human beings still need that thing. So something's going to fill it, and what fills it is, of course, technology. And then you end up with this other very interesting question, which is: okay, if we can't place our hope blindly in tech, then what?
Right. And I think it's that sitting in the unknown and that discomfort of the, well, then where do we place hope and goodness? That is the challenging problem to solve. Your undivided attention is produced by the Center for Humane Technology, a nonprofit working to catalyze a humane future.
Our senior producer is Julia Scott. Josh Lash is our researcher and producer, and our executive producer is Sasha Fegan. Mixing on this episode by Jeff Sudakin, original music by Ryan and Hays Holladay, and a special thanks to the whole Center for Humane Technology team for making this podcast possible.
You can find show notes, transcripts, and much more at humanetech.com. And if you liked the podcast, we'd be grateful if you could rate it on Apple Podcasts, because it helps other people find the show. And if you made it all the way here, thank you for giving us your undivided attention.