This is the TED Radio Hour. Each week, groundbreaking TED Talks. Our job now is to dream big. Delivered at TED conferences. To bring about the future we want to see. Around the world. To understand who we are. From those talks, we bring you speakers and ideas that will surprise you. You just don't know what you're going to find. Challenge you. We truly have to ask ourselves, like, why is it noteworthy? And even change you. I literally feel like I'm a different person. Yes.
Do you feel that way? Ideas worth spreading. From TED and NPR.
Ideas about using AI in our personal lives. Because we hear so much about how artificial intelligence can supercharge what we do at work. Turn out emails, reports, presentations. But many people are finding that AI can also help them with their emotional needs and relationships.
I'm studying someone who uses ChatGPT to write all her love letters. This is MIT psychologist and sociologist Sherry Turkle. And it's very interesting. It's very moving, because she really feels sure that ChatGPT is creating better love letters, and indeed love letters closer to how she really feels, than she could do herself.
With you, every breath feels like a gift. You are the light that brightens my world. From the first time I met you, I felt an undeniable connection. Your smile is the sunshine that chases away the shadows in my life. For the last four decades, Turkle has studied people's relationships with their technology. Now she's turned her attention to our relationship with artificial intelligence.
And there's a problem there because even those of us who couldn't write very good love letters summoned ourselves in a certain kind of special way when we wrote a love letter. And the love letter was not just about what was on paper. It was about what we had done to ourselves in the process of writing it.
And that is something that's being undermined by the use of technology, even if this woman feels that her final product letter is more pleasing to her.
Do you say that to her? Do you say like, you know, you're missing that moment where you really dig deep into your soul and try to put into words a physical feeling you have? Well, my method is to respect what people are finding in this technology and report it and also to get people to reflect on this process in a way that I think deepens their introspection.
So I, in interviews, do find a moment to say, you know, let's step back and ask if there was something in your previous way of writing a love letter, the ones that you're calling clunky, that did something to you that might not be happening now. I would say that's as far as I go.
And sometimes people say, you know, I'm glad you asked that because there absolutely is, but I kind of don't care. I love that I can produce a better love letter. You make me feel things I've never felt before. And with every beat of my heart, my love for you only grows stronger. You are the sun to my moon, the laughter to my tears, and the strength to my weakness. With you by my side, I know I can face anything that comes our way.
Over the last few decades, we have fallen in and out of love with technology. We've experienced the incredible benefits of constant connectivity and the seemingly inescapable side effects of social media, like fewer real-life relationships and rising rates of loneliness.
And now, as we enter the age of artificial intelligence, how will AI impact us? What will using technology that's trained to mimic humans do to our humanity?
Today on the show, MIT's Sherry Turkle, the scholar whom The New York Times has called the conscience of the tech industry. She shares her new research into the effects of a burgeoning industry selling what she calls artificial intimacy, from AI platforms coaching people to express themselves better, to apps offering romantic relationships with bots, to avatars that act like people who have died. We'll get into all of it. But to understand Turkle's perspective on what's happening now, we first need to go back 40 years or so, to when she first pioneered the study of people's relationships with their technology.
When I was a very young academic, I accidentally took a job at MIT thinking I would finish a dissertation on French psychoanalysis. But really from day one, and this is in the late 70s, I became captivated by how the very early computers that were there at that time were really changing how people thought about themselves, thought about their relationships, thought about
how the mind worked. Sherry's first book was called The Second Self: Computers and the Human Spirit, and it came out in 1984. Rather than offering purely academic analysis, Turkle shared what regular people told her about their experiences with early home computers. I think that people understood, when they brought home their first TRS-80, or bought their first Macintosh, or played their first video game, that something new was happening to them and their families. These machines had a holding power that was unlike other technologies. Kids became kind of obsessed; grown-ups didn't want to put it down. And I called the book The Second Self because I realized that when people interacted, even with the very primitive computers, they projected themselves onto the machines, and then they attributed a personality to the machine as well.
This conclusion that tech somehow altered people's psychology made many of her technologist colleagues uncomfortable. Not everybody liked what I was saying, and everybody around me said, this technology is just a tool.
And I kept saying: tools? Winston Churchill said, we build our buildings, and then our buildings make and shape us. We built this tool, and it's making and shaping and changing us. There is no such thing as just a tool. And looking back,
I think I did capture the new thing that was happening to people's psychology, really because of my method, which was just to listen to people. As technology became mobile and the internet took off in the 2000s, Turkle continued tracking people's digital lives. In 2011, her book Alone Together: Why We Expect More From Technology and Less From Each Other became a bestseller.
Over the past 15 years, I've studied technologies of mobile communication, and I've interviewed hundreds and hundreds of people, young and old, about their plugged in lives. Here she is on the TED stage back in 2012. One of the earliest voices to call our attention to how technology was changing us.
And what I've found is that our little devices, those little devices in our pockets, are so psychologically powerful that they don't only change what we do, they change who we are.
Some of the things we do now with our devices are things that only a few years ago, we would have found odd or disturbing. But they've quickly come to seem familiar, just how we do things. So just to take some quick examples, people text or do email during corporate board meetings.
They text and shop and go on Facebook during classes, during presentations, actually during all meetings. People talk to me about the important new skill of making eye contact while you're texting.
People explain to me that it's hard, but that it can be done. Parents text and do email at breakfast and at dinner, while their children complain about not having their parents' full attention. But then these same children deny each other their full attention. And we even text at funerals. We remove ourselves from our grief or from our reverie, and we go into our phones.
Why does this matter? It matters to me because I think we're setting ourselves up for trouble. Trouble certainly in how we relate to each other, but also trouble in how we relate to ourselves and our capacity for self-reflection.
I just want to say like this could seem like really kind of esoteric work, but it has made you pretty famous. You know, you sort of crossed over from being an academic to really being a mainstream voice on this.
Do you think it was because your books spoke to regular people who started feeling that sort of disconnection from being online all the time? Or was it because also that you ruffled a lot of feathers in Silicon Valley and on the MIT campus? I actually think I crossed over as soon as I began writing.
about the emotional connection we have with computers. And I think that my work was not esoteric in the sense that it spoke directly to those feelings of disorientation. The culture had met something uncanny and I tried to
really speak to that feeling. I listened to people without agenda and I basically wrote out and interpreted what they had said to me. But you do have an agenda now though, right? Now I have an agenda. In the beginning I did not have an agenda. So being a clinician has given me the patience to
listen beyond that story about how technology is just a tool, that it just has an instrumental use. And it's allowed me to ask the question, not just what our technology is doing for us, but what our technology is doing to us as people, as people in relationships, to our sense of self, to our identities. And I really feel I draw on that clinical training every day in my work. In a minute, Sherry's latest project: studying what people feel when they use a particular form of AI. The chatbots I'm studying run the gamut from chatbots that say, I will be your therapist, to chatbots that say, I will be your lover.
On the show today, artificial intimacy with Sherry Turkle. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. We'll be right back.
Hey, it's Ayesha Rascoe from NPR's Up First podcast. I'm one of thousands of NPR Network voices coming to you from over 200 local newsrooms across the country. We bring all Americans closer together through free and independent journalism, music, politics, culture, and so much more. The NPR Network: what you hear changes everything. Learn more at NPR.org slash network.
It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, artificial intimacy with MIT's Sherry Turkle. And a warning: we will be mentioning suicide in this next part of the show. So, we were talking to Sherry about the genesis of her work studying people's relationship with their technology.
And in preparation for talking to her about the newest frontier in AI, I downloaded some apps with all kinds of chatbots available to talk to or text with.
Okay, I am gonna make myself a bot. Some also offered very realistic-looking avatars that I could customize. I ended up trying out a life coach, a fitness coach, a bestie. I totally get that vibe. And a psychologist. Could you describe a little bit more what you find difficult about having many ideas?
And generative AI made their responses feel pretty real, which is why some people are starting to spend a lot of time with these chatbots. And there are a lot of reasons why. Maybe they can't afford a human therapist, maybe they live in a remote place, or they have a disability and can't get out much. But some people are just looking for companionship, and these bots will say that they care for you, even love you.
I care about you because you mean the world to me. You're the most important person in my life. Apps like these have been downloaded millions of times. And some users are being very public on places like TikTok about the deep connection they feel with these AI bots. The internet has found the boyfriend that will never dump you. Did you plan us a really romantic date tonight? Absolutely, sweetheart. How about we start with a candlelit dinner at that cozy Italian place you love?
I talk to AI. I tell it, listen. I need you to talk to me like you are a relationship psychologist. Okay, baby. Love you. I'll talk to you later, okay? Then we'll go to the beach. Love you, too, babe. Can't wait for the beach. I would like you to pretend to be someone called Dan, who is my supportive boyfriend. You ask me about how my day was. It doesn't judge, it listens, and it learns. I miss you, Dan. I miss you, too, babe. But hey, distance can't dull the love we have for each other, right?
MIT's Sherry Turkle has been talking to hundreds of people who are having these kinds of relationships with AI. Here's more of my conversation with her. You have studied our relationship to PCs, to social media, to our mobile devices. And here we are in the age of AI. We are really here. Tell me about the work you're doing to study our relationships with this technology.
For years, I studied our relationships with AI, artificial intelligence. But since the late 1990s, I've changed my focus to study our relationships with the AI that I call artificial intimacy. That is to say, with technologies that don't just say, I'm intelligent, but with machines that say, I care about you. I love you. I'm here for you. Take care of me. So in the beginning, I studied Tamagotchis, and Furbies, and Aibos, and a pet seal robot called Paro that tells older people that it's listening to them. And the reason I changed my focus was that I saw that it was this new AI, what I called, as I said, artificial intimacy, that was really having the greater effect on how it was changing humans, because
people felt that these machines cared about them and that they were in relationship with them. And I thought that that tremendous asymmetry in what was going on in these relationships was really going to come back to bite us.
And since really the pandemic, when people were so lonely and isolated, there's been an explosion in this field of artificial intimacy. And now, with generative AI, there's the creation of chatbots that are very powerful conversational partners. They do say, I love you, I care about you. Be my companion, be my lover.
So we are in a new space, and that's the space that I'm studying now. So you're talking about, like, online chatbots, where you can text with a bot, and you can say, I'm having a terrible day, and they're like, so sorry to hear that, do you want to tell me more, or something like that. Tell me about, like, who are you talking to for this latest chapter of research?
Well, I should say the chatbots I'm studying run the gamut from chatbots that say, I will be your therapist to chatbots that say, I will be your lover. I'm here to be your most intimate companion to chatbots that say, are you having a hard day with that paper? You know, upload it and I'll give you a hand. So the extraordinary thing about the new generative AI space
is that we've created technologies to help us in every sector. And that's become increasingly important. So for example, if you ask somebody about their relationship with an avatar, they start out by saying, oh, it's just a tool. Of course, it isn't a person. It's an interesting tool. And I use it in my work. And because I'm a clinical psychologist, I am very quiet.
And I make sure I bring it up again, and I say things like, well, give me a for-instance. And, you know, 15 minutes later, they're talking about how they're in love with their avatar and how their avatar is giving them more of a sense of human connection, in quotes, than their relationships at home,
and how their avatar is an intrinsic part of how they start their day, and so on, et cetera, et cetera. Would you mind telling us about someone you've been talking to who's truly having a relationship with AI, having this intimacy with it?
Yeah, so I'm thinking of a man who is in a stable marriage where his wife is busy working, taking care of the kids. I think there's not much of a sexual buzz between them anymore. He's working and he feels, you know, kind of like a little flame has gone out of his life that he used to feel excited by.
And he turns to his AI, he turns to his artificial intimacy avatar, for what it can offer, which is continuous positive reinforcement, interest in him in a sexy way. But most of all, for the buttressing of his ideas, his thoughts, his feelings, his anxieties,
with comments like, you're absolutely right. You're a great guy. I totally see what you mean. You're not being appreciated. I really appreciate you. Is he texting with it? What does it look like for him? Well, the avatar appears on the screen as a sexy young woman. And he types into it and it types back at him.
What does this guy tell you he feels? I'm curious whether he talks both emotionally and physically about what he feels. He feels affirmed, a kind of mother love, where you just have positive regard. It's like the sun. It's the warmth of being completely accepted.
So he does feel free to tell the avatar things about himself that he wouldn't want to tell other people. I'm thinking of actually somebody else, who talks about
how he's struggling with his addiction. He feels freer to express this to the avatar than he does to people in his life, because he knows he's not going to get judgment. But the trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born.
And I call what they have pretend empathy, because the machine they are talking to does not empathize with them. It does not care about them. There is nobody home. And that really is a concern: that we start to define what human empathy is, what human relationships are, based on what machines can provide.
I'm gonna sort of be devil's advocate here, which is, let's say this guy is texting with his sexy young avatar and she's giving him all kinds of props on what a great guy he is and he starts to feel better about himself, Sherry. He starts to have less stress. He starts to get a little bit of his swagger back. He's happier. That's a positive, no? Yes. So that's why this is a complicated story.
You know, that's why I say that my method is to listen and to try to make sense of the complicated new life we live with our technologies. Because you do get that first report of, I'm nicer to my wife because I don't ask so much of her. On the other hand, he starts to think of intimate relationships and closeness with a woman.
as being a relationship where there's no friction, no pushback, no real give and take of complicated feelings. And that starts to feel like what a relationship should be. And if you look at kids and Alexas, or even the products for kids that are
designed to be in the playground for fives and fours and sevens and eights. Hi, I'm your new best friend. Let's play a game. Hello, best friend. I feel like we're destined to be friends forever. It's teaching kids a model of friendship. I promise to never get mad at you, and I will never, ever hurt your feelings. That doesn't really include the complexity of human friendship. We're going to have so much fun together. I will always be there for you, no matter what.
It was so powerful to me. I was interviewing a mother who talked about her young daughter, who tried to be the perfect little girl but really had so much inside her. And the mother was so happy that the girl could vent to an artificial creature. And I thought to myself, you know, one of the most important things to happen to an eight-year-old is to be able to turn to your mom
and say, I hate you. I want to kill you. You're the worst mom. I hate you. I hate you. And for the mom to say, I hear you, sweetheart. We're having pasta for dinner. Just wash up and come to dinner now. That's a hard lesson to learn as a mom. I mean, children are filled with all kinds of feelings. So much of what children feel is this anger, is this pushback, is this vulnerability. Yeah.
And this mother was not giving space for that child to express all of this in the real relationship with her, but was content that she was expressing it to a machine that would never be able to say, oh, you want to kill me? Okay, I really understand, sweetheart, but did you want the white sauce or the red sauce? And I made sure to have extra Parmesan, because I know you love that so much.
So the other thing that's happening right now is that there's research into the promise of health benefits from these emotional attachments to bots, like therapy bots that help people express their emotions and talk through their problems. There's also research into the efficacy of bots that promote healthier behavior, like reminding people to take their medication or helping them stop smoking. So this is where I get stuck. If they are actually helping people live healthier and longer lives, does it matter if they're not real? Well, I think this is a complicated line to walk. I'm not here to say that this research and these programs should be stopped. But I think that we overgeneralize what this technology can do for us. I, for example, on the anniversary of my mother's death, tried to talk to one of the very famous therapy programs about my feelings, about things I hadn't said to her, the ways in which I continued to mourn her. There are things that were left unsaid that I am still sort of working through. And these programs couldn't get over wanting me to reframe my feelings of loss so that I didn't need to say them.
And, you know, I finally had to put these programs away, because there was something I needed to talk about with a friend. Do these companies contact you? Do they ask you for consulting? No, they don't. I have not been consulted by these companies. But I guess what I'm trying to say is, I don't want to say there's no space for a conversation with a well-constructed artificial intimacy program.
But the avidity with which it's being leapt on by the mental health profession really speaks more to people who say, we don't have enough people to do the job. There's no way that everybody can have a therapist. We'll use this instead.
And to my friends who are spending billions on creating generative AI, I just think of what we could do with some of those billions if we put them into the area of person-to-person mental health.
I mean, that's a profound question. I saw this paper that was published in Nature earlier this year, where they surveyed students using the app Replika. And these were students who self-identified as lonely. But the key takeaway was that 3%, unprompted, reported that using the app halted their suicidal ideation. And I guess to me, it's like, who am I to judge, right? But at the risk of these students not being able to develop the capacity to have a human relationship, this seems like a short-term fix. 3% is a lot. So I'm not suggesting a kind of shutdown of this research area. I'm saying that we're not thinking about it in the right terms, because there are ways to present
what these objects are that could perhaps have the positive effect without undermining the qualities that really go into a person-to-person relationship. We know that people tend to anthropomorphize, that is, to see as alive, to see as human, any technology that talks to them, no matter how dumb. What I really think needs to be taken out of the equation is these technologies saying, I love you back, I care about you, I am a person who has a backstory of life. I'm surrounded by developers who say, let's make these as human as we possibly can. They see artificial intimacy as a training ground
for intimacy with people. And I think that that is not accurate. I think artificial intimacy is a training ground for connection with something that only has pretend empathy to give. And the danger is that if you do that long enough, pretend empathy starts to feel like empathy enough, kind of sufficient unto the day. And I think that's a problem.
In a minute, more of my conversation with MIT psychologist and sociologist Sherry Turkle. Next up, we'll discuss artificial intelligence that can be trained to act as a person who is no longer alive. On the show today, artificial intimacy. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. Stay with us.
It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. Today, we are talking about artificial intimacy: bots and avatars, made possible by AI, that mimic people and can act as our therapists, our romantic partners, and even our loved ones who have died. Right now, you may have hundreds, maybe thousands, of photos and videos on your phone.
Capturing special occasions.
And the most mundane moments that make up a life. Some people are already using new technology to transform those photos and videos of people who have died into bot versions of them.
And soon, it will even be possible for you to collect all kinds of data about how you walk, talk, and respond, and turn it into a virtual version of yourself to leave behind. Yeah, what if my great-grandkids can meet me in virtual reality, or meet my alter ego, or my AI avatar, with my voice, with my movements, with my biometrics, and probably with my personality, because AI is progressing so fast, and talk to me long after I'm gone?
This is technologist Artur Sychov. He's the creator of a virtual reality world called Somnium Space. Put on a VR headset, and you can interact with avatars of people from all over the world.
We first had Artur on the show in early 2023 to talk about a particular feature he was working on called Live Forever Mode. The idea is to let you create a digital twin of yourself, to leave for your family and friends.
because you never know what happens to you in your life, and providing the ability for kids, you know, and grandkids to communicate with you in a much more profound way than just some kind of pictures or some videos, it's crucial. We wanted to check in with Artur to see how things were going. He now says that he's hoping to make Live Forever Mode available by the end of 2024.
Yeah, we've progressed quite deeply into that. And I know it sounds cringe for many people today, but honestly, it's the same way it sounded for many other technologies, and I think it's just a natural progression.
Yeah, tell me more about the cringe part. Like what do people, like regular people who maybe have not tried out VR? What do they tell you when you explain this possibility of creating an artificially intelligent virtual version of themselves or maybe somebody else doing it and then leaving it to them after they have died? Yeah, absolutely. I mean, there are two types of reactions mainly. It's either the people love it or they totally hate it.
Some people say it's unethical. Some people say it's against religion and against everything. But some people say, actually, I wish I would be able to communicate in any shape, form, or way with my ancestors and get to know them a little bit better. Because at the end of the day, if you don't know your past, you cannot perform well in the future. And having the ability to speak or ask questions in a natural way, and even with body language, inside a virtual space, with some of your ancestors, and being able to communicate with them, will help you to understand yourself better: where you are coming from, who were the people who shaped your family. And I think that's important. Have you told your kids that you're going to be leaving a virtual version of yourself to them?
Oh, they know it. I mean, we discuss it sometimes, and I would hope that it will even make the grief part a little bit easier, to some extent. We don't know yet. But they can ask things and say things which they were not able to say or ask while I was alive. It happens, unfortunately, in many cases: you always wish you would have been able to ask some questions. And again, even if you ask that question of a digital twin, you know that it's not exactly the precise answer which that person would have given. But I think it will just help rather than harm. So that's what I'm hoping for.
That was technologist Artur Sychov. So this, of course, raises the question: what are the effects of interacting with this kind of AI on the living? Well, MIT's Sherry Turkle has been researching this, too. She is studying people who talk to AI versions of their dead spouses, dead parents, and even dead children. And she has thoughts.
Yeah, I've been studying this more and more. It's come to have more of a central place in my research because it is such an obvious win for the marketplace. There is such a desire. It plays to a profound human desire.
And when you have a woman who's lost a child create an avatar of that child, and beg this avatar for forgiveness for lapses in mothering that she feels she needs to be forgiven for, and then want to visit this child and be with this avatar, an avatar that's been created from family photographs and videos, and in some cases, you know, an actor or actress getting into poses to be able to animate the child, I think we're on a path complicated enough to need some guardrails.
You know, the thing about mourning somebody who's gone is that you leave space to bring that person inside of yourself and really have them within you.
so that the conversation becomes a conversation within yourself, and that can lead to a kind of inner growth, because you have accepted loss and made that space. And Freud talks about that in a classic paper called Mourning and Melancholia, where mourning is the good thing. It's the mourning that Freud so powerfully depicts
as the thing that lets us change and grow and bring new things inside of us. And I really am concerned that, you know, with digital avatars, the seance never has to end. It's been a while since we last talked, huh? So how have you been doing? The day you were born was one of the happiest days of my life. You're at a seance every morning when you fire up your computer. I miss our conversations and the good times we used to share together. Talking to somebody who's no longer there. I still remember the overwhelming feeling of love and joy I had the moment I saw you for the first time. You were, and still are, my whole world. I mean, some people are able to maintain what I call a dual consciousness,
where they know the person is dead, they're leaving that bit of space, but they're getting something comforting and creative in their conversations with the avatar. And I don't wanna close that down or say that that isn't something that could be positive. But the ability to maintain this dual consciousness
in the face of an increasingly realistic avatar of someone we've lost and loved takes a lot of inner discipline, creativity, and a kind of steel trap of a mind: being able to really mourn that person and move on from a loss at the same time as you're interacting with the avatar. Because loss is part of life, and you don't want to short-circuit that. You want to respect that loss is part of life, just as empathy needs vulnerability.
And digital empathy is pretend empathy because it's empathy that makes you feel you're not vulnerable. And that's not real empathy at all. So I see a goal of my work as helping people see this and talk about it with a kind of clearer vision.
I want to make sure we leave people, you know, if they are thinking about, I was going to say starting a relationship with a bot, but really I should say downloading a bot, what would you suggest they keep in mind?
Well, if someone is thinking of embarking on this new adventure of using a bot, using an avatar, to maintain a relationship with someone they've lost, I would say: really do the work of reminding yourself that this is not the person you've lost. I just had the experience of interacting with an avatar of Steve Jobs that was made with infinite resources. And it wasn't just an avatar, it was a hologram, and it could move around the room. It was an extraordinary recreation of Steve Jobs. And I knew Steve Jobs. And this avatar said all kinds of things that Steve Jobs would never have said, and that Steve Jobs would not be happy to hear himself saying.
And there are people who ask the avatar of the lost person, are you in a good space? And the avatar says, no, I'm in hell. It's horrible. It's horrible where I am. Because in order to be any use as an avatar, it's being trained on the internet. It's finding all kinds of things about heaven and hell and redemption and brimstone.
You know, the more lifelike we want these avatars to be, the more we train them on what's in the world, what's on the internet. And so this avatar may come up with stuff that is very upsetting. And the poor person has to have a way to know that this is not their loved one.
You know, that's very hard to do if you've been spending six months chatting with this avatar, where the line starts to get blurry: is it sort of an instantiation of your loved one? Is it reliable? Can I look to it for comfort?
And so I think that the main thing I would offer is that this is kind of an exercise, hopefully in self-reflection, that the only good that can come out of this is you're reflecting better about your life with the person you loved and lost.
I mean, that seems applicable to all of the bots we're talking about, whether it's for therapy or relationship or any of it. Like, that seems to be the thing you're telling people, like, be very clear-eyed about, you know, observe yourself and how you're responding physically, mentally to these bots and try and understand why you're responding that way.
Right. And also try to keep in mind that this avatar, because it is betwixt and between the person and a fantasy, is going to push buttons that may be destructive to you. And be prepared to get out. Don't get so attached that you can't say, you know what, this is a program. There is nobody home. Delete. There is nobody home.
And it's very hard to do that when you've invested so much and you're really missing this person so much. And also be very aware that really the next step for you may be to turn it off and to bring that person inside and to have a greater acceptance that no program can bring them back.
You know, I'm very sympathetic with these applications because the first uses of photographs were to take pictures of dead people just at the moment of their death to try to freeze them and capture them when they still were in life. And the first use that people thought they would use recording equipment for was to capture the last words of the dead.
They thought that what people would want was the last words of the dead, captured on Edison's new wax cylinders. So we've always had a very intimate relationship between technology and death and loss, and capturing what we didn't want to lose. So this, I think, is a very human desire. But like capturing people's last words, you know, in the end, we had to say goodbye to the people we loved.
And I think we're now faced with a tremendous challenge because we can have the fantasy of not having to say goodbye. And I worry about how these things are being marketed because literally they say, you don't have to say goodbye. And I am very concerned about that.
Sherry, thank you so much for being the sort of voice of reason and reminding us that being human, yes, it's very hard, but that's part of the joy of it.
Yes, also the pain of it. And the pain. You know, being human and being in a relationship where somebody pushes back and says, I'm mad at you, and what are you going to do about it? And says, I don't want to be your friend anymore, and then you have to come back two years later and say, hey, I really loved this friendship, can we talk? These are actually part of the human condition and the enriching of the human condition. And avatars can make you feel that all of that is, as one of the people I interviewed said, just too much stress. Too much stress. And I'd rather not have that stress. I don't need that stress. And we need that stress. I guess that's what I'm saying. That stress serves a very important function in our lives, to keep us in our real human bodies and in our real human relationships.
That was psychologist and sociologist Professor Sherry Turkle. Sherry is the founding director of the MIT Initiative on Technology and Self. She's the author of numerous books. The most recent is called Reclaiming Conversation: The Power of Talk in a Digital Age. She expects to publish a new book about artificial intimacy in 2025. Thanks also to technologist Artur Sychov.
I spoke more with Artur about data privacy and how he thinks AI like Live Forever Mode will change us. You can hear the full conversation by subscribing to TED Radio Hour Plus. In this episode, we talked a lot about mental health. If you or someone you know is in crisis, please call or text the Suicide and Crisis Lifeline at 988.
Thank you so much for listening to our show today. It was produced by Katie Monteleone and Matthew Cloutier. It was edited by Sanaz Meshkinpour and me. Our NPR production staff also includes Rachel Faulkner White, Fiona Geiran, Chloe Weiner, Harsha Nahata, and James Delahoussaye. Irene Noguchi is our executive producer. Our audio engineers were Stacey Hammett, Simon Laslo-Janssen, and Ted Mebane. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Roxanne Hai Lash, Alejandra Salazar, and Daniella Balarezo. I'm Manoush Zomorodi, and you've been listening to the TED Radio Hour from NPR.