If you're enjoying this episode and you want more TED Radio Hour in your life, try out NPR Plus. With NPR Plus, you get access to sponsor-free listening for the podcast, and you get special bonus episodes. Now you can also unlock perks from over 20 other NPR podcasts, too. Give a little, get a lot. Visit plus.npr.org.
This is the TED Radio Hour. Each week, groundbreaking TED Talks. Our job now is to dream big. Delivered at TED conferences. To bring about the future we want to see. Around the world. To understand who we are. From those talks, we bring you speakers and ideas that will surprise you.
You just don't know what you're gonna find. Challenge you. We truly have to ask ourselves, like, why is it noteworthy? And even change you. I literally feel like I'm a different person. Yes. Do you feel that way? Ideas worth spreading. From TED and NPR. I'm Manoush Zomorodi.
In 2014, Catie Cuan was a professional dancer in New York City, living her dream. Taking the train to Lincoln Center and dancing at the Metropolitan Opera Ballet and traveling all over the United States to be in various shows. I did The King and I at the Lyric Opera of Chicago. I did Cinderella at the Gateway Theater. I had my own dance company, actually, and we did somewhere between six and 10 shows a year.
But then her father got sick. Yeah, it was very scary for our family. He had a stroke. And my dad, you know, English is his third language. He was in his mid-60s at the time. Katie found herself there with her dad in his hospital room, surrounded by medical equipment.
So he had a heart monitor, you know, the thing that blips and bloops on the screen and he also had quite a few issues with his lungs and so there was something that was helping to track his breathing.
You know, seeing my dad so small and surrounded by all of these machines, I thought these things are meant to help assist him and empower him, but he feels very alienated and afraid of them. And that struck me as such the wrong relationship between humans and our technologies. She wondered, was there a way these machines could keep someone alive without being so terrifying? Maybe they could even come across as nurturing.
And it started to open the door to so many questions about how we could live with our technologies in a way that we weren't living with them.
Catie's dad made a full recovery. But these questions continued to nag at her. She'd always been good at math and science, so she decided to go back to school to try and find some answers. So even though there were lots of valid assertions that grad school was going to be very hard for me, I kind of did it anyway. And I did my master's, my PhD, and then my postdoc all at Stanford in mechanical engineering and computer science.
At Stanford, people were building technology to change the way we live with machines, well beyond hospitals, designing robots to do everything from cleaning our homes to offering companionship. So just imagine, what's it going to be like when not only you have autonomous vehicles on the road, but then you have
autonomous robots who are showing up to deliver your burrito, and you have an autonomous robot that might fly in to drop off a package, and then you go to your doctor's office and there's an autonomous robot that takes your temperature. Like, this is a completely different way of living and working. I think we're about to undergo one of the most consequential shifts to our built environment in a long time.
So the question for me is how can we then build robots that make people feel empowered, inspired, legible, and clear that they feel safe around and know how to control? Because it's not a question for me of whether or not the robots are coming. It's simply a question of how quickly and what are those robots going to look like when they do show up?
Forget sitting down at a laptop or tapping on a screen. Technology is being woven into our physical world in all kinds of new ways, from robot helpers to smart body parts that turn us into cyborgs. But what are the challenges with creating devices that feel less robotic and dystopian and more organic and useful?
Today on the show, Augmenting Humans. Ideas about designing tech that enhances us physically without diminishing or even hurting humanity. Where do we draw the line between improving a human life and augmenting it beyond recognition? We are, arguably, the first generation at mass scale to be interacting with robots.
That is a huge, impactful difference. So back to Catie Cuan. She ended up merging her love of dance with her interest in technology and is now a robot choreographer. Because even though countless engineers are building machines to act intelligently, she says we also need to consider how they move.
We know that movement is incredibly impactful to us. We've created all of these adaptations to very quickly experience and observe emotion, and then to categorize: is this safe? Is this unsafe? Is this welcoming? You know, what does it mean? And this is why, when people talk about nonverbal communication, you can express so much through simply the way that your body moves. Okay, why is that relevant for robots?
Because robots will often perform motions that are, quote, utilitarian, right, that are picking up a cup, moving it to a different part of a table, screwing a bolt into a piece of equipment. But the way that the robot performs that motion is deeply impactful if it's done near humans.
You ended up working at a place called Everyday Robots. At the time, this was Google's robot AI moonshot lab, as they called it.
And the idea there was to design robots that could help people in their everyday lives. So how did you bring your perspective as a choreographer and a dancer to making that a reality? It was such an ambitious moonshot because we really did want to bring robots into people's everyday lives at the same scale that people are used to with all different kinds of technologies, whether that's a smartphone or a car.
And so it was not only about can we get robots to do things that are useful like sort trash or wipe tables, but also can we build robots that are welcomed in these environments so that if you have a robot inside of an office or a shopping mall or many years in the future inside of a school or a nursing home,
What are the levers that we can push and pull to make these robots evoke sensations that are positive instead of negative? If a robot slides politely out of a doorway to let you pass, it might make you feel seen and acknowledged. Here's Catie Cuan on the TED stage. If a robot marches quickly towards you and avoids you at the last second, it might cause revulsion and fear.
Robots are beginning to show up in our everyday environments, from sidewalks to offices, backyards to hospitals, and they will be threatening and confusing to us if we do not carefully examine how they move.
Before AI, programmers needed hours to script a simple dance sequence for a robot to perform, just like they needed hours to script the robot to open a single door. With AI, you can teach the robot to open just a few specific doors, and it will learn to open all of them, even ones it hasn't seen before.
It's also true for dance. You can teach the robot to dance with a specific person, and it will learn how to dance and move with many others in many different environments and circumstances. This is what I did at Everyday Robots at Google. Rather than teach one robot, I used AI to teach 15 robots how to move together as a flock.
We imagined a world where you could walk down a hallway filled with robots and they would part to make space for you, like a flock of doves or a crowd of people on a city street, where a robot could navigate seamlessly and even beautifully through a busy, chaotic Times Square.
Just so people can picture these robots, they kind of look like factory arms attached to platforms that wheel around and sway, and they're strangely kind of cute. But I think the idea of walking through a hallway of robots, I mean, it sounds pretty intimidating.
Are you envisioning that someday it will feel totally normal and comfortable to live with robots because the way they move around us will be so fluid and, I guess, gentle that it won't feel scary?
I hear you on the walking down a hallway filled with robots. It's fascinating, because my mom actually came and saw the flocking project at Google X. She was watching me from the side and thinking, oh, I don't know. These robots are following Catie around and they're moving according to her commands. And she was a little bit skeptical. But when she wandered through this group of robots, the smile that she had on her face, I don't think I've seen my mom grin like that
interacting with any piece of technology in her whole life. And I asked her, you know, what was that for you? Why did you notice yourself smiling and laughing? And she said, well, it felt like I was interacting with a bunch of puppies or sort of this alien species that I wasn't afraid of that kind of opened my eyes and
opened my imagination to a different way of being with robots. And mom was like, it was fun and different and unexpected.
Is the goal, you know, we always hear with technology that the goal is to remove friction, so that you're not even thinking about the technology that you're using. Is adding grace and choreography and smooth movement to these robots part of letting them recede into the background, so that our interaction with them feels sort of seamless?
I would say that removing friction is certainly a way of describing it. I tend to describe it also through the lens of safety. If you feel safer around these tools because you can anticipate and understand how they're going to move, then that's always an A+. I do think it's possible that we see a world in which people like my dad, who is now in his mid-70s,
can live at home for longer with some assistance from technology that allows them to continue to be safe and independent. A robot that's going to be inside of my dad's house might do a lot of simplistic things, like remind him to drink a glass of water, or to notify me if
my dad has fallen, or to remind him that the mail has arrived. These are the kinds of tasks that a robot might do in an environment with my dad, and I want that robot to be safe, legible, clear, and empowering to him. But even more than that, and this is where I like to nudge my fellow roboticists and my fellow engineers, I also like to think about it through the lens of fun.
I'm like, we get to choose the kind of world that we want to live in. Is that future going to include robots that play beautiful music when they wander by us while they're wiping tables and sorting trash?
that makes you feel like your environment is fun and exciting? Or are we going to choose robots that give you a sense of fear, confusion, and fatigue, right? So it's very much for me and not only about the removing friction, making things safe, having more legible communication, but it's like
We can also imbue character, you know, artistry, creativity. And that's, for me, taking a robot from simply being a utilitarian tool into an evocative social agent. That was robot choreographer Catie Cuan. You can see her full talk at TED.com. On the show today, Augmenting Humans. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. We'll be right back.
This message comes from NPR sponsored TED Talks Daily, a podcast from the TED Audio Collective. TED Talks Daily brings you a new talk every day. Learn about the ideas shaping this generation from AI to zoology and everything in between. Find TED Talks Daily wherever you get your podcasts.
This message comes from Grammarly. 88% of the workweek is spent communicating, so it's important your team gets it right. Enter Grammarly. Grammarly's AI helps teams communicate clearly the first time. It goes beyond basic grammar to help teams instantly create and revise drafts in just one click, all without leaving the page they're on. Join the 70,000 teams and 30 million people who use Grammarly to move work forward.
Go to grammarly.com/enterprise to learn more. Grammarly: enterprise-ready AI. It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, Augmenting Humans. So the day of my accident was December 26, 2014.
And I was rock climbing in the Cayman Islands with my daughter and some other friends from Maine who just happened to be there coincidentally. This is Jim Ewing. And it was, I don't know, kind of an ordinary day. It was a climbing area that I hadn't been to before at a cliff called Dixon's Wall.
The climb wasn't particularly difficult, and Jim had been climbing for 30 years in much more dangerous and remote conditions. But that day, they only had time for one more ascent, and Jim says he was distracted. We were kind of in a hurry, and I wasn't really paying great attention to how things all got set up.
On this one last climb, I was actually standing on a ledge, almost completely what we call a no-hands rest, and I don't know, I just lost my focus, and I stepped off the ledge and just started falling. So I fell about 60 feet to the ground. Originally, I thought
I sort of felt like I hadn't hit the ground all that hard. I would just lay there on the ground until I could catch my breath. I made eye contact with another climber and I noticed my wrist was at a funny angle and I said to him, just so you know, I think my wrist is broken. And he said something along the lines of, well,
I hate to tell you this, but your ankle is looking pretty broken, too. Jim was helicoptered to a local hospital, where he found out that, yes, his wrist and ankle were both shattered. He'd also dislocated his shoulder, crushed several vertebrae, fractured his pelvis, separated his ribs from his sternum, just a huge laundry list of injuries that you might expect from falling that kind of distance.
Amazingly, within a year after a lot of surgeries and a lot of rehab, most of those injuries had healed, all except that ankle. The CT showed that the main fracture was still there, but also most of the bone was dead. It's a condition called avascular necrosis. And that occurs when the blood supply to a bone is cut off or damaged, and then the bone slowly dies.
Walking was difficult. The pain was excruciating. I was in a lot of pain, on a lot of painkillers. The doctors just, you know, were like, there's nothing we can really do for you, let's just give you more pain meds. So Jim began to wonder if he needed to do something drastic. Ultimately, my research into ankle injuries and ankle rebuilding
kind of led me down the path of, well, I really probably ought to be looking at amputation. And it just so happens that Hugh Herr and I were roommates back in the mid 80s.
That's correct. Yeah, Jim and I go way back. We were climbing buddies in our early 20s. This is MIT professor Hugh Herr. And he came to me and said, Hugh, I'm in so much pain. My quality of life is so poor. Does it make sense for me to consider amputating that leg?
Hugh is one of the top experts in prosthetics and a double amputee himself, who lost both of his legs in a climbing accident as a teenager.
So I knew, from having known you, that amputation isn't the end of an active life, and that you can still do a lot of things as an amputee. You know, life isn't necessarily over. So we chatted about it. And Jim's timing was quite insightful, because we had just invented a new surgical paradigm in 2014, called the agonist-antagonist myoneural interface.
And when Jim called me, we were actually prepared to do the first human surgery with this new technique. And Jim volunteered to be that first human subject.
What exactly was Jim volunteering for? A new way to do amputation that keeps the brain body connection intact. So a quick anatomy lesson. In a healthy leg, when you flex your ankle, muscles in the front contract and stretch the muscles in the back. Extend your ankle and the reverse happens.
The movement keeps muscles strong and registers in the brain, helping you understand where your limbs are in space. So that's called proprioception. Proprioception. But with a traditional amputation, that connection between the muscles, nerves, and the brain is severed.
The current amputation paradigm hasn't changed fundamentally since the U.S. Civil War, and breaks these dynamic muscle relationships, and in so doing eliminates normal proprioceptive sensations. Hugh Herr explains on the TED stage.
Consequently, a standard artificial limb cannot feed back information into the nervous system about where the prosthesis is in space. The patient, therefore, cannot sense and feel the positions and movements of the prosthetic joint without seeing it with their eyes.
My legs were amputated using the Civil War era methodology. I can feel my feet. I can feel them right now as a phantom awareness. But when I try to move them, I cannot. It feels like they're stuck inside rigid ski boots. The limbs are not directly controlled by my nervous system. I can't think and move them, nor can I feel my limbs. It feels like I'm walking on powerful robots; it feels like I'm being walked. It feels like I'm in the backseat of the car.
The procedure that Hugh and his team developed preserves the feeling of being connected to the amputated limb. Again, it's called the agonist-antagonist myoneural interface, or AMI for short, which brings us back to Jim and the first time the procedure was done, in 2016.
So what we did in Jim's leg is we connected his muscles within his amputated residuum in natural ways, the calf muscle to the muscle in the front of the leg, called the tibialis anterior, so that after the surgery, when Jim thinks, those muscles move dynamically in a similar way to how they moved when he had an intact leg.
So what that does is it tells the brain how the ankle should move. Now, it's not a physical ankle after the amputation, it's a phantom ankle. But when Jim closes his eyes and moves his phantom ankle, he feels the full dynamics of that sensation.
He can point his toes down like a ballerina, and he can go the other way, pointing his toes to the ceiling. And he actually feels it as if his foot and ankle were intact and biological. Did you know that that was going to work? Like, do you remember after the surgery, the first time that he came to be fitted, what it was like when he stood up?
We hypothesized that he would have those sensations, and that he would be better able to control the prosthesis because of those muscle dynamics. But it was a hypothesis. And when we actually saw it with our own eyes, it was a remarkable day in the laboratory. We electrically linked Jim's AMI muscles via the electrodes to a bionic limb, and Jim quickly learned how to move the bionic limb in four distinct ankle-foot movement directions.
We were excited by these results, but then Jim stood up, and what occurred was truly remarkable. All the natural biomechanics mediated by the central nervous system emerged via the synthetic limb as an involuntary, reflexive action. Here's Jim descending steps, reaching with his bionic toe to the next stair tread, automatically exhibiting natural motions without him even trying to move his limb.
Because Jim's central nervous system is receiving the proprioceptive signals, it knows exactly how to control the synthetic limb in a natural way.
Now, Jim moves and behaves as if the synthetic limb is part of him. For example, one day in lab, he accidentally stepped on a roll of electrical tape. Now, what do you do when something's stuck to your shoe? You don't reach down like this. It's way too awkward. Instead, you shake it off. And that's exactly what Jim did after being neurally connected to the limb for just a few hours.
What was most interesting to me is what Jim was telling us he was experiencing. He said, the robot became part of me. The first few movements were like, oh, wow. Again, here's Jim Ewing. My brain got all excited. My muscles in my leg kind of got all excited, like, hey, there's something happening.
And this phenomenon that Hugh later called neural embodiment occurs. Your nervous system, your body, your brain recognizes this piece of equipment as being part of you. You have embodied this thing and it just adopts it and starts using it as if it belongs there.
And actually, it got to the point where, while we were doing a bunch of tests, they would occasionally have to turn the robot off to reset things. And it got to be, not a physically painful, but kind of an emotionally painful experience every time they turned it off. And I asked them, I said, you have to warn me when you're going to turn it off, because it's jarring to all of a sudden lose my foot again.
That's how much my body had become accustomed to it, and it did not like it when it turned off.
I don't know if it's just because I was getting ready to talk to you, Hugh, but I suddenly noticed how many people in my neighborhood are amputees. And I read that something like nearly 2 million amputations, something like that, are performed in the US every year. Oh, because of the precipitous increase, sadly, of diabetes?
Extreme diabetes, sadly, often leads to the need to amputate a limb, typically a leg. So the numbers are climbing higher and higher because of the increase in diabetes. So presumably you expect demand for this procedure to skyrocket. For sure. It's been eight years since the procedure was first performed. Since then, how many people have had the surgery?
Of the surgery itself, over 100 people to date. And at all levels, you know: below the knee, above the knee, below the elbow, above the elbow.
The electromechanical integration into those new surgically constructed tissues will take longer. But I think in about five years from now, in a commercial setting, the full bionic reconstruction can be happening clinically, which is quite exciting. So how many people could qualify for this kind of a prosthetic?
A lot of people qualify. We can apply the surgical technique in an acute case at the time when the limb is amputated. We can also pursue these regenerative and surgical and electromechanical strategies as a revision. So it is possible for someone like myself that already has an amputation to undergo this reconstruction surgery to go from the past to the future.
Wow. I mean, in your TED Talk, you said that you, because of the surgery that you received, you were not a good candidate for this technology. Has that changed? It has changed. And I'm actually thinking carefully about when to go under the knife, when to receive these interfaces for myself.
Right now, if one chose to do this, how much would they have to pay? Let's say they had the surgery covered, but for the limb and the technology, how much would that cost? In five years when this entire bionic reconstruction is made available clinically commercially,
You know, for a bionic foot and ankle with the magnets and the surgery and whatnot, it's on the order of $100,000, including the surgery and the robotic components and the sensing components and the computer components. So expensive. Well, it's on par with other surgeries.
The bionic legs that I'm now wearing cost about $40,000, about the cost of a car. What is the value of being able to walk? $40,000 sounds like a lot, but it's pretty nice to be able to walk across the room.
I listened to a podcast that you did with someone who has been part of one of your research projects. And the two of you were talking about how much fun it is to have a prosthetic, and this idea that we have normalized the quote-unquote normal body, and that we hopefully are entering an era where this merging of bodies and machines is
not only functional, but like sexy even. People often talk about the example of eyeglasses. The glasses are a prosthesis, but now it's a fashion accessory. When technology really works, when we're able to rebuild bodies and give people back their freedom,
give people back their ability to dance and to run with the expression that they want to put out into the world. Will these new bodies express themselves in terms of good design and aesthetics? Absolutely. It's going to be an interesting era when part of our bodies begin to age and deteriorate and another part can be potentially continuously upgraded.
Correct. My own body, my bionic legs, are upgraded every five years, and my biological body continues to get worse and worse due to age-related degeneration.
Due to age-related degeneration, and we're getting old, aren't we? That's right. So you're right, that is very interesting. In a sense, the bionic part of my body is immortal, but my biological body obviously won't be able to keep up unless there are major breakthroughs in aging. I mean, this speaks to your philosophy that we should, and this is the word you used if I'm correct, become cyborgs, right?
Well, I don't know about good, but I do think it's part of humanity's natural progression to go from developing and using tools that are separate from our body to a more profound integration. Half of my lab is focused on technology to augment human capability beyond innate physiological levels. So we're building exoskeletons.
that if I gave you one of our exoskeletons, you'd be able to jump higher, run faster, walk better. So that type of technological power, if you will, will be very popular and will become quite pervasive in society. I predict 10 years, certainly 20 years from now, when you walk down the streets of major cities in the world, you'll routinely see people wearing bionics that are augmenting their capabilities.
So, like, the UPS guy is gonna be flinging packages with no problem. Absolutely. If we don't have humanoids, they'll bring packages, right? Or aerial vehicles, but if it's still humans, the humans will definitely be augmented, absolutely.
That was Hugh Herr. He is an engineer and biophysicist at MIT, where he co-leads the Yang Center for Bionics. Earlier this year, Hugh and his team published a study of patients who have received this new procedure. We will link to the study and Hugh's talks at ted.npr.org. Special thanks also to Jim Ewing for sharing his story. Jim and Hugh, by the way, are both still climbing.
On the show today, Augmenting Humans. I'm Manoush Zomorodi, and you are listening to the TED Radio Hour from NPR. We'll be back in a minute.
It's the TED Radio Hour from NPR, I'm Manoush Zomorodi. On the show today, Augmenting Humans. We heard how Hugh Herr is taking prosthetics and integrating them into the body in new ways. Now we want to turn to technology being developed to mimic another part of our anatomy: our skin.
This research is in its early days, because replicating the sensations that the body's largest organ sends to our brains is incredibly difficult.
So the skin is a complex system, where there are a lot of things that are actually all working together at the same time. This is Anna Maria Coclite. She is a material scientist who has spent the past decade trying to replicate skin artificially. This idea isn't new. Artificial skin has been around since the 1980s. It's often used for burn victims. But there's something missing from today's artificial skin.
There is the possibility to reconstruct the skin, more or less, so that it kind of looks similar to before the burn, but still, the sensation is lost. The warmth of your coffee cup in the morning, water running over your hands as you wash them, or all the different textures you touch: smooth, rough, soft, or sharp. All of those sensations are captured by your skin's receptors.
We have receptors that are for strong touch, for light touch, for temperature. We have millions of receptors. And all day long, those millions of receptors are bombarded with all kinds of information.
And then they transmit this information through electrical stimuli to the brain thanks to the nerves, nerve connections. So it's a very complex system. And because the skin is so complex, replicating all those sensations has been really difficult until now. This is a piece of skin, or artificial skin.
Here's Anna Maria Coclite on the TED stage, where she unveiled smart skin. We have, for the first time, produced an artificial skin that can respond at the same time to three stimuli: touch, so force, temperature, and humidity. And it can do this at an unprecedented resolution.
So it's a very tiny device, and this means that it can sense objects that are actually smaller than the objects that can be sensed with our skin. So first of all, imagine burn victims.
If the burn is very deep, it goes down to the lower level of the epidermis, and this makes patients lose sensation. If one could make completely artificial skin, then, you know, this artificial skin could be applied as a patch in the area of the burn and give back the sensation to the people who have lost it.
So, let's talk about what you're doing in your lab. If I came into your lab and you showed it to me, what would I see? This artificial skin is actually thinner than the cross-section of a hair.
So it's basically impossible to see and impossible to really feel it when you touch it. So it takes the properties and the characteristics of the support material. So if we deposit it on top of a glove, it will look like a glove.
We have even deposited on top of this transferable tattoos, you know, the type that kids use. And then what you see is just really the tattoo paper. So it's so tiny that you don't see it and you don't feel it, but it takes the shape of the support material.
The artificial skin is made of a bunch of nanoscopic cylinders. This is the architecture of the artificial skin, so we are really able to control the thickness and the chemical composition of the material at an atomic level.
The inner core of each cylinder is filled with a polymer that gets bigger when exposed to a stimulus. And the outer part of the cylinder is made of something called a piezoelectric material. A piezoelectric material is a material that, when it is compressed,
produces electricity. So when the cylinder is touched or exposed to heat, for example, the polymer on the inside kind of puffs up and compresses the material on the outside, and boom, this produces a little current. From there, each of these cylinders can be connected to a series of electrodes, and then we measure the electricity at each of these locations.
And similar to how our own skin sends information about what it's feeling to our brains, the artificial skin sends information to a computer. And that's where we read this electric signal. But then, you know, this signal can also be transmitted wirelessly to, for example, a neuroprosthetic. And this is how we actually intend to transmit it to the brain. But that will be a future development.
You mentioned prosthetics. Earlier in this episode, we talked to Hugh Herr at MIT. I'm sure you're familiar with his prosthetics work. How would smart skin be an added value, I guess, or be important to someone who needs to wear a prosthetic?
Yeah, so when this type of artificial skin is added to a prosthesis, we could produce electrical signals that send the information directly. They could either stimulate the rest of the arm or of the leg, or they could transmit the information to a neuroprosthesis in the brain, and
then help the patient recognize also the characteristics of the objects that they are touching. So if they had a prosthetic foot, they would know if they were walking on hot gravel or... Yes, exactly. So a prosthetic hand, for example, would feel a hot cup or a cold bottle of beer and would feel the difference.
Another interesting field of application would be robotics. Nowadays, humanoid robots are used in many fields, for example in medicine, but also in households. And these robots are exposed to several stimuli, several interactions with the environment and with humans, and sometimes they have too many inputs at the same time. And this is the reason number one for robot failure.
So imagine a future where actually a robot could be a bit more sensitive, a bit smarter. This would lead also to a higher safety of this technology.
I mean, would that be, this sounds kind of like science fiction, but in the future that you have a burn and you just put on like a temporary tattoo over that part and it connects to your body? Something like that, yes. It could be a temporary tattoo or a patch, you know, that can be applied on the body, and then there could be different ways of detecting the electrical signals. It could even be that it is just connected to an app on the smartphone, and then maybe the app is, I don't know, sending a message or a warning sound if the temperature goes above a certain level. There could be different options.
In terms of the drawbacks to this, I can only imagine this is expensive? Well, yes and no, actually. In the sense that the instruments to deposit these materials are expensive at the beginning, but then the amount of material that is produced is so tiny that when you do a calculation per centimeter squared, the price is not that high. Oh, interesting. OK, so this could be something that is accessible to people.
Yes, we would really like to keep it accessible to people. It's a challenge. And therefore, it's an interesting project, you know, from the scientific point of view, from the technological point of view. And yeah, this is what keeps me going.
That's Anna Maria Coclite. She's a materials scientist and a professor in the Department of Physics at the University of Bari in Italy. You can see her full talk at TED.com.
So we've talked about adding technology to our lives in the form of robots, and prosthetics, and smart skin. But technology is also being used to enhance and alter organisms that live inside the human body. The biochemist Jennifer Doudna won the Nobel Prize for discovering the gene-editing technology CRISPR.
Now she's using that technology to find a cure for diseases like asthma and Alzheimer's by manipulating the microbiomes that live inside our guts. Here she is, explaining how on the TED stage in 2023. The essence of being human is that we solve problems. And when we're faced with enormous problems like disease and climate change, we need to solve them by collaboration.
I'm excited to tell you about a new kind of collaboration that will absolutely create solutions to these big problems. It's a collaboration that's unexpected because it's between humans and the tiniest organisms that populate our planet, the bacteria and other microbes that live in, on, and around us.
Bacteria may be small and unseen, but they often have inspired transformative innovations, including the one that has become the cornerstone of my own research. Over the past decade, I've been at the forefront of developing a revolutionary technology called CRISPR.
that has come from the study of how bacteria fight viral infection. CRISPR is amazing because it allows us to precisely edit the DNA in living organisms, including in people and plants. With CRISPR, we can change, remove, or replace the genes that govern the function of cells.
This means that we now have the ability to use CRISPR like a word processor to find, cut, and paste text.
CRISPR, amazingly, has already cured people of devastating disorders like sickle cell disease, and it's created rice plants that are resistant to both diseases and drought. Incredible, right? But the next world-changing advance with CRISPR will actually come from editing genes beyond just individual organisms.
We now have the ability to use CRISPR to edit entire populations of tiny microbes called microbiomes that live in and on our bodies.
For decades, scientists studied bacteria one organism at a time as if each type of bacteria behaved independently. But we now know that bacterial behaviors, both good and bad, result from their interactions within complex microbiomes. In humans, dysfunctional gut microbiomes are associated with diseases as diverse as Alzheimer's and asthma. And in farm animals, microbiomes produce methane, a powerful contributor to climate change.
But when they're healthy, both human and animal microbiomes can actually prevent disease and reduce methane emissions. So to harness these benefits, we need a way to precisely and reproducibly control these microbial communities.
So why have microbiomes been difficult to control in the past? It turns out that microbiomes are very complex and they're difficult to manipulate. Antibiotics affect the entire microbiome, and their overuse can lead to drug resistance. Diet and probiotics are nonspecific, and they're often ineffective. Fecal transplants face various challenges to both effectiveness and acceptance.
But with CRISPR, we have a tool that works like a scalpel. It allows us to target a particular gene in a particular kind of cell. With CRISPR, we can change one kind of bacterium without affecting all the others. Another challenge is that less than 1% of the world's microbial species have been grown and studied in the lab.
Fortunately, we can now access the other 99% due to the pioneering research of my colleague Jill Banfield and her breakthrough technology metagenomics, which is a tool that allows us to figure out what species are present and what they're doing in a microbial community. Metagenomics creates a detailed blueprint of a complex microbiome, and that means that we can use it to figure out how to use gene editing tools in the right gene, in the right organism.
You might be wondering how we can take this new knowledge and harness it to solve real-world problems. Well, we're bringing together these two breakthrough technologies, metagenomics, and CRISPR to create a brand new field of science called precision microbiome editing. This will allow us to discover links between dysfunctional microbiomes and disease or greenhouse gas emissions. We can develop modified and improved microbiome editors and show that they're safe and effective.
and then we can begin to deploy these optimized solutions that will be transformative in the future. So, how does this affect our health and the health of our planet? Specific microbiome compositions in livestock can actually reduce methane emissions by up to 80%, but doing that today requires daily interventions at enormous expense, and it just doesn't scale.
But with precision microbiome editing, we have an opportunity to modify a calf's microbiome at birth, limiting that animal's impact on the climate for its entire lifetime. In human health, asthma affects up to 300 million people around the world, a number that grows by 50% each decade, and it disproportionately affects lower-income children.
Our team has identified a promising link between a molecule produced in the gut microbiome and asthma development. With precision microbiome editing, we could offer a child at risk for asthma a non-invasive therapy that would eliminate asthma-inducing molecules, changing her life trajectory.
And what's really exciting is that these same approaches in the future could help us treat or even prevent human diseases that are linked to the gut microbiome, including obesity, diabetes, and Alzheimer's. I think it's fascinating that we can now use CRISPR to edit the same tiny organisms that gave us CRISPR. In doing so, we're collaborating with the ultimate partner, Nature.
Together, we can use CRISPR-powered precision microbiome editing to build a more resilient future for all of us. Thank you very much.
That was Nobel Prize-winning biochemist Jennifer Doudna. You can watch all of her talks at TED.com. Thank you so much for listening to our episode, Augmenting Humans. It was produced by Katie Monteleone, Rachel Faulkner White, James Delahoussaye, and Fiona Geiran. It was edited by Sanaz Meshkinpour and me.
Our production staff at NPR also includes Harsha Nahata and Matthew Cloutier. Our audio engineers were Patrick Murray and David Greenburg. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Roxanne Hai Lash, Alejandra Salazar, and Daniella Balarezo. I'm Manoush Zomorodi, and you've been listening to the TED Radio Hour from NPR.