In his 2008 book Outliers: The Story of Success, Malcolm Gladwell popularized the 10,000-hour rule. The rule suggests that achieving world-class expertise in any field requires just 10,000 hours of deliberate practice.
Gladwell's book was an international bestseller and the rule became widely known. He spoke about it in news segments and at conferences, and dozens of self-help books, motivational speeches and business talks have referenced it. Chances are you've heard of the 10,000-hour rule before, and chances are you believed it.
See, there is a psychological bias that makes the 10,000-hour rule more believable. This bias turned a lying fraudster into a millionaire health promoter. It caused one of the most infamous oil spills in history, and it has been used by marketers to sell for thousands of years. All of that coming up.
So, you want to be a marketer. It's easy. You just have to score a ton of leads and figure out a way to turn them all into customers. Plus, manage a dozen channels, write a million blogs, and launch a hundred campaigns all at once. When that's done, simply make your socials go viral and bring in record profits. No sweat. Okay, fine. It's a lot of sweat. But with HubSpot's AI-powered marketing tools, launching benchmark breaking campaigns is easier than ever. Get started at HubSpot.com slash marketers.
The 10,000-hour rule is widely believed. It's the idea that you can master any skill with 10,000 hours of practice. Yet the research it's based on was limited to just violinists, it didn't measure their skill, and it didn't even mention 10,000 hours.
In the 2016 Brexit referendum, buses paraded the claim that European Union membership costs the UK £350 million per week. The actual figure was £250 million, or £120 million after deducting the amount the EU gives back to the UK.
Volkswagen marketed its diesel vehicles as environmentally friendly, labeling them as clean diesel. However, it was later revealed that the company had installed software to cheat emissions tests, allowing cars to emit pollutants far beyond the legal limits.
Danone's Activia yogurt was advertised as being clinically and scientifically proven to boost the immune system and regulate digestion thanks to its special bacterial ingredients. These claims were unproven, leading to a class action settlement of $45 million and orders to remove the claims. There are countless examples of misinformation like this, but what drives people to believe it?
To find out, I spoke with Alex Edmans, a professor of finance at the London Business School and the 2021 Professor of the Year. My name's Alex Edmans. I'm a professor of finance at London Business School, a non-executive director, and author of a book, May Contain Lies, about bias and misinformation. Alex wrote May Contain Lies to address the problem I posed at the beginning of the show: what makes people believe misinformation?
So out of the spectrum of academics, I really like to apply research to real-world problems rather than just sitting in the ivory tower. And so I'm often engaging with investors and policymakers and executives about the findings of research on business. But what I found was that how people responded to the research depended not on its accuracy, but on whether they liked its findings. If it was research with a finding that they liked, it was the world's best research.
That's going over the top. I wouldn't even call my own papers the world's best papers. And if it was something which gave an uncomfortable truth, then they were quick to dismiss it as being purely academic and irrelevant. People believe things that they like. It's not a shocking revelation, but it's one that causes us to believe lies and misinformation, something that Alex has experienced firsthand. A number of years ago, there was a House of Commons inquiry into corporate governance, the way companies are run,
prompted by scandals like BHS and Sports Direct. And what happened is they sent some questions that they wanted to discuss. You can choose if you want to provide some written evidence to feed into those questions. And because I work in corporate governance, I thought this was something I could bring not only my own research to, but other research I was aware of. I did that, and based on my submissions, they kindly invited me to testify orally in the House of Commons. I was a bit nervous, so I got to the session early so that I could swot up on my notes. But I had one ear open for the session before me, and that ear pricked up (actually, both of my ears pricked up) because I heard some research that sounded really striking. The witness before me said the greater the gap between the CEO's pay and the workers' pay, the worse the company's performance, and conversely, if there's a smaller gap, if you're paying your employees fairly,
the company performs better. And I really wanted it to be true, because a lot of my own work is about the importance of treating employees well. My first book, Grow the Pie, was about the power of employee satisfaction, among other things. And so I wanted to look this up. I wanted to find out what this paper was so that I could quote it myself. But when I found that paper, it said completely the opposite of what the witness claimed. It claimed that the higher the gap between CEO pay and worker pay, the better the company performance. So inequality is actually good for profitability. And I thought, well, have I made a mistake? Maybe I'm so nervous about my own session that I'm misreading this. But no, it was as clear as day. And then, digging a little more deeply on my iPad, I realised what had gone on: the witness had quoted a half-finished version of that paper. Now, the finished version was already out in 2013, and this hearing was in 2016. So they could have looked at the final findings, but no, they latched onto a half-finished version because it gave the message that they wanted to hear. And so the broader lesson that we can draw from this one episode is that we often hear the phrases "research finds that", "studies prove that", "evidence shows that", and we think that this is gospel. But you can almost always find evidence to support whatever you want to support,
even a half-finished version of a paper where the final draft shows completely the opposite. The committee Alex testified to treated that overturned study as gospel. Witnesses stated it was clear academic evidence. Alex writes that, partly due to this claim, the committee's report recommended that every large UK company disclose its pay gap, which eventually became law.
The takeaway isn't about pay gaps themselves, but the need to scrutinise evidence more carefully. Reports can support any opinion, even using debunked research. You can find evidence for any point of view. Do I want an excuse to drink loads of red wine this evening? I'm sure I could find a paper showing that red wine leads to a longer life. If I want an excuse to lie on the couch tomorrow morning and skip my gym session, I could find some evidence suggesting that exercise does not lead you to lose weight. So the mere existence of a study does not prove anything. What we really need to look at is the study's rigour.
We believe what we want to believe. We'll find evidence for what we want to believe, but disregard the same evidence if it states the opposite. The 10,000-hour rule is believed because people want to believe that the only thing separating them from greatness is a few years of hard graft. People don't want to believe that they'll never become world class at something because they're simply not talented enough. All of this is caused by a bias that many of you listening will be familiar with. It's the confirmation bias.
Confirmation bias is the idea that we have a view of the world, and if we see evidence which supports that worldview, we lap it up uncritically. Why? Not because we're bad people, not because we're intentionally biased, but because we're people. We're humans, and as humans we have a brain. One part of the brain is the striatum, and that is activated when you see evidence you like: it releases dopamine in your body.
And that means that we just don't question what we see. Alex shares a quote in his book from Francis Bacon, one of the pioneers of the scientific method. Bacon observed that the human understanding, when it has once adopted an opinion, draws all things else to support and agree with it. Here's Alex with a fairly shocking example of just this. So I start the book with the example of Belle Gibson. So who is she?
She's a happy young Australian. She lives in Perth. She loves skateboarding. But then, sadly, at the age of 20, she was diagnosed with brain cancer. And she tried chemotherapy and radiotherapy to fight it. It didn't work. So she turned to diet, clean eating, exercise, and sheer, sheer willpower. And miraculously, this led to a complete recovery. She was completely cured.
And this was inspirational. She was blogging about her experiences with clean eating and grit and willpower. And this led other people to also shun chemotherapy in search of a natural remedy such as clean eating. Belle's natural cure for cancer involved meditation, ditching meat and adopting a fruit and vegetable based diet. Her miraculous recovery influenced millions. In August 2013, Belle launched The Whole Pantry app,
which featured recipes, wellness guides and lifestyle support. The app hit number one on the Apple App Store in its first month, with 200,000 downloads. It was awarded Apple's Best Food and Drink app of 2013. Apple invited Belle to their headquarters to collaborate on a secret project,
which turned out to be the Apple Watch. In 18 months, the app and book earned Belle approximately £218,000, much of which she claimed to donate to charity. This story sounds impressive, it sounds wholesome, but there's a dark side. Now, the sad thing was that this was all a lie, because Belle never had cancer. She made up the cancer story and she made up her cure. But why is it that supposedly sensible people
were listening to the advice of a blogger rather than their oncologist, and ditching chemotherapy for clean eating? It's confirmation bias. They just wanted it to be true. We are told as kids, you can do anything you put your mind to. So we would like to believe that willpower is enough to defeat a serious illness like cancer. We are also predisposed to believe that natural stuff is good. We'd like to believe that the solution is clean eating,
rather than chemotherapy drugs fabricated by a giant corporation. And so even though, in the cold light of day, it seems obvious to check the facts (was this woman actually sick? Is there general evidence beyond this one case that clean eating works?), we don't think with a clear head when we have dopamine going through our bloodstream. If you're a cancer patient and you're having chemotherapy, and it isn't really working, at least not obviously, and you're suffering from some symptoms,
yes, when there is this alternative, then you might think, well, let me just believe this and latch onto it. Belle's claims had tragic consequences. One cancer patient following Belle's advice died within months, prompting a whistleblower to expose her. Belle's influence could have caused a hundred more deaths had she not been uncovered. But confirmation bias doesn't only influence medical decisions; it can shape criminal investigations as well.
Alex writes how U.S. prisoners have collectively spent 30,000 years in prison for crimes they were later exonerated of. Criminology professors Kim Rossmo and Joycelyn Pollock analyzed 50 serious wrongful convictions that were later overturned, to uncover what went wrong.
Now, while factors like media pressure, unreliable witnesses and flawed forensics contributed, none of those factors appeared in more than half of the cases. Confirmation bias, on the other hand, was the leading cause behind the wrongful convictions: it was found in 74% of the cases. We've seen examples of confirmation bias in action, but I wanted to highlight a reliable study demonstrating it, so I asked Alex if he could share one.
Yes, absolutely. There's some really nice evidence where researchers take some participants and strap them up to an MRI scanner, so they can look at brain activity. They see how your brain responds to seeing information you like and information you dislike. And this shows how deeply wired these biases are within us. So we've already shared some evidence as to what happens when we see something we like: it lights up the striatum, and this leads to dopamine being released.
But what about when we see evidence we dislike? There are three neuroscientists, Jonas Kaplan, Sarah Gimbel and Sam Harris. What they did was take some students and find out their prior beliefs on certain issues, such as gun control or abortion. And they gave them some statements, some of which were politically charged, such as on gun control. And they would then provide some evidence against that statement.
For example, it might be: a new study finds that gun control does not actually reduce crime. And so what would the reaction of a person be to that? Well, they didn't like it, because they're pro gun control. What they found was that the amygdala in the brain lights up. Well, what's the amygdala? That's the part of the brain which is activated when you are, say, attacked by a tiger and you have a fight or flight response.
So this is why we will dismiss evidence we don't like. This might be why we block or unfollow somebody on social media who shares a different opinion. It may well be why we attribute negative motives to these people: they might be climate change deniers, or racist, or sexist. We just don't want to listen to what they have to say. Now, what was interesting is that the same researchers ran a control where they took a non-political statement and also contradicted it.
For example, they would say Thomas Edison invented the light bulb, and then they came up with some evidence suggesting that he didn't invent the light bulb, and there was no reaction from the amygdala. This is really important, because it means that even if you have a strong, deeply held belief that Thomas Edison invented the light bulb, which we're all told, if it's not something which is consistent with your value system, you don't mind it being contradicted.
So it's not that people just don't like listening to dissenting opinions; it's that when something contradicts your particular values, maybe on gun control or abortion, or the power of clean eating and mindset, that is where you will be prickly or non-receptive to contradictory opinions.
When people's values are challenged, their brain's amygdala, responsible for the fight or flight response, activates and causes them to react as if they're facing a physical threat. But this reaction only occurs when someone's values are challenged. For example, I experience confirmation bias when watching my favourite football team, but not when I'm watching a triathlon that I have no personal interest in.
This study, like all the other studies included in the podcast, is cited in the show notes. So if you want to double check it for yourself, and I advise you to, please do feel free. Confirmation bias led young cancer patients to abandon chemotherapy. It caused witnesses to give inaccurate testimonies, and it has contributed to 74% of wrongful criminal convictions. But could it also mean that the 10,000-hour rule is flawed? Well, find out after this short break.
Create Like the Greats, hosted by Ross Simmonds, is brought to you by the HubSpot Podcast Network, the go-to audio destination for business professionals. In each episode, Ross dives into the stories behind some of history's greatest creations and creators. He unpacks the strategies, processes, and lessons that shaped them. His episodes are engaging, his insights are practical, and he's been living the principles he shares for over a decade.
If you enjoy exploring creativity, the history of creators and actionable advice, this podcast is for you. Listen to Create Like the Greats wherever you get your podcasts.
Hello and welcome back to Nudge with me Phil Agnew. The 10,000 hour rule is widely recognised. It's the concept that achieving world class expertise requires just 10,000 hours of practice. It was popularised by Malcolm Gladwell and has since become a cultural touchstone. Here is Gladwell explaining the rule during an interview on CNN.
And you talk about the 10,000-hour rule, that it's not just a matter of, well, this person's a genius, this person has amazing ability. It is actual practice and hard work.
You know, so a bunch, a group of really brilliant psychologists in the field of expertise research have sat down and tried to figure out: how long do you have to work at something before you become really good, right? And the answer seems to be, and it's an extraordinarily consistent answer in an incredible number of fields, that you need to have practiced, to have apprenticed, for 10,000 hours before you get good.
That statement feels accurate. It's something we intuitively believe. Even Alex, who literally wrote the book on misinformation, admitted that when he first heard the rule, he believed it too. This is the idea that you can become an expert in anything as long as you put in 10,000 hours of practice. And I first heard this because I was at a talk given by a Cirque du Soleil acrobat.
And he was asked, well, how did you become such an expert in this? Were you born double-jointed? He said, no, it was just practice. And this is the message that I, and probably lots of other people, would like to be true. We tell our kids, you can do anything you put your mind to; practice makes perfect. You'd love to believe that you're not a slave to genetics, that you can choose whatever career you want as long as you have passion and drive. And I started to teach this to my students.
So I was a professor at Wharton at the time, and I teach boring finance for 11 out of the 12 lectures. And then in the 12th lecture, in the final half hour, I give some life advice, a bit like professors giving their sort of last lecture. And I would say, well, over the next two years here during your MBA, just try to push yourself outside your comfort zone, challenge yourself, learn something new. And even if it's something you don't think you're naturally talented at, if you put in the time you'll get better at it; just look at the 10,000 hours rule. And then it was only a few years later, when I had to give a public lecture devoted to that rule alone, that I looked at the research more carefully. And I found that the evidence behind the rule was very weak, far weaker than Gladwell claims and far weaker than most people who quote it realise. Why was it weak? Firstly, it had nothing to do
with many of the things that people try to improve at: the study that he quotes was specific to violin playing. Now, Gladwell claims this is a universal rule. You can become an expert in anything, such as neurosurgery or chess. You could play for the England football team. You could have a number one hit as a singer. But just because it works in violin playing, it doesn't mean that it will work in football, where you might need some natural talent. The same is true in singing.
But then, is the evidence for violin playing actually strong? No, because what the study did is take some students who were in the Berlin Academy of Music, aged about 18, and ask them how much they practiced from age five. Now, age five was about 13 years earlier. Can you remember what you did even last week, let alone 13 years ago?
And so here is a correlation issue. If you know that you are a great violinist, you will probably say, yeah, I probably practiced 20 hours a week when I was five years old. If you know that you're a mediocre violinist, you'll probably say, I didn't practice that much. You don't want to admit to yourself that I practiced a lot and it was to no avail. So it was probably your current success which drove your self-reported practice rather than your practice
driving your success. But people don't bother to read the underlying research. When you see something you want to be true, written by an author that you like, such as Malcolm Gladwell, you lap it up and try to act on it. And there are some people who do try to act on it. I go through in the book an example of a guy who decides to become a professional golfer. He quits his job and just tries to spend 10,000 hours playing golf.
And then he has to quit after about 6,000 hours because he has a repetitive strain injury. And that came from not basing his plan on science, which stresses the importance of cross-training, and instead just misapplying Gladwell's rule, which was based on shaky evidence to begin with.
Gladwell implied that 10,000 hours of practice are sufficient for success, suggesting that with enough effort anyone could achieve true expertise, even without natural talent. However, the original study that Gladwell cites suggested that these hours of practice (it never actually specified 10,000) are only necessary for success, not sufficient for it.
What the authors of the paper declared was not that you just need practice; it's that you need both practice and talent. Rather than guaranteeing success through practice alone, the study simply highlighted that you cannot be good unless you practice.
But practice is obviously just one component, and simply practicing for 10,000 hours is just not enough to ensure success. The book that popularized the rule, Outliers, became an instant international bestseller, debuting at number one on the New York Times bestseller list.
Thousands of highly intelligent people, from CNN journalists to business leaders, embraced the rule due to confirmation bias. The idea of the 10,000-hour rule just aligned with their existing beliefs. However, Alex explains that it wasn't just confirmation bias that made this rule so popular; another factor was at play: the narrative fallacy. Yeah, absolutely. So what is the narrative fallacy? This is where we try to weave a cause-and-effect relationship, a narrative, into something which might have completely different explanations. So let's take a concrete example. Why is Steve Jobs so successful? Well, if you were to read his biography by Walter Isaacson, it argues that one big driver was the fact that he was adopted. Because he was adopted, he felt abandoned by his birth parents. This drove him to want to prove himself, and he wanted to work really hard
and be obsessive about certain things, like the quality of the design, and this drove him to become successful. And so that is a narrative. That is a story. Stories are really powerful. This is why we had the oral tradition; long before we had writing, people learned through stories. And it's a story which tugged at our heartstrings: we root for the underdog.
We like the fact that the person who made it rich was not somebody born with a silver spoon in his or her mouth. It was somebody who had to overcome adversity, such as being abandoned by your birth parents. But the problem with this explanation is that it's just not true. Steve Jobs himself said, there's some notion that I was abandoned and I had to prove myself, but no, I've always felt loved. I've always felt accepted. But if you were to weave a story that people believe in,
a story that fits confirmation bias, then people will lap it up and accept it, and therefore your book will sell a lot, which is the case for Walter Isaacson's biography.
Alex says the narrative fallacy is our temptation to see two events and believe that one caused the other, even if there were different causes or no cause at all besides luck. A 1975 study revealed how we create these explanations for random events. Organizational psychologist Barry Staw divided students into groups and gave them identical financial data for a fictitious company.
He then asked the groups to forecast the company's future sales and profits. Afterwards, he told some groups that they were accurate and other groups that they were incorrect. He did this completely at random, so some groups were randomly told they were right, and other groups were randomly told they were wrong. In reality, there was no right or wrong answer.
He then asked the teams to evaluate their group dynamics. Now, the groups labeled as successful described themselves as communicative, cohesive and motivated, while those labeled as unsuccessful reported really poor group dynamics, saying that they didn't communicate well, that they had a bad leader, and all these things. The twist here is that there was no actual difference in the performance of the groups. The outcomes were fabricated.
Yet the teams reverse-engineered narratives to explain their results. The supposedly successful teams attributed their success to openness and collaboration, while the failing teams blamed their failure on disagreement or lack of focus. This shows how easily we create stories to justify outcomes, regardless of the truth.
Like the students creating narratives to explain random outcomes, we craft stories attributing Steve Jobs' success to being adopted or believing the 10,000-hour rule, regardless of the evidence. This confirmation bias and narrative fallacy can lead to some monumental business mistakes, one of which was the largest marine oil spill in history.
And so this was the Deepwater Horizon disaster, where an oil rig which BP leased exploded, killing engineers and damaging huge amounts of coastline and ocean. So why is it that this disaster happened? Why didn't they do checks before removing the oil rig, to make sure that everything was safe? Well, the answer is they did do the checks. They did the standard check, which is known as a negative pressure test,
where you remove the rig, you bleed the pressure, and the pressure should go all the way to zero and stay at zero. Now, they tried this once. The pressure then rebounded to a huge number. They tried it a second time, a third time; every time, the test failed by a huge margin. But people just did not want this to be true. And so they invented an explanation
for why the test may have led to this negative result. They claimed something known as a bladder effect. They used a lot of engineering mumbo jumbo to invent this reason why the test was not reliable. And so this gave them an excuse to run a quite different test, and Deepwater Horizon passed that test. They thought that the rig was safe, the well was safe, and it ended up exploding and leading to all of these disasters.
Alex says this is an example of blanket skepticism, rejecting claims we dislike by inventing alternative explanations. Instead of evaluating evidence objectively, we engage in motivated reasoning, grasping at far-fetched theories to justify our initial convictions and dismiss the evidence. So why do I think this is another relevant example? Because it highlights, well, who suffers from blanket skepticism?
It might actually be the smartest people. Why? Because smart people are able to engage in what's known as motivated reasoning. They're able to come up with ways to dismiss evidence they don't like and invent things like a bladder effect. And I think this is so striking because sometimes listeners, or readers of my book, might think, well, okay, these are biases, but those are for less intelligent people. I'm a sophisticated executive. I will never make this mistake.
But actually, what the evidence suggests, and I cite some evidence within the book, is that smarter and more sophisticated people suffer more from blanket skepticism. Why? Because they can invent explanations for a particular finding. And I go through other examples, not just Deepwater Horizon. We have Silicon Valley Bank, where the models predicted the bank would crash if interest rates rose, and they said, well, let's just change the model and put in different assumptions. And there were people who were warned that the subprime crisis would happen, that they were too heavily into subprime loans. Again, they were able to give explanations for why this was not a bubble. So in fact, sometimes the smartest people might suffer most from blanket skepticism, because they're most able to conjure up all sorts of explanations.
Confirmation bias shapes the world we see. It fueled belief in the 10,000-hour rule. It distorted our understanding of major events like Brexit. It enables false advertising from companies like Volkswagen and Danone, and even had life or death consequences in Belle Gibson's cancer claims. Confirmation bias also impacts criminal investigations and drives us to invent stories to explain random outcomes, like crediting success to unrelated factors. But how can we fight back against confirmation bias?
To find out, I asked Alex. I think the first step is to recognize it. Just as the first step to combat an addiction is to recognize that we have an addiction, the same is true for confirmation bias. You might think, well, I'm a rational, successful person, how could I ever be biased? Otherwise I would not have got to where I am. But everybody is biased. I'm biased on this topic, despite studying it all the time. So then what does this mean if we see a study we like?
One useful practical tip to deal with confirmation bias is to imagine the study had the opposite result. How would we try to knock it down? This makes sure that we are not accepting something just because we like its finding. So let's give an example. A lot of my work is on the benefits of sustainability. My first book, Grow the Pie, was about how sustainable companies do better.
And so if a study was released which said sustainable companies outperform, I would want to share it and shout it from the rooftops. Now, what does the idea of imagining the opposite imply? If the study had the opposite result, that sustainable companies perform worse, that would trigger my amygdala. I would start trying to scrutinize it. I would ask, well, how many companies did they look at? Was it just one or two? Then it could be outliers.
Did they only measure performance over two months, when what we care about is long-term performance? How did they actually measure sustainability? Were these companies' self-reports of how sustainable they are, or hard data on actual actions they're taking? Did they control for industry? Maybe sustainable companies are in renewable energy and they just happened to underperform because Trump was elected; that's nothing to do with the long-term prospects of the sustainability sector.
And so here I've already come up with five or six different explanations. Then ask yourself: do these explanations still hold even when the evidence is in my favour? So in the pandemic, there were people saying, oh, sustainable companies have performed really, really well; this is evidence that sustainability pays off. But this was over just a very short period of two months after the pandemic broke out. This could just be a sector effect. Airlines did badly in the pandemic,
nothing to do with their carbon footprint, but just with the fact that people were not flying. And so the reason I think the idea of imagining the opposite is so powerful is that it shows that the skills and the tools of discernment are already within us. If my book said, oh, the way to avoid being misled by statistics is to do a PhD in statistics, that would be useless advice. Nobody has the time to do that. But what I'm highlighting is that just by applying the skepticism you already have,
you can combat a lot of sources of misinformation. I see this on LinkedIn: whenever a study is shared that people don't like the sound of, there is no shortage of comments as to why correlation is not causation, why the setting might be quite different from what you care about (violin playing versus, say, football), and why other factors might not have been controlled for. But those critical thinking faculties are suddenly turned off when we see a study we do like; we lap it up uncritically.
So again, the idea of imagining the opposite makes sure that we're just as discerning with something we do like as with something we don't.
To arm yourself against confirmation bias, just imagine the opposite. Rather than accepting Malcolm Gladwell's claim that 10,000 hours of practice leads to world-class expertise, imagine the opposite. The opposite of the 10,000-hour rule is that success is largely determined by natural talent, innate ability, or external factors, rather than deliberate practice and effort.
This challenges me because I want to believe that I can achieve greatness without relying solely on natural talent that, let's face it, I probably don't have. This discomfort makes me want to question the opposite belief. However, Alex argues that we should apply that same level of skepticism to ideas we agree with. Don't just accept something because it aligns with your beliefs. Instead, imagine the opposite and approach everything with the same critical thinking you use when questioning ideas you dislike.
10,000 hours of practice will not make me a world-class sculptor, it won't make me a world championship darts player or a world-beating boxer. Chance, natural ability and external factors are needed as well. So next time you hear something that you wholeheartedly agree with, take a step back, imagine the opposite and question it. It might just save you 10,000 hours.
Well folks, that is all for this week's episode of Nudge. A massive, massive thank you to Alex for coming on the show. His book, May Contain Lies, was easily one of the best books I read last year. I'd heavily recommend you pick it up. In fact, I have listed Alex's book in my new reading list. The reading list contains 25 books you should read in 2025 and five books I think you should avoid.
Alex's book is alongside some classic books you'll know by name, and some niche books that you won't have heard of. If you're looking for inspiration on what to read this year, do check out the Nudge reading list. Just go to www.nudge.kit.com forward slash reading list, or just click the link in the show notes. You'll find the reading list there. It is completely free to download. Okay. Thank you so much for listening. I will be back on Friday for another episode of Nudge. Cheers.