In this episode, Dr. Mark Hoofnagle discusses battling false science and how you can help patients discover the truth with some helpful communication tips.
“Real experts are wrong and will admit that they’re wrong. Actually one of the ways that you can identify an expert is, you can search them admitting to being wrong at some point in time. Because, you know, people who are never wrong aren’t experts, they’re jerks.” – Dr. Mark Hoofnagle
In episode 168, Master Certified Coach Jill Farmer welcomes Dr. Mark Hoofnagle to the podcast. Dr. Hoofnagle is an assistant professor of surgery in the section of Acute and Critical Care Surgery at Washington University in St. Louis. He has been writing about critical thinking and science denialism since 2007. In this episode, he explains the many ways the public is duped by misinformation, why people are so susceptible to it, and what we as doctors can do when our patients believe something other than sound science. He also gives us tips for meaningful communication with patients, or anyone in our lives who may be misinformed.
Mark Hoofnagle, MD/PhD, attended the University of Virginia School of Medicine, trained in General Surgery at the University of Maryland Medical Center and in Trauma and Critical Care at the University of Pennsylvania, and is now an assistant professor of surgery in the section of Acute and Critical Care Surgery at Washington University in St. Louis. He researches deep venous thrombosis in trauma and gun violence at Washington University, and has written about critical thinking and denialism since 2007. His 2007 essay "What is Denialism" on ScienceBlogs created a framework for understanding the public dissemination of anti-science narratives, has been cited in the International Journal of Public Health and Nature, and provides context for understanding current disinformation and misinformation campaigns.
How many coaches do you think your favorite actors and athletes have worked with over the years in order to achieve such extraordinary success?
What if you had a team of trusted thinking partners, experienced coaches who have helped hundreds of physicians overcome obstacles and who know what works?
What if you were part of a community of like-minded physicians from across the nation, across specialties and career stages? Your collective brain trust, sharing ideas and experiences, so you would no longer feel like an island, surrounded by people yet alone?
What if you had small group coaching sessions, could interact with your coaches and community as often as you wish, and had virtual courses at your fingertips 24-7 that could help you with things like time and stress management, resilience, and mapping out your future to achieve what matters most to you?
What if you could have all of this for less than the cost of a single 1:1 coaching session per month?
DocWorking THRIVE is the Physician Coaching and Community Subscription Package that Guides You as a Doctor to Embrace Life in the way that is most meaningful to you, integrate that with your work so you can truly thrive, and be a valued member of our growing private community of doctors from across the nation.
Join our community by clicking here.
Please check out our Trusted Resources! The Trusted Resources businesses are paid advertisers or have an affiliate relationship with DocWorking. An affiliate relationship means that DocWorking may receive a commission if you use their service by clicking through our link. Thank you for supporting businesses that support DocWorking’s mission of prioritizing Physician wellness.
Are you a physician who would like to tell your story? Please email Amanda, our producer, at [email protected] to be considered.
And if you like our podcast and would like to subscribe and leave us a 5 star review, we would be extremely grateful!
Some links in our blogs and show notes are affiliate links, and purchases made via those links may result in payments to DocWorking. These help toward our production costs. Thank you for supporting DocWorking: The Whole Physician Podcast!
Occasionally, we discuss financial and legal topics. We are not financial or legal professionals. Please consult a licensed professional for financial or legal advice regarding your specific situation.
Podcast produced by: Amanda Taran
Please enjoy the full transcript below
Dr. Mark: Real experts are wrong and will admit that they’re wrong. Actually, one of the ways that you can identify an expert is, you can search them admitting to being wrong at some point in time. Because people who are never wrong aren’t experts, they’re jerks.
Jill: Hi, everyone and welcome to DocWorking: The Whole Physician Podcast. We are so glad you’re with us today. You’re in for a great conversation. As always, the podcast is brought to you by DocWorking THRIVE. Go to docworking.com today, to learn how you could earn up to eight CME credits, learn how to be less stressed and happier in your life and work, and get coaching on our online platform as well. Go to docworking.com for more.
Today’s conversation, I think, is so relevant to so many physicians who are listening and practicing medicine in the state of the world we’re in as it relates to science denialism, an issue that has come to the fore especially in these last two years since COVID reared its head in February and March of 2020. It comes up a lot with my physician clients, and I’m really excited to have this conversation today with Dr. Mark Hoofnagle.
Mark is a trauma surgeon who specializes in acute care at Washington University in St. Louis at Barnes-Jewish Hospital. He also is a critical thinker who has an enormous Twitter following. He has written about critical thinking and denialism since 2007. His 2007 essay "What is Denialism" on ScienceBlogs created a framework for understanding the public dissemination of anti-science narratives, and it’s been cited all over the place, including in an article I saw most recently in a Canadian publication, which sparked the conversation between Mark and me that I thought would be wonderful to bring to all of you as well. Mark is a brilliant guy, a stand-up human, and a friend of mine. So, I’m excited for us to be able to have this conversation together. Mark, thanks so much for joining us.
Dr. Mark: Great. Thanks for having me.
Jill: Let’s go back to your writing about this in 2007, before COVID and COVID denialism were anything we would have been thinking about. What prompted you to begin thinking about science denialism and how it impacted the practice of medicine, 15-plus years ago now?
Dr. Mark: It’s funny. It’s how I wasted my time in graduate school. At the time, the Kitzmiller v. Dover case on evolution denial was going on. I was a graduate student writing a science blog, and I was drawing connections between the different denialist movements, which shared a number of similarities, for very specific reasons. Part of it is that these narratives are effective. They manipulate basic human heuristics, the way we encounter information and how we deal with contrary information. These are things built into our brains that allow us to reject what we don’t want to hear or don’t want to believe.
The other thing is that they are very purposefully and systematically exploited by a number of groups, because they know that these narratives and these dialogues work. That’s been happening ever since the 50s with the organized denial campaign by the tobacco companies against the science showing that tobacco causes cancer. In the 2000s, there were the evolution wars, and at the same time, with the rise of the internet, the organization of anti-vax groups, other anti-science groups, global warming denialism. It’s all of a piece.
I basically organized the framework into five tactics that are classically used by denialists, conspiracy being the most important one. If you need to deny a body of science, you basically have to come up with an explanation for why all these scientists are lying. These are wildly improbable conspiracies that would require an immense amount of organization; if you know any scientists, the idea that they would all get together and agree on one thing is laughable. That’s followed by cherry-picking of evidence and fake experts.
This is a really big problem in the time of COVID: a number of people have come forward as public experts on health who are not, but they say things that are congruent with people’s ideology, so they are accepted and given as much or more airtime than people who actually know what they’re talking about. After that, you have moving the goalposts. People don’t want to change their minds, so they’ll keep pushing the goalposts further and further and demanding more evidence. The reality is, it’s not evidence that’s going to convince them; they are committed to this line of reasoning. You’ll find that even as you satisfy their demands, they’ll keep moving the goalposts. And the last is a host of general logical fallacies that you encounter, which are effective on people but ultimately empty rhetoric.
Jill: Fascinating. Say a little bit more, if you would, about the last point you made about the fallacies. What would be an example of that or how does that play out in life?
Dr. Mark: The first one I encountered was with global warming denialism. Everybody was making a big deal that Al Gore’s house is this big, huge, inefficient mansion with a bigger carbon footprint than anything else. What does that have to do with whether or not carbon affects climate? Nothing. But it’s one of those things that gets people all riled up: “Oh, he’s a hypocrite.” It doesn’t matter if he’s a hypocrite; it doesn’t change the science. That’s a very common example. And what we’ve seen with Russian disinformation campaigns and COVID disinformation campaigns is a constant callback to what’s called whataboutism, where rather than talking about the issue at hand, they say, “Well, what about this?” It’s not actually relevant. Hypocrisy is ultimately irrelevant. So, I’d say whataboutism is probably the best current example of a logical fallacy, and it’s usually a tu quoque argument.
Jill: Yeah, and it’s kind of a red herring, right? Let’s have everybody focus on this as opposed to talking about the most important factor and we lose prioritization of what really matters. Is that what I’m hearing you say?
Dr. Mark: Yeah. It’s a distraction technique. It’s a way to derail the topic.
Jill: What do you think is different now in the way we consume media, the change in social media, and the number of people who engage in social media conversations? How has that changed the way these five tactics for science denialism and misinformation dissemination play out, compared with what was happening when you were first looking at this as a graduate student back in 2007?
Dr. Mark: A number of things have happened, with the rise of social media being the most obvious one. But we’re also seeing big, broad social changes that have been as much as 50 years in the making: major changes in the way we deal with misinformation, in trust in experts, and in the expansion of news media to cater to more and more outlandish views. There’s just been a huge broadening. The scope of the problem also reflects technological ease of access, combined with a number of legislative changes that have made it so misinformation isn’t just something you heard from your loser cousin in the garage; it’s actually profitable.
One of the things that unifies all of these various denialist organizations, groups, and individuals is that they often are running a side gig based on it. They can monetize this, the most common thing being nutritional supplements, and it’s basically a clearinghouse. You know that you have people who are susceptible to misinformation, so it’s really easy, at the same time, to sell them products that are ideologically congruent with what you’re up to. Alex Jones sells these various hyper-masculine bodybuilding products and survivalist gear. It’s all part of the grift: if you bring them in, you can also select for the people who will be susceptible to products sold on misinformation. That was really eased by the 1994 passage of, I think it was a Hatch-sponsored bill, that made it possible to advertise alternative medicines without really any kind of regulation to keep them under control. You can advertise supposed uses apart from what the product actually does, which once again made selling people snake oil profitable.
One of the biggest issues that I think funds this, drives this, and underpins this fraud is that we have relaxed regulatory authority and allowed broad-based consumer fraud to persist everywhere on the internet. It is very hard to get under control. Whenever you look at any of the people selling disinformation, there’s usually a monetary benefit behind it. The fake experts become celebrities; they get media access, they get speaking gigs. The people running websites are selling products, various supplements, survival gear, whatever. It’s a profitable business. They make money doing it, and there’s no organized effort or way to stop them.
Jill: Just interesting, and sobering, all at once for us to think about in these terms. You are a scientist; when you’re not saving gunshot victims’ lives on the table as an expert trauma surgeon, you also do a lot of research for a living. It’s a big part of your life as an MD/PhD. I know one of the things that has to be frustrating for you as a professional scientist is to hear how often people say, “Well, this scientist is saying this,” or, “There’s a study that shows XYZ as it relates to COVID.” There were a lot of different alternative treatments being thrown around, and I would listen to people in my life who would come up and say, “Well, the doctors are saying this, but there’s other research and there’s science that shows this.”
I know for a lot of my physician clients, it sometimes left them flat-footed when they were face to face with a patient, because they didn’t know immediately in that moment, even though what the person was saying felt crazy, how to counter that piece of alleged science with something more credible. Can you talk a little bit about that for us?
Dr. Mark: Yeah, that’s ivermectin, right? I’m a COVID physician, too. I do critical care, I take care of very sick people, and I have had patient families bring up things like ivermectin. I’ve had families refuse to have their loved one treated inside the hospital because they’re convinced that the hospital is killing COVID patients, that the problem is going to the hospital, or that you’re denying them lifesaving ivermectin. They didn’t want their family member admitted because we wouldn’t give ivermectin. You’re just absolutely amazed, because it is a complete unraveling of the credibility of a system we’re used to being seen as credible. Generally, as physicians, when people come to see us, they are open to what we’re going to say; that’s why they’re seeking us out. They want our expert information and opinion. So physicians end up in a place where they think the problem is that there’s just not enough information, that patients just don’t have the right information, and if they had the right information, that would be the thing that would save them. But that’s not how this works.
Communication of science is itself a science and has been studied. The idea that people just lack information is called the information deficit model, and it’s wrong. People have access to endless amounts of information; we are all a keystroke away from whatever anybody wants to find out, true or false. Information is not the problem. The problem is that when people are in ideological silos and need to protect their identity with a set of beliefs, the beliefs that join us together aren’t so much the truths as the lies that we share. If you have to counter that ideology, you basically have to counter who they are as a person. They don’t want to give that up. They will believe anything other than something that creates a conflict with who they are, their community, or where they perceive their place in society.
The issue isn’t information. You can’t inform somebody out of this, and that’s a common misconception. This has been studied. The way you get people to come around after they’ve fallen for misinformation, if they’re still open, isn’t by listing facts at them, it isn’t by calling them stupid, and it isn’t by suggesting they’ve been duped, even if they have been. Nobody wants to hear that. It doesn’t work. People who have been scammed do not want to accept that they’ve been scammed and will fight it tooth and nail, because it makes them feel foolish, and they don’t want to feel foolish. What you have to do instead is emphasize relationships and try to rebuild trust. If you are a family doctor and you’ve been seeing this patient for years, what you say to them is, “You know me, I know you. I don’t think you were wrong to have believed this thing. But here’s the problem.”
One thing that is useful is to lay out the tactics of those who want you to believe this thing and point out their interest in your believing it. Express care for them: you’re not trying to make them change who they are or believe different things; this is the thing we have the best evidence for, and I want to protect you, protect your family, and prevent you from spreading a disease. Appeal to them as people, because we know the issue isn’t information, it’s ideology. It’s modes of thought, it’s the communities people become invested in, and you have to pull people into your community. You have to pull them toward you while you gently push away the disinformation and say, “Well, they may have a bit of a vested interest in pushing this lie.” With ivermectin, for instance, it was really interesting when it started rearing up, because I went on Twitter and described the exact path ivermectin was going to take: how, as the evidence fell apart, they would resist changing their minds. They were going to say, “Oh, well, you gave the wrong dose. Oh, well, you didn’t give it at the right time.”
The reason I knew they were going to do this is that they did all the same things for vitamin C. It’s even the same people: Paul Marik and Pierre Kory. You know how they’re going to behave, you know how they’re going to move the goalposts, and you just have to explain it: these doctors are riding a little high on ego, they’re ahead of the evidence, they’re pushing something we don’t have good data for. We have very good data for this. I would rather you didn’t pursue this, because I care about you; I want you and your family to be safe. You have to reach out to people as people and not just batter them with data, or call them stupid, or say that they’ve been duped.
Jill: Yeah, that is it. Having studied the psychology of challenging communication, you just hit it beautifully. It isn’t debate time; it’s relationship time. And that can be hard for doctors, who are used to looking at science. I know this about you, Mark: you are a scientist, and you’re okay with being wrong. But for a lot of other people in the world, when you say they’re wrong about a specific data point they’ve been taken in on by an “expert,” it challenges their identity, it challenges their sense of safety: “I belong to this group of people who think this. And if you, Dr. Mark, with me in your office, are trying to tell me I can’t believe that, you’re telling me I can’t belong to the tribe of humans I feel safest and most connected to.” I really like the way you laid that out. Am I understanding you correctly?
Dr. Mark: Yeah. Real experts are wrong and will admit that they’re wrong. Actually, one of the ways you can identify an expert is that you can search them admitting to being wrong at some point in time. Because people who are never wrong aren’t experts, they’re jerks. They’re egoists. We’re all wrong sometimes; we all get things wrong. With COVID, there were a ton of unknowns, and experts were guided by the science of previous pandemics and what they knew about biology, but that didn’t necessarily mean they were going to get everything right all the time. There were very early, obvious blunders. That’s okay. They were working with incomplete information, and it’s a sign of an expert that they take the new information, incorporate it into a new framework, and move ahead. You don’t just plant your feet and say, “No, I was still right.”
Speaking of the fake experts, this is an example of why they’re not real experts. Taking individuals: Ioannidis early on predicted that this was just going to kill a few thousand people. Now we’re at a million, and he’s still backpedaling, trying to defend that initial claim. You just have to accept you’re wrong; you got this one wrong. Marty Makary, who’s often on Fox News, has predicted a number of bizarre things since the beginning: that we’d have herd immunity by last April, that we’d have herd immunity if only 20% of people got vaccinated. We were having trouble clearing a 65% vaccination rate, which is about where we are right now, and Omicron came through in this last wave and infected 50% of Americans; we’re nowhere near herd immunity. People have been predicting it over and over again, and been wrong, and wrong, and wrong, and continuing to make the same prediction or refusing to acknowledge past errors. That’s one of the sure signs that you are not a serious expert.
The key to critical thinking isn’t having a ton of information, knowing logic really well, and having tons of data, because you can construct things to fool yourself, which brings up something Richard Feynman said: the first principle is that you must not fool yourself, and you are the easiest person to fool. You will be fooled by people in your life; it will happen. It happens to us all. The way people fool you is they manipulate your emotions. They make you angry, they flatter you, they give you the piece of information plus something that pulls you along. That’s what you have to look out for. You have to engage your logical gears a little more strongly when somebody is making you outraged, because when you get emotional, that’s when you are thinking less clearly.
The people who are trying to give you good information are the ones who are trying to keep you calm, who just lay it out, who aren’t trying to take anything from you or ask for money. This is just the way our brains work: we like being flattered, and we become irrational when we get angry. They know that if they can make you angry, they can make you do anything. That’s my quick critical thinking how-to. The first thing you have to watch for is your own emotions, your own interaction with the information, and why you want to believe it. Should you? If you don’t interrogate your own feelings about information, you are going to get taken.
Jill: Beautiful. And I think you can come to the patient communication and the patient relationship with that, after processing your own stuff as a physician, as you said, so that you come in not robotic, but emotionally present and caring. You can be the compassionate witness to what might be motivating them to believe something that makes you want to pull your hair out because it feels so crazy. If you can come at it from that perspective, slow yourself down, and think creatively about how to communicate, you can convey, again, that they’re not stupid, that they’re not idiots for believing it, rather than going into A+ smartest-person-in-the-room mode and telling them why they’re wrong by debating them. You can think more relationally about how to show that you understand why the misinformation is believable, because the other side’s motivation is to make people believe it, and then come back from there.
Another thing I find when I’m coaching physicians is inviting them to think of stories, because when people are under threat, they can’t take in data. But we as humans are really good at listening to and believing stories about other humans. So, when you can do that, it can be an effective way to communicate as well.
Dr. Mark: Yeah, you have to figure out people’s motivations for believing things, because this is literally called motivated reasoning: they are reasoning in a way that is motivated by something other than the facts and the logic. Another very helpful technique to help deprogram people is to use the Socratic technique. Don’t necessarily hit them with your side immediately; instead ask, “So, why do you think this? Where did you hear this? Why do you think they want people to believe this? What do you think this means for you?” Just get them talking about it. Once people are forced to go through the logic of how they came to a decision, they themselves will often see the holes in it, and then you can help them fill those gaps: “So, you see, this is how you got there, and that’s not wrong; there’s a logic to it. But we recommend something else, for these reasons. This is how we got there.” It’s less confrontational, it’s listening, and sometimes it’s really hard, because often you’re listening to something that’s objectively bonkers.
Jill: It’s wackadoodle.
Dr. Mark: I’ve literally had families say they didn’t want a loved one to come into the hospital because we wouldn’t give them ivermectin. Wow. It’s a very difficult place to start from. You just have to slow down, not get angry, talk them through it, discuss things. It’s one of the most difficult things, but hopefully we’re trained to be good listeners by now.
Jill: Finally, Mark, you developed these ideas around the five tactics back in 2007, when you were in grad school and noticing patterns in science denialism, fake information, and people’s propensity for taking it in. Now, practicing medicine in 2022, having been on the front lines of the ICU as a COVID doctor in addition to your work as a surgeon, how have you been able to put what you know about this into practice in a way that has helped you feel good about practicing medicine and helped you connect with patients?
Dr. Mark: Well, one of the advantages I have is that often there isn’t any debate with me about what people need. People show up shot, they want care immediately, and often there isn’t even time to discuss things. I’m in a bit of a rarefied position, where a lot of the time there’s no need for a long or protracted debate. But when I’m interacting with folks more one on one and they have sets of beliefs that are incongruent with the evidence, I think it’s helpful knowing these strategies, because we run into conflicts all the time when patients believe things we would prefer they didn’t, or things we perceive to be harmful, and you have to walk this fine line between paternalism and respecting their choices about something that is objectively, factually incorrect.
Thinking about these things all the time, and seeing how people compartmentalize things, helps me understand how people come to decisions and how to give them guidance, whether it’s something as simple as a consent conversation or discussing why a treatment is going right or wrong. Understanding how people interact with information, and what they’re going to take away from what you say, is really useful, and it keeps you a little bit armored, because you know you’re going to say certain things to a patient, they are going to take something away, and it won’t necessarily be what you intended. That happens over and over again. Every physician has had the experience where somebody in the hospital says, “You said you were going to do surgery on me.” “Well, did I say that?” Or, “The doctor said I had an 80% chance of dying.”
Actually, we never say things like that. But this is the way people hear things, the way people process information. If you study critical thinking, and the heuristics people use to evaluate information, and see how they fall into these traps, you can help guide them around them. That would be, I’d say, the most useful thing. Other than that, all it does is provide me a Twitter following and one citation in Nature. The way I wasted my time in grad school got me a citation in Nature; none of my science is ever probably going to get there. [laughs]
Jill: I love that. Thank you so much for this conversation. I think it’s really rich. The disheartening side, and I know it’s exhausting, is that physicians who have been used to having credibility are feeling some of it challenged unfairly and unnecessarily. The positive aspect, where there’s potential for growth, is that as physicians become willing to rethink how they communicate, how we support patients and people’s ability to live healthy, long lives, and get creative about new ways to communicate meaningfully even in challenging situations, it serves everybody better in the long run. You’ve really helped us do that in this conversation today. Thank you.
Dr. Mark: Thank you for having me.
Jill: I want you all to go over to Twitter and check out Mark Hoofnagle, H-O-O-F-N-A-G-L-E. There are a lot of good, compelling conversations there. If you liked this topic, you’ll see a lot more there, and I want you to go check it out. In the meantime, thanks, Dr. Hoofnagle, for being with us and providing such rich conversation, and thank you all for being part of it. Share it with friends, and make sure you go to docworking.com today. Check out DocWorking THRIVE and see how we can help with your stress, your work performance, and how happy you are in your life. Until next time, I’m Jill Farmer.
Amanda: I’m Amanda Taran, producer of DocWorking: The Whole Physician Podcast. Thank you so much for listening. Please don’t forget to like and subscribe, and head over to docworking.com to see all we have to offer.
Coach Jill Farmer
Jill Farmer is an experienced physician coach who has been helping doctors live their best lives, increase their success, and move through burnout for well over a decade.
She has delivered keynotes, programs, and training everywhere from Harvard Medical School to the American College of Cardiology.
She has personally coached hundreds of physicians, surgeons, and other busy professionals to help them be at their best—without burning themselves out. Her coaching has supported professionals at places like Mass General Brigham in Boston, Washington University in St. Louis, Northwestern University in Chicago and too many others to list.
Jill wrote the book on time management for busy people. Literally. It’s called “There’s Not Enough Time…and Other Lies We Tell Ourselves” which debuted as a bestseller on Amazon. Her work has been featured everywhere from Inc. to Fitness Magazine to The Washington Post.
Nationally recognized as a “brilliant time optimizer and life maximizer,” Jill will cut straight to the heart of your stress to liberate you from its shackles. She has two young adult daughters. She lives with her husband and their poorly behaved dachshund in St. Louis, MO.