Scientists have been fighting "fake news" and misinformation for decades—and have valuable suggestions for the rest of us about how to handle such conversations, Julia Belluz reports for Vox.
Belluz spoke with Hilda Bastian, who first gained attention in Australia as an advocate for home births—then later reversed her position based on scientific evidence and now believes her crusade may have cost lives. Today, Bastian is a researcher at NIH, and her journey from medical skeptic to prominent researcher of scientific literacy and evidence-based medicine helps illustrate how to effectively convey scientific information to others.
Based on her conversation with Bastian and others, Belluz offers five key lessons for being an evidence-based advocate.
1. Be patient and explain why you hold your own views
It's not enough to just explain why a skeptic is wrong, Belluz writes. You also must carefully and humbly explain why you hold your position.
That's the type of experience that helped shift Bastian's position on home births. Her transformation began because of conversations with health care researchers who patiently and honestly walked her through the body of evidence related to home births—both its strengths and weaknesses. By doing so, Bastian explained, they established themselves as both "credible and trustworthy."
Because of that long-term relationship with researchers, Bastian eventually came out in support of home-birth regulations and guidelines. And that shift, Bastian said, would not have happened if the researchers she spoke with had not acknowledged that their arguments were not iron-clad.
The takeaway, according to Bastian, is that you can't fight bias with bias. You must be humble.
2. Use reliable and easy-to-access information
All research is not created equal, Belluz writes, and you need to have reliable, accessible information if you want to persuade someone else of your views. According to Belluz, "The evidence-based medicine movement, which started to catch on in the early 1990s, developed tools to do just that."
Belluz explains that in the early 1990s, "doctors were too often using single or cherry-picked studies, or what they learned in medical school or from their mentors, to inform their decisions about their patients' best care." In response, a group of doctors, researchers, and patients designed systems to get the best evidence into doctors' hands, such as by publishing so-called systematic reviews of literature relevant to specific medical issues. "The reviews used statistical methods to bring together and sort all the best science on specific medical questions, and presented that evidence in a coherent summary," Belluz writes, adding that the effort was "revolutionary" and caused major changes in treatment.
3. Engage with young people
However, Belluz writes, "Just making high-quality evidence more available doesn't always stop bogus claims from taking off." One key reason why, according to researchers, is that many people don't have time to learn new information related to scientific topics or may lack the critical thinking skills to evaluate evidence.
An exception is children, who often have more time to learn and are still receptive to learning how to think critically. Andy Oxman, a researcher based in Norway who has studied how to help people make informed health choices for more than 30 years, said his research suggests that age 10 is "the age to start."
John Ioannidis, a Stanford University professor, agreed with Oxman that it is important to build critical thinking skills in children so they can effectively evaluate claims throughout their lives. Trying to develop those skills later in life doesn't work as well—even for highly trained professionals like doctors—according to Ioannidis. "We need to start early on, to make people understand that basing decisions on fair tests, on science, on evidence is important," he said.
4. Evidence isn't enough if you don't understand your audience
Sometimes researchers don't fully understand the needs of the people they are trying to help, according to Leonard Syme, who Belluz says is "considered the father of social epidemiology."
Syme recounted his work on a 10-year, $555 million study of 350,000 people designed to reduce major risk factors for disease and death. "When the results came up with no change at all—nobody changed behavior!—that was really shattering for me," he said.
The experience prompted Syme to look at the broader factors that affect people's behaviors, such as air pollution and poverty. In those contexts, he explained, "If you ask the people there what problems were on their mind, I promise you smoking would not be on their list."
Syme eventually realized that he and other experts "needed to meet people where they are and better connect to their contexts," Belluz writes.
5. Hold people accountable for spreading misinformation
"Sometimes there are high-profile misinformation peddlers who need to be held accountable," Belluz writes.
As Ben Goldacre, a British author and physician, put it, "mocking people who misuse science is a really useful gimmick for communicating how science works." And Goldacre would know: He's spent years holding peddlers of misinformation accountable, Belluz writes.
That said, it can sometimes be more useful to go after "people who facilitate the cranks," Goldacre said, such as journalists, editors, and policymakers. "They are used to being able to hide in the shadows, anonymously, and if you can call them out by name, I think that changes their behavior quite well," he explained.
This type of work is hard and takes patience, Belluz writes, adding, "But there is hope. Just remember Hilda Bastian" (Belluz, Vox, 4/14).