{"title":"Researchers confront a rising tide of cancer misinformation","authors":"Bryn Nelson PhD, William Faquin MD, PhD","doi":"10.1002/cncy.22909","DOIUrl":null,"url":null,"abstract":"<p>Some online articles have suggested, without evidence, that high-dose infusions of vitamin C can cure cancer. Others have promised, falsely, that baking soda can cure prostate cancer or that cannabis oil can cure breast or lung cancer. Well before ivermectin was infamously touted as an (ineffective) intervention for the coronavirus disease 2019, a podcast wrongly asserted that the antiparasitic medication offered a cancer cure.</p><p>Experts have long warned of the noxious effects of online misinformation aimed at swaying elections and public opinion. The swirl of misinformation around cancer treatment and prevention may be less well studied, but researchers have begun raising alarms about the considerable harm that can come from advice that is, in some cases, literally toxic.</p><p>Skyler Johnson, MD, an assistant professor of radiation oncology at the University of Utah’s Huntsman Cancer Institute in Salt Lake City, experienced the phenomenon firsthand when his wife was diagnosed with cancer in 2011 while he was still in medical school. The couple encountered so many fact-free claims and false assertions online that Dr Johnson decided to study the effects of this flood of bad advice. Even after his wife was declared cancer-free, he realized that such misinformation, even from well-meaning friends and relatives, can lead to serious and avoidable harm.</p><p>Most disturbingly, he discovered, it can kill. In a highly cited study, Dr Johnson and his colleagues found that patients who relied entirely on unproven alternative cancer therapies were significantly more likely to die within 5 years than patients who used conventional treatments, such as chemotherapy, radiation, immunotherapy, and surgery.<span><sup>1</sup></span> For a subset of patients with breast or colorectal cancer who used alternative medicine, the mortality risk jumped roughly 5-fold. No matter how advanced cancer treatments might be, he says, “if patients aren’t willing to take those treatments, then we’ve done no good.”</p><p>When she read Dr Johnson’s study, Briony Swire-Thompson, PhD, director of the Psychology of Misinformation Lab in the Network Science Institute at Northeastern University in Boston, Massachusetts, had an epiphany. The cognitive psychologist had previously studied general and political misinformation, but she immediately understood the unique challenge posed by cancer misinformation. “That was, I think, an aha moment where I realized this is a topic where belief really has impact in people’s lives,” she says.</p><p>Dr Swire-Thompson characterizes <i>misinformation</i> as an umbrella term for all false information and <i>disinformation</i> as a subset of false information that is spread deliberately. The high anxiety accompanying a cancer diagnosis, coupled with cognitive fatigue and the fear of side effects from chemotherapy, radiation, or surgery, she notes, can make a patient more susceptible to a huckster trying to capitalize financially. “People are willing to spend a lot of money, understandably,” she says.</p><p>Dr Johnson has devised a useful acronym to remember the main attributes of this harmful misinformation: CRAP (conspiracy, request, anecdotes, and publishers). 
The sources often invoke conspiracy theories or suspect claims, they frequently request money for a product or information, they use anecdotes instead of data to support their claims, and the publishers’ questionable websites often end in <i>.com</i> instead of <i>.org</i> or <i>.edu</i>. One form of the misinformation exploits fear and anxiety to push people away from conventional treatments. “They will refer to chemotherapy and surgery and radiation as ‘poison’ and ‘cut’ and ‘burn,’ and they’ll use those terms to tell people to avoid those treatments,” he says. Another form pushes people toward alternatives. Emphasizing an “outside of the box” or iconoclastic approach, Dr Johnson says, can make patients feel as if they have insider knowledge of a unique, emerging solution.</p><p>Even worse, people who seek out cancer information online can trigger algorithms that suggest misinformation-filled ads and articles. Dr Johnson recalls that after a close associate was diagnosed with breast cancer and posted about it on social media, she was befriended on Facebook by seemingly sympathetic individuals. In reality, they were pushing fake treatments.</p><p>Carma Bylund, PhD, a professor of health outcomes and biomedical informatics at the University of Florida in Gainesville, suggests that such exposure, in one form or another, is incredibly common. A new analysis by her research group estimates that 93% of all patients with cancer have been exposed to misinformation.<span><sup>2</sup></span> “It’s so pervasive now,” she says. “What we need to figure out is not just that they were exposed to it, but how much do they believe it?”</p><p>Dr Bylund is collaborating with Dr Johnson on how oncologists can best respond to this cancer misinformation. As they have discovered, there is no shortage of it online. With Dr Swire-Thompson, they examined the 50 most popular articles about each of the four most common cancers. Their analysis found that nearly one-third of the 200 articles shared on Facebook, Reddit, X (then known as Twitter), and Pinterest contained misinformation.<span><sup>3</sup></span> Nearly as many included information that was actually harmful. The articles containing misinformation, in turn, received significantly more engagement from online readers than factual ones.</p><p>More widespread regulations may be difficult, especially because some claims may be considered protected speech or worded vaguely enough to evade US Food and Drug Administration bans against false advertising. Calling out some bad actors, Dr Johnson warns, also can give them undue attention in online forums. “One thing I do worry about is this kind of ‘Barbra Streisand phenomenon,’ where if you try to censor, especially a lot of these people who spread medical misinformation, that actually earns them a lot more clout amongst these communities,” he says.</p><p>A big focus in the field, then, has been how to prevent or mitigate the harm through better physician–patient communication. In one new study, for instance, Dr Swire-Thompson and collaborators focused on how health care providers can discredit a source of disinformation. 
Although physicians have pursued some good strategies, the study found that consistently highlighting a source’s low expertise was among the most effective ways to reduce that source’s future credibility.<span><sup>4</sup></span></p><p>In collaboration with Dr Johnson, Dr Bylund has created a stoplight model of physician–patient communication to further explore best practices around handling online misinformation.<span><sup>5</sup></span> A green-light response validates a patient’s information seeking. A subtype dubbed green arrow validates the information gathering while directing patients to credible websites. A yellow light cautions patients to proceed carefully when seeking out the information. A red light, by contrast, nixes discussion around misinformation and discourages patients from seeking online information at all.</p><p>Unsurprisingly, Dr Bylund says, the vast majority of patients and caregivers urged against going online by a red-light warning still do: They just keep quiet about it, shutting down further communication. “The message that I’m always trying to get out is that this strategy doesn’t work,” Dr Bylund says. As for what does, she hopes more research can provide a better answer. “If an oncologist said to me, ‘I don’t know what to do when patients have read misinformation,’ I want to be able to answer that with an evidence base,” Dr Bylund says.</p><p>Despite the unknowns, Dr Swire-Thompson and Dr Johnson argue that cancer misinformation is a good model for studying the wider phenomenon.<span><sup>6</sup></span> Researchers are debating how much belief affects behavior, and Dr Swire-Thompson points out that cancer can offer objective measures of behavior change, such as treatment adherence and mortality rates. Consistent follow-up will be required, but she sees enormous potential for improving the outcomes of patients exposed to dangerous misinformation. “I’m really lucky I stumbled across it, because I feel like this impacts so many individuals,” Dr Swire-Thompson says. “We need all hands on deck here.”</p>","PeriodicalId":9410,"journal":{"name":"Cancer Cytopathology","volume":"132 10","pages":"603-604"},"PeriodicalIF":2.6000,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/cncy.22909","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cancer Cytopathology","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/cncy.22909","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ONCOLOGY","Score":null,"Total":0}
Abstract
Some online articles have suggested, without evidence, that high-dose infusions of vitamin C can cure cancer. Others have promised, falsely, that baking soda can cure prostate cancer or that cannabis oil can cure breast or lung cancer. Well before ivermectin was infamously touted as an (ineffective) intervention for coronavirus disease 2019, a podcast wrongly asserted that the antiparasitic medication offered a cancer cure.
Experts have long warned of the noxious effects of online misinformation aimed at swaying elections and public opinion. The swirl of misinformation around cancer treatment and prevention may be less well studied, but researchers have begun raising alarms about the considerable harm that can come from advice that is, in some cases, literally toxic.
Skyler Johnson, MD, an assistant professor of radiation oncology at the University of Utah’s Huntsman Cancer Institute in Salt Lake City, experienced the phenomenon firsthand when his wife was diagnosed with cancer in 2011 while he was still in medical school. The couple encountered so many fact-free claims and false assertions online that Dr Johnson decided to study the effects of this flood of bad advice. Even after his wife was declared cancer-free, he realized that such misinformation, even from well-meaning friends and relatives, can lead to serious and avoidable harm.
Most disturbingly, he discovered, it can kill. In a highly cited study, Dr Johnson and his colleagues found that patients who relied entirely on unproven alternative cancer therapies were significantly more likely to die within 5 years than patients who used conventional treatments, such as chemotherapy, radiation, immunotherapy, and surgery.1 For a subset of patients with breast or colorectal cancer who used alternative medicine, the mortality risk jumped roughly 5-fold. No matter how advanced cancer treatments might be, he says, “if patients aren’t willing to take those treatments, then we’ve done no good.”
When she read Dr Johnson’s study, Briony Swire-Thompson, PhD, director of the Psychology of Misinformation Lab in the Network Science Institute at Northeastern University in Boston, Massachusetts, had an epiphany. The cognitive psychologist had previously studied general and political misinformation, but she immediately understood the unique challenge posed by cancer misinformation. “That was, I think, an aha moment where I realized this is a topic where belief really has impact in people’s lives,” she says.
Dr Swire-Thompson characterizes misinformation as an umbrella term for all false information and disinformation as a subset of false information that is spread deliberately. The high anxiety accompanying a cancer diagnosis, coupled with cognitive fatigue and the fear of side effects from chemotherapy, radiation, or surgery, she notes, can make a patient more susceptible to a huckster trying to capitalize financially. “People are willing to spend a lot of money, understandably,” she says.
Dr Johnson has devised a useful acronym to remember the main attributes of this harmful misinformation: CRAP (conspiracy, request, anecdotes, and publishers). The sources often invoke conspiracy theories or suspect claims, they frequently request money for a product or information, they use anecdotes instead of data to support their claims, and the publishers’ questionable websites often end in .com instead of .org or .edu. One form of the misinformation exploits fear and anxiety to push people away from conventional treatments. “They will refer to chemotherapy and surgery and radiation as ‘poison’ and ‘cut’ and ‘burn,’ and they’ll use those terms to tell people to avoid those treatments,” he says. Another form pushes people toward alternatives. Emphasizing an “outside of the box” or iconoclastic approach, Dr Johnson says, can make patients feel as if they have insider knowledge of a unique, emerging solution.
Even worse, people who seek out cancer information online can trigger algorithms that suggest misinformation-filled ads and articles. Dr Johnson recalls that after a close associate was diagnosed with breast cancer and posted about it on social media, she was befriended on Facebook by seemingly sympathetic individuals. In reality, they were pushing fake treatments.
Carma Bylund, PhD, a professor of health outcomes and biomedical informatics at the University of Florida in Gainesville, suggests that such exposure, in one form or another, is incredibly common. A new analysis by her research group estimates that 93% of all patients with cancer have been exposed to misinformation.2 “It’s so pervasive now,” she says. “What we need to figure out is not just that they were exposed to it, but how much do they believe it?”
Dr Bylund is collaborating with Dr Johnson on how oncologists can best respond to this cancer misinformation. As they have discovered, there is no shortage of it online. With Dr Swire-Thompson, they examined the 50 most popular articles about each of the four most common cancers. Their analysis found that nearly one-third of the 200 articles shared on Facebook, Reddit, X (then known as Twitter), and Pinterest contained misinformation.3 Nearly as many included information that was actually harmful. The articles containing misinformation, in turn, received significantly more engagement from online readers than factual ones.
More widespread regulations may be difficult, especially because some claims may be considered protected speech or worded vaguely enough to evade US Food and Drug Administration bans against false advertising. Calling out some bad actors, Dr Johnson warns, also can give them undue attention in online forums. “One thing I do worry about is this kind of ‘Barbra Streisand phenomenon,’ where if you try to censor, especially a lot of these people who spread medical misinformation, that actually earns them a lot more clout amongst these communities,” he says.
A big focus in the field, then, has been how to prevent or mitigate the harm through better physician–patient communication. In one new study, for instance, Dr Swire-Thompson and collaborators focused on how health care providers can discredit a source of disinformation. Although physicians have pursued some good strategies, the study found that consistently highlighting a source’s low expertise was among the most effective ways to reduce that source’s future credibility.4
In collaboration with Dr Johnson, Dr Bylund has created a stoplight model of physician–patient communication to further explore best practices around handling online misinformation.5 A green-light response validates a patient’s information seeking. A subtype dubbed green arrow validates the information gathering while directing patients to credible websites. A yellow light cautions patients to proceed carefully when seeking out the information. A red light, by contrast, nixes discussion around misinformation and discourages patients from seeking online information at all.
Unsurprisingly, Dr Bylund says, the vast majority of patients and caregivers urged against going online by a red-light warning still do: They just keep quiet about it, shutting down further communication. “The message that I’m always trying to get out is that this strategy doesn’t work,” Dr Bylund says. As for what does, she hopes more research can provide a better answer. “If an oncologist said to me, ‘I don’t know what to do when patients have read misinformation,’ I want to be able to answer that with an evidence base,” Dr Bylund says.
Despite the unknowns, Dr Swire-Thompson and Dr Johnson argue that cancer misinformation is a good model for studying the wider phenomenon.6 Researchers are debating how much belief affects behavior, and Dr Swire-Thompson points out that cancer can offer objective measures of behavior change, such as treatment adherence and mortality rates. Consistent follow-up will be required, but she sees enormous potential for improving the outcomes of patients exposed to dangerous misinformation. “I’m really lucky I stumbled across it, because I feel like this impacts so many individuals,” Dr Swire-Thompson says. “We need all hands on deck here.”
Journal Introduction:
Cancer Cytopathology provides a unique forum for interaction and dissemination of original research and educational information relevant to the practice of cytopathology and its related oncologic disciplines. The journal strives to have a positive effect on cancer prevention, early detection, diagnosis, and cure by the publication of high-quality content. The mission of Cancer Cytopathology is to present and inform readers of new applications, technological advances, cutting-edge research, novel applications of molecular techniques, and relevant review articles related to cytopathology.