Between 2013 and 2017, Australian wellness influencer Belle Gibson achieved Instagram fame by claiming she had cured her terminal brain cancer naturally. Gibson amassed a vast and lucrative following, encouraging her supporters to avoid chemotherapy and other medical treatments. But her story began to unravel in March 2015, when journalists from The Australian and Fairfax Media revealed she’d fabricated her diagnosis. By then, however, she’d already profited massively from her media empire—a wellness brand called “The Whole Pantry,” complete with a mobile app and cookbook—and many followers had embraced her (often dangerous) claims. In 2025, Netflix released a dramatized series about her life, Apple Cider Vinegar, reigniting discussions about medical misinformation online.
Unfortunately, Gibson’s brazen behavior is not unique. During a February 3 panel at Harvard’s T.H. Chan School of Public Health—held in collaboration with the Zhu Family Center for Global Cancer Prevention for World Cancer Week 2025—moderator Mallika Marshall, a medical reporter for WBZ-TV and CBS News Boston, noted, “A lot of us consult Dr. Google when we're looking for health information online.” Although credible information exists, “There can be information that is confusing, outdated, and potentially harmful,” she continued. People with a cancer diagnosis may be “particularly vulnerable to this type of information.”
“The thing that concerns me most about cancer misinformation,” said Skyler Johnson, assistant professor in the department of radiation oncology at Huntsman Cancer Institute and University of Utah School of Medicine, “is how it can appear reputable.” He acknowledged that “Many of us who have had experiences with family members who have had cancer have probably gone online themselves to look up information.” For those with a high level of medical literacy, “Dr. Google” may not be dangerous—but for readers who don’t understand the science, misrepresented facts can quickly have serious ramifications.
“It's not just things like ‘You should eat more pickles’ or ‘You should go to sleep at 8:00 at night,’” added Marshall. “These are really life-and-death decisions that people are making based on the information they’re seeing online and trusting.” Some misinformation discourages early screenings, potentially leading to late-stage diagnoses and delaying treatment until a point when options have become limited. Other myths suggest that certain lifestyle choices or unproven remedies can cure cancer, misleading patients into avoiding evidence-based treatments and reducing their chances of survival.
Social Media’s Role
Stacy Loeb, professor of urology and population health at New York University, has been studying the spread of misinformation across social media for the past several years. In many cases, she says, there’s “actually more engagement with the worst-quality cancer information.” Examining how two types of cancer—bladder and prostate malignancies—are presented on YouTube, her team discovered “a substantial proportion of misinformation for both” types. What’s more, these videos were seen by millions of viewers, producing “misinformation at a really wide reach.” The researchers also examined Instagram and TikTok content related to prostate cancer. Across the three platforms, “About 40 percent of the content had any kind of objective information, [with the rest containing] misinformation or some kind of inaccurate information.” On podcasts and Pinterest, they found that around 13 percent of content contained misinformation.
Johnson has recently conducted similar work examining the prevalence of misinformation on social media. His team has studied the most popular cancer articles on platforms like Facebook, Twitter, Pinterest, and Reddit. The findings, he says, indicate that around one in three articles contains misinformation—and, of those articles, the vast majority contain “harmful” misinformation: “Things telling people ‘Don’t get chemotherapy, don’t have a biopsy, don’t get screened, avoid treatments that have evidence to support their use.’” In other words, nearly one-third of the content was dangerous.
How commonly are people reading these types of articles? “We looked at the engagement [for these posts],” Johnson continued, “and these articles that were false and harmful received more engagement than articles that were true and safe”—mirroring Loeb’s findings. “The problem is widespread,” Loeb agreed, and getting worse. Once a post attracts some attention, there’s a snowball effect: algorithms prioritize engagement, allowing false, shocking, and sensational claims—such as the (incorrect) idea that “rectal ozone therapy can cure cancer,” as Johnson illustrated, or that prostate cancer is “caused by not ejaculating enough or ejaculating too much,” said Loeb—to gain traction. “Things that draw engagement…create a perverse incentive for people to share misinformation,” Johnson explained.
Financial gain, too, incentivizes the spread of medical misinformation. Individuals and businesses promoting alternative medicine may find a foothold online where they wouldn’t in everyday life.
Unintentionally Spreading Cancer Misinformation
Beyond algorithms amplifying misinformation (and financial actors profiting from online marketplaces), misinformation can also be spread by well-meaning individuals. “Some patients might believe cancer is the end for them, and instead of consulting reliable sources, they turn to family members or online communities for guidance,” said Milagros Abreu, president and founder of The Latino Health Insurance Program. When misinformation is passed on by loved ones, it can be even harder to refute.
Abreu notes that normative values—the data doctors use to determine what is “normal” for a population and to identify deviations—can be shaped by culture. These values are passed on through families, through the medical system, and, too often, online. “It’s important to understand the diversity of trust,” she explained, “and also to recognize the normative values that can help us improve communication with patients—so they don’t rely unnecessarily on alternative sources, even family members or friends.”
Patients are more likely to believe health information from sources they identify with culturally. For instance, “I come from an African American family,” Marshall said, “[And] I have many family members who are very critical of science and suspicious of things that are recommended to them.” This fear and distrust are exacerbated by the fact that “many outcomes [for Black populations] are not as good as [for] other racial groups.”
Loeb further shared research on how different demographics interact with cancer misinformation on social media. “In focus groups with Black patients on prostate cancer,” she said, “[Participants] described how they never saw anyone that looked like them in the online content.” As a result, the participants reported the belief that “Black males don’t get prostate cancer,” or that they “must be at a lower risk—because they never see any Black faces.” And these focus groups followed the same “script,” Loeb continued, whether the topic was getting screened for cancer or participating in clinical trials. In other words, because they didn’t see diversity reflected in online content about prostate cancer, participants held the incorrect medical belief that they could not develop the disease.
This underscores a broader point: representation can help build trust in medical facts and the medical establishment, especially for populations that experience worse medical outcomes—and seeing someone “like” you can reinforce the sense that you’re not alone with a diagnosis.
The Risks of Unproven and Disproven Treatments
Finally, the experts discussed the dangers of alternative treatments found online. Johnson shared the findings of a broad study of patients across the United States who were using unproven and disproven cancer treatments. His team compared those patients’ outcomes to those of individuals using physician-recommended treatments for curable cancers—and among individuals using unproven or disproven treatments, there was an “up to five-fold increased risk of death,” said Johnson, depending on cancer type. “And that’s concerning,” he continued, “because these are individuals who would have received an evidence-based approach that would have provided a better chance of a cure.”
He also emphasized the need to distinguish between complementary therapies and outright misinformation (read more in “How to Prevent Cancer through Nutrition”). Some nontraditional therapies for improving quality of life during cancer treatment—like meditation and exercise—are supported by strong evidence. But many so-called “natural cures” lack any scientific backing. “And when you use [these therapies] instead of proven cancer treatments, you put your health at risk,” Johnson stated. In some cases, he added, “If you use these [therapies] combined, it could potentially make cancer treatments more toxic or less effective, depending on the interactions they have.” The risks extend beyond direct harm. “Sometimes patients follow extreme diets, cutting out all sugar, and they lose so much weight that we have to interrupt their chemotherapy for nutrition intervention,” he noted. “They end up putting their cancer cure at risk.”
Cancer misinformation is not just an unfortunate byproduct of the digital age—it’s a pressing public-health problem, proliferating in the same online spaces that serve as lifelines for patients seeking answers. Combating it demands more than just fact-checking; it requires physician leadership, public education, and media literacy.