Last month, amid the social media frenzy of Bird Box challenges, which saw viewers of the Netflix film attempt everyday tasks blindfolded, YouTube announced that it had changed its community guidelines to ban content that encouraged people to do dangerous, even life-threatening, things.
Yet despite this crackdown on pranks, there is an area of harm it is seemingly taking very little action on: cancer quackery. A search for such treatments on YouTube unveils a Wild West of ineffectual recommendations and “cures” promoted on channels with hundreds of thousands of subscribers, some of which have even been given verified status by the platform.
A robotic voice on the verified “Home Remedies” channel tells the story of Ann, who claims drinking two-and-a-half kilos of carrot juice a day cured her stage four cancer without chemo or radiotherapy. “There’s no harm in postponing chemo or radiation and trying the carrot treatment,” the video insists. In reality, perhaps the most likely effect of doing so is discolouration of the skin from all of the carotene.
A second video, which has over 500,000 views on the verified “Home Cooking and Home Remedies” channel, suggests baking soda and caesium chloride can successfully be used to treat tumours, despite little evidence to support either. It claims the latter, a non-radioactive salt, has “better results” than both chemotherapy and radiotherapy, adding: “This is an option doctors don’t often give their patients.”
Yet this is for good reason: the substance has been linked to patient deaths. One, a 61-year-old woman from Colorado, had been consuming it for about a year when she directly injected it into her breast to treat a tumour, causing cardiac arrest. She died a week later.
Videos interviewing those who professionally promote cancer myths are no less extreme. The “iHealthTube” channel has an interview with an alternative medical practitioner who claims all cancers can be cured in weeks – some, he says, in just minutes. It’s this particular video that David Colquhoun, an emeritus professor of pharmacology at University College London and an outspoken opponent of quackery, describes as “the most grotesque of the lot”, having branded the genre as a whole “just ghastly”.
The video’s interviewee, Leonard Coldwell, trots out common falsehoods – healing body pH levels, raw food diets, oxygenating blood – alongside a brazen insistence that table salt is often one-third glass, which scratches the arteries. The video has over seven million views on a channel given a veneer of credibility by its verified YouTube status. Unsurprisingly, Colquhoun has strong feelings about such unscientific nonsense: “All of these videos would be illegal in the UK under the Cancer Act 1939.” The legislation bans advertising cancer treatments to the general public, including on UK-accessible websites.
As a get-out clause, videos often carry a small disclaimer insisting they do not offer professional guidance, or that cures are merely options to be discussed. Paradoxically, vlogs loudly dispense content that appears to be medical advice while quietly insisting it is not medical advice, vouching for “treatments” while disavowing responsibility for potential side effects.
For desperate viewers, vlogs spreading cancer misinformation can seem more authoritative than medical professionals. “People who are affected by cancer… can be quite upset; they’re searching for answers,” says Martin Ledwick, head information nurse at Cancer Research UK. “They’re quite vulnerable.”
An ecosystem of videos pushing conspiracy theories also cultivates distrust among viewers about conventional therapies. For viewers who have watched several such videos, YouTube’s algorithm will recommend yet more. Soon, viewers are sucked into a wormhole of videos painting a starkly bleak picture: patients as unwitting cogs in the machine of profit-obsessed Big Pharma. (This may begin to change. YouTube recently vowed to recommend fewer conspiracy videos, including videos promoting a “phony miracle cure for a serious illness”.)
In contrast, “natural healing” vlogs trade on the reassuring appeal of the familiar. Wonder cures are often ingredients found in kitchen cupboards: lemons, baking soda, honey. The vloggers who personally hawk such treatments tend to be fit, with pearly white teeth and glowing complexions. “You look like a really healthy guy,” reads a top comment on one such video. “[M]akes it easier to believe your videos and what your [sic] saying.”
It’s hard to know how many people these videos have persuaded to take up ineffectual treatments: only a fraction of calls to Cancer Research UK’s helplines are to query the benefits of an alternative wonder cure. Those, though, are just the people who take the trouble to seek a second opinion. “What worries me is the people who aren’t ringing us,” says Ledwick.
One quality that typifies medical professionals who work closely with cancer is honesty, he adds. If a tumour cannot be treated by conventional means, Cancer Research UK employees are upfront about it and the options available. The charity maintains a webpage of alternative and complementary therapies, detailing where possible what solid evidence underlies them. It also publishes its own YouTube videos debunking common cancer falsehoods. Ultimately, Ledwick says, if an alternative offering “sounds too good to be true, it probably is.”
It’s also tricky to tell whether alternative cancer treatment vloggers personally believe in the methods they endorse, or whether it’s just a means to make money. Watching vloggers promoting alternative cancer therapies, you are struck by the earnest tone of their on-camera appeals. It is as if they feel they have escaped the medical Matrix, and the rest of us are still stuck inside the simulation.
Yet even the least cynical person would note the links to online shops selling products on the back of cancer myths. (The products offered are not necessarily to treat cancer itself.) “In some cases, people are making profit out of cancer patients, and that’s just wrong,” Ledwick says.
One wellness-vlogger with close to 150,000 subscribers tells viewers with terminal stage four cancer she believes they can reverse it “in a very short time” with green juicing protocols. Her sermons, delivered to the camera from the comfort of her kitchen and living room, are featured prominently on her personal website, where she sells self-penned smoothie recipes, herbal supplements and a $395 “complete healing package” for numerous ailments. A message I sent her, hoping to ask about the product range, bounced back.
Another YouTube personality is even more uncompromising: “What I’m about to tell you is not my opinion, or theory,” she says, in a slickly produced video entitled “Natural Cancer Cures and Why You Don’t Know Them”. “It is fact.” Natural cures for cancer are being suppressed, she tells viewers in a maternal manner, claiming chemo and radiation kill two-thirds of those who try them. The vlog has over 600,000 views. A linked shop for alternative health products sells healing sound CDs, recipe books and a pair of glasses with slices of cucumbers in the frames.
Cancer misinformation can launch careers and make fortunes. Belle Gibson, the disgraced Australian wellness blogger, claimed to have managed various cancers through alternative therapies, earning her international acclaim and a recipe book promoted by Apple. The money reportedly paid for a lavish lifestyle. She later admitted in Australian Women’s Weekly magazine that she never had cancer, and has been fined A$220,000 by an Australian state court for falsehoods about her charity donations.
This isn’t to suggest cancer conspiracy theorists on YouTube are conscious scammers. But a space where wild cancer claims are given ticks of approval is a space where a new Belle Gibson could thrive. The next cancer charlatan scandal may well start with wellness vlogging, not blogging.
I send three of the verified channels’ videos to YouTube, asking whether they break its community guidelines. It replies that the videos “are not in violation of our policies as the content outlines how these treatments can cure based on medical research or feature personal stories of cancer survivors”.
This is despite the videos’ health claims having, in the words of Professor Edzard Ernst, one of the world’s foremost experts in complementary medicine, “no validity whatsoever”. Patients taking their advice, he warns, “might find themselves paying with their lives”.
“YouTube’s Community Guidelines prohibit content that’s intended to encourage dangerous activities that have an inherent risk of physical harm,” a YouTube spokesperson says. “When flagged videos are in violation of our policies, we work quickly to take action. In some cases we age restrict flagged material that, while not in violation of those guidelines, contains images that may be unsuitable for younger users.”
It is true that some vlogs peddling myths are told through the stories of ostensible cancer survivors – although it can be impossible to know whether the person whose story is being shared is actually real. But there is a balance to be struck between the right to share personal experiences and the responsibility not to seriously mislead.
Under current rules, telling vulnerable patients to embrace treatments without proper evidence and with potentially fatal consequences – often with products to sell looming in the background – doesn’t seem to break the site’s guidelines. Many would conclude from this that those guidelines need changing.