YouTube will remove content that promotes "cancer treatments proven to be harmful or ineffective" or that "discourages viewers from seeking professional medical treatment," the video platform announced today, The Verge reported.
According to The Verge, the enforcement comes as YouTube is attempting to streamline its medical moderation guidelines based on what it's learned while tackling misinformation around topics like COVID-19, vaccines, and reproductive health.
YouTube published a post on the YouTube Official Blog titled "A long term vision for YouTube's medical misinformation policies," written by Dr. Garth Graham and Matt Halperin. Here is part of the post:
“In the years since we began our efforts to make YouTube a destination for high-quality health content, we’ve learned critical lessons about developing Community Guidelines in line with local and global health authority guidance on topics that pose serious real-world risks, such as misinformation on COVID-19, vaccines, reproductive health, harmful substances, and more. We’re taking what we’ve learned so far about the most effective ways to tackle medical misinformation to simplify our approach for creators, viewers, and partners…”
“…Moving forward, YouTube will streamline dozens of our existing medical misinformation guidelines to fall under three categories – Prevention, Treatment, and Denial. These policies will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO).”
Here’s what the framework will look like:
Prevention misinformation: We will remove content that contradicts health authority guidance on the prevention and transmission of specific health conditions, and on the safety and efficacy of approved vaccines. For example, this encompasses content that promotes a harmful substance for disease prevention.
Treatment misinformation: We will remove content that contradicts health authority guidance on treatments for specific health conditions, including promoting specific harmful substances or practices. Examples include content that encourages unproven remedies in place of seeking medical attention for specific conditions, like promoting cesium chloride as a treatment for cancer.
Denial misinformation: We will remove content that disputes the existence of specific health conditions. This covers content that denies people have died from COVID-19.
YouTube continued: "Starting today, and ramping up in the coming weeks, we will be removing content that promotes cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment. This includes content that promotes unproven treatments in place of approved care or as a guaranteed cure, and treatments that have been deemed harmful by health authorities. For instance, a video that claims 'garlic cures cancer,' or 'take vitamin C instead of radiation therapy' would be removed."
CNN reported that YouTube's Dr. Garth Graham said cancer treatment fits YouTube's updated medical misinformation framework because the disease poses a high public health risk, is a topic prone to frequent misinformation, and has a stable consensus on safe treatments from local and global health authorities.
YouTube says its restrictions on cancer treatment misinformation will go into effect today, and enforcement will ramp up in the coming weeks. The company has previously said it uses both human and automated moderation to review videos and their context.
In my opinion, it is good that YouTube wants to take down videos that spread misinformation about cancer treatments. People seeking information about cancer treatment on YouTube should not have to wade through videos that are clearly misinformation.