YouTube has started taking down videos that promote harmful or ineffective cures for cancer. The video streaming platform will also take action against videos that discourage people from seeking professional medical treatment, as it sets out its health policies going forward.
YouTube Videos Promoting Harmful or Ineffective Cures for Cancer
YouTube will now take down content that promotes “cancer treatments proven to be harmful or ineffective” or that “discourages viewers from seeking professional medical treatment,” the video streaming platform announced today. The enforcement comes as YouTube attempts to streamline its medical moderation guidelines based on what it has learned while tackling misinformation around topics such as Covid-19, vaccines, and reproductive health.
Going forward, Google’s video platform says it will apply its medical misinformation policies when there is a high public health risk, when publicly available guidance from health authorities exists, and when a topic is prone to misinformation. YouTube hopes this policy framework will be flexible enough to cover a broad range of medical topics while striking a balance between minimizing harm and allowing debate.
How YouTube Plans To Take Action Against Misinformation on Its Platform
In its blog post, YouTube stated that it would take action both against treatments that are actively harmful and against those that are unproven yet suggested in place of established alternatives. A video on the platform could not, for instance, encourage users to take vitamin C supplements as an alternative to radiation therapy.
YouTube’s Updated Policies
YouTube’s updated policies come a little over three years after it banded together with some of the world’s biggest tech platforms in a shared commitment to fighting Covid-19 misinformation. Although the video platform had previously taken action against vaccine misinformation, such as removing ads from anti-vax conspiracy videos, it strengthened its approach in light of the pandemic, removing videos containing Covid vaccine misinformation in October 2020 and then banning vaccine misinformation from the platform entirely in late 2021.
The platform has also taken action against other videos deemed harmful under its medical misinformation policy, including those that provide “instructions for unsafe abortion methods” or promote “false claims about abortion safety.”
How Tech Companies Are Fighting Against Internet Misinformation
While the major tech platforms stood united in early 2020, their exact approaches to Covid-19 misinformation have differed since that initial announcement. Most notably, Twitter stopped enforcing its Covid misinformation policy in late 2022 following its acquisition by Elon Musk. Meta has also recently softened its own moderation approach, rolling back its Covid misinformation rules in countries (such as the US) where the disease is no longer considered a national emergency.