30,000 YouTube videos removed for making false claims about COVID-19 vaccines

YouTube has removed 30,000 videos containing false information about COVID-19 vaccines over the last six months, YouTube spokesperson Elena Hernandez said. The removals followed updates to the platform's policies aimed at curbing vaccine misinformation.

YouTube added vaccine misinformation to its COVID-19 medical misinformation policy in October 2020. Since February 2020, the platform has taken down more than 800,000 videos containing coronavirus misinformation. Videos were removed when they contradicted vaccine information from the World Health Organization (WHO), after being flagged by either AI systems or human reviewers. Accounts that violate the policies are subject to a “strike” system, which can result in a permanent ban.

Other social platforms, including Facebook and Twitter, have also implemented policies to reduce the spread of misinformation. Twitter introduced a strike system for misleading tweets about COVID-19 vaccination; five or more strikes result in permanent suspension of the account.