YouTube announced on Wednesday that it will add misinformation about coronavirus vaccines to its existing rules against lies, propaganda, and conspiracy theories about the coronavirus pandemic.
Per Reuters, the video giant will now prohibit content about coronavirus vaccines that “[contradicts] expert consensus from local health authorities or the [World Health Organization]”, such as bogus claims that the vaccine is a pretext to implant people with tracking chips, or that it will kill recipients and/or secretly sterilize them. The company also told Reuters that it would limit the distribution of content that comes close to breaking the rules, though it did not go into detail on how that would work.
Google’s rules already covered topics related to the treatment, prevention, diagnosis and transmission of the virus, although the previous rules only specifically mentioned vaccines in the context of false claims that one was “available or a guaranteed cure”.
“A COVID-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a COVID-19 vaccine,” YouTube spokesperson Farshad Shadloo told The Verge.
YouTube has historically struggled to contain misinformation about the pandemic, which racked up countless millions of views on the site during 2020.
A study published in September by the Oxford Internet Institute and the Reuters Institute, which partially covered the period from October 2019 to June 2020, found that coronavirus misinformation videos on YouTube were shared more than 20 million times on Facebook, Twitter, and Reddit. That outperformed the combined reach of CNN, ABC News, BBC, Fox News, and Al Jazeera on those sites over the same period (15 million). During this period, the researchers were able to identify only 8,105 videos removed from YouTube for containing “covid-related misinformation,” which is less than 1% of all coronavirus videos.
Interestingly, the researchers also found strong evidence that Facebook, rather than subscribers to the YouTube channels themselves, was the primary driver of traffic to coronavirus misinformation videos on YouTube. That may also help such content evade enforcement of YouTube’s community standards, which relies heavily on user reports. Facebook has implemented some spotty rules on anti-vax content in ads, but has no rules against organic, unpaid anti-vax posts. From the study:
Misinformation videos shared on Facebook racked up a total of around 11,000 reactions (likes, comments, or shares) before being deleted from YouTube. The Oxford researchers also found that of the 8,105 misinformation videos shared on Facebook between October 2019 and June 2020, only 55 carried warning labels from third-party fact-checkers, less than 1% of all misinformation videos. This fact-checking failure helped Covid-related misinformation videos spread on Facebook and find a large audience.
The Oxford researchers found that, despite YouTube’s investment in curbing the spread of misinformation, Covid-related videos containing false information took an average of 41 days to be removed, and were viewed an average of 150,000 times before being deleted from YouTube.
YouTube has also been a hub for anti-vax content in general. While research last year (before the pandemic) found the anti-vax movement there in retreat, it is far from being forced off the site. Unsurprisingly, a February study by the University of Pennsylvania’s Annenberg Public Policy Center found that people who relied on traditional media to learn about vaccines were less likely to believe anti-vax claims than those who relied on social media. A recent Pew poll found that 26% of US adults get news on YouTube, and that the content they consume is more likely to be loaded with conspiracy theories.
Producers and consumers of misinformation are adept at evading crackdowns. According to Wired, YouTube’s in-house teams tasked with finding and removing videos with false claims about the virus found that the recommendation system, which was successfully tweaked in 2019 to promote significantly less conspiracy content, was no longer the main driver of large amounts of traffic to misleading claims about the coronavirus. Instead, they saw a significant increase in videos that were uploaded and then quickly promoted off-site, on other sites like Facebook and Reddit, via a “mix of organic link sharing and astroturfed, bot-driven advertising.”
YouTube told The Telegraph in September that the Oxford and Reuters study relied on outdated data. A spokesperson told The Guardian on Wednesday that the company has removed more than 200,000 videos since early February. However, many of those could have been re-uploads, auto-generated videos, or content otherwise published in corners of the site where it had little chance of going viral.
Another recent study, by the Berkman Klein Center for Internet &amp; Society at Harvard, found that social media played a secondary role in the spread of conspiracy theories about mail-in voting. The main driver was false claims made by Donald Trump and Republican allies, which were then amplified by traditional media coverage. This echoes findings by the Oxford and Reuters researchers in April that prominent public figures accounted for only 20% of the false claims but generated 69% of social media engagement in a sample of 225 statements deemed false by fact-checkers.
Platforms like YouTube have had some success limiting the spread of certain misinformation efforts, such as a sequel to the infamous Plandemic video, the original of which was viewed more than 8 million times in May (though the sequel was announced in advance). In September, YouTube deleted clips of a Hoover Institution interview with White House coronavirus adviser Dr. Scott Atlas, who has cast doubt on the effectiveness of social distancing and mask-wearing and nudged the Trump administration toward a dangerous “herd immunity” strategy.
According to the Guardian, YouTube will announce further steps to limit the spread of vaccine misinformation on its site in the coming weeks.