
YouTube claims that its crackdown on borderline content actually works



After making changes to its recommendation algorithm to reduce the proliferation of "borderline" content – videos that sit right at the line between what is acceptable and what violates the YouTube Terms of Service – YouTube is reporting a 70 percent drop in watch time of these kinds of videos coming from non-subscribers.

More than 30 changes have been made to the way videos are recommended since January 2019, according to a new blog post from YouTube that outlines how the company wants to tackle borderline content. YouTube does not specify what has changed, nor does the blog post describe how many videos were recommended before and after the changes were implemented. Instead, the post describes how external evaluators work through a set of criteria to determine whether a flagged video is borderline. That information is then used to train the machine learning systems YouTube relies on to monitor the platform.

"Each rated video receives up to nine different opinions, and some critical areas require certified experts," the blog post says. "For example, physicians provide guidance on the validity of videos on specific medical treatments to limit the spread of medical misinformation. Based on the evaluators' consensus input, we use well-tested machine learning systems to create models.

Some of the criteria that moderators use were recently demonstrated in a 60 Minutes interview with YouTube CEO Susan Wojcicki. Wojcicki walked reporter Lesley Stahl through a few videos that could be considered borderline. One video, which Wojcicki described as violent, focused on Syrian prisoners but was allowed to stay up because it was uploaded by a group attempting to expose problems in the country. Another video used World War II footage; although many might consider it acceptable in a historical context, Wojcicki showed how it could be used by hateful groups to spread white supremacist rhetoric. That video was banned.

YouTube recently changed its hate speech policy to address issues such as white nationalism, which now violates the YouTube Terms of Service. People might assume that stating a supremacist phrase would be enough to trigger a ban. That is not necessarily the case. Wojcicki defended YouTube's practice of judging a video's content by its context, adding that a video simply saying that "whites are superior" would be acceptable if there was no other context.

"Nothing is more important to us than ensuring that we live up to our responsibilities," adds the blog post. "We continue to focus on maintaining that delicate balance that allows different voices to thrive on YouTube – even those that others disagree with – while protecting viewers, creators, and the entire ecosystem from harmful content."

Part of the way YouTube is addressing the issue is by raising up more authoritative sources for topics such as news, science, and historical events, where accuracy and authoritativeness are critical. It is trying to do this by tackling three different but related areas: surfacing more authoritative sources, such as The Guardian and NBC, in search results for news topics; providing more reliable information during breaking news events; and giving users additional context alongside videos.

That means when searching for topics such as "Brexit" or "vaccine protection," the top results should come from reliable, authoritative news sources – even if their engagement is lower than that of other videos on the same topic. By doing this during breaking news events, such as mass shootings or terrorist attacks, YouTube says it has seen a 60 percent increase in consumption from its authoritative news partners' channels. The problem is that neither this new blog post nor any other public interview Wojcicki and other executives have given makes clear what these numbers mean overall. A 70 percent drop in watch time of borderline content from channels users do not subscribe to is significant; it acknowledges the rabbit hole effect that journalists, academics, and former YouTube engineers have pointed to for years. But the question remains whether it amounts to a significant number of viewing hours. YouTube's blog post does not say.

"Content that is close to, but not in excess of, our Community Guidelines accounts for only a fraction of 1 percent of the content viewed on YouTube in the United States," the blog post said.

About 500 hours of content are uploaded to YouTube every minute. That's 720,000 hours of content per day, and it would take 30,000 days of nonstop viewing to watch everything uploaded in just a single day. That is a lot of video, much of it watched in the US. It's good that fewer people are watching borderline content, but until YouTube publishes concrete numbers, it's difficult to judge what that really means.
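Those figures follow directly from the upload rate cited above:

\[
500 \times 60 \times 24 = 720{,}000 \ \text{hours uploaded per day}, \qquad \frac{720{,}000 \ \text{hours}}{24 \ \text{hours of viewing per day}} = 30{,}000 \ \text{days}.
\]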

