
Mozilla wants your help to fix terrible YouTube recommendations



YouTube’s recommendation algorithm can lead you down some very strange rabbit holes, suggesting videos that feel weirdly personal and wildly off-target at the same time. Today Mozilla is introducing RegretsReporter, a new browser extension that aims to crowdsource research on users’ “unfortunate recommendations,” so that users can better understand how YouTube’s recommendation algorithm works and what patterns lie behind it.

Mozilla started collecting stories from users last year about the videos YouTube recommended to them. One user searched for videos about Vikings and got recommendations for white supremacist content. Another searched for “fail” videos and was recommended gruesome footage of fatal car accidents.

However, Ashley Boyd, Mozilla’s vice president of advocacy and engagement, said there hasn’t been a large, independent effort to track YouTube’s recommendation algorithm and understand how it determines which videos to recommend.

“So much attention is paid to Facebook – and rightly so – when it comes to misinformation,” Boyd said. “But there are other parts of the digital ecosystem that haven’t gotten the same scrutiny, and YouTube is one of them. We looked at what YouTube has said about how it curates content and found that it has responded to concerns about the algorithm and reported progress. But there was no way to verify those claims.”

A YouTube spokesperson said in a statement to The Verge that the company is always interested in research on its recommendation system. “However, it is difficult to draw broad conclusions from anecdotal examples, and we are constantly updating our recommendation systems to improve the user experience,” the spokesperson said, adding that YouTube has made “over 30 different changes over the past year to reduce recommendations for borderline content.”

Boyd points out that the Google-owned video platform has made multiple promises to tweak the algorithm, even though the company’s executives knew it was recommending videos containing hate speech and conspiracy theories.

The browser extension sends Mozilla data about how often you use YouTube, without collecting any information about what you search for or watch unless you specifically volunteer it. You can file a report through the extension to give more detailed information about any “unfortunate” video you come across in your recommendations. That report lets Mozilla collect information about the video you flagged and how you arrived at it.
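To make that description concrete, here is a minimal, hypothetical sketch in TypeScript of what a report payload from such an extension might look like. The field names, shape, and endpoint are illustrative assumptions, not Mozilla’s actual RegretsReporter schema.

```typescript
// Hypothetical sketch only: field names and the endpoint are illustrative
// assumptions, not Mozilla's actual RegretsReporter schema.
interface RegretReport {
  extensionUserId: string;       // random ID, never tied to a YouTube account
  reportedVideoId: string;       // the "unfortunate" video being flagged
  recommendationTrail: string[]; // how the user arrived at the video
  userComment?: string;          // optional free-text details from the user
  submittedAt: string;           // ISO 8601 timestamp of the report
}

// Submit a report to a placeholder collection endpoint.
async function submitReport(report: RegretReport): Promise<void> {
  await fetch("https://example.org/regrets", { // placeholder URL, not Mozilla's
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}
```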

Mozilla hopes the extension will make the “how” of YouTube’s recommendation algorithm more transparent: for example, which kinds of recommended videos lead to racist, violent, or conspiratorial content, and how often harmful content is recommended.

“I would love it if people were more interested in how AI, and in this case, recommendation systems, affect their lives,” Boyd said. “It doesn’t have to be mysterious, and we can say more clearly how you can control it.”

Boyd emphasized that user privacy is protected throughout the process. The data Mozilla collects from the extension is associated with a randomly generated user ID, not a user’s YouTube account, and only Mozilla has access to the raw data. No data is collected in private browser windows, and if Mozilla shares the results of its research, it will be done in a way that minimizes the risk of user identification, Boyd said.
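A minimal sketch of those two privacy measures, assuming a WebExtensions background script, might look like the following. The names and structure are illustrative, not Mozilla’s actual code.

```typescript
// A minimal sketch, assuming a WebExtensions background script; names are
// illustrative, not Mozilla's actual code.
declare const browser: any; // WebExtensions API (e.g. via webextension-polyfill)

// Generate a random identifier once and persist it locally; it carries no
// link to the user's YouTube account.
async function getExtensionUserId(): Promise<string> {
  const stored = await browser.storage.local.get("userId");
  if (typeof stored.userId === "string") return stored.userId;
  const userId = crypto.randomUUID(); // random, unlinkable identifier
  await browser.storage.local.set({ userId });
  return userId;
}

// Skip data collection entirely in private (incognito) windows.
function shouldCollect(tab: { incognito: boolean }): boolean {
  return !tab.incognito;
}
```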

Mozilla doesn’t have a formal agreement with Google or YouTube to investigate the recommendation algorithm, but Boyd says Mozilla has been in contact with the company and is committed to sharing its findings.

However, YouTube said the methodology proposed by Mozilla was “questionable,” arguing, among other things, that there is no rigorous way to define what counts as “unfortunate.”

Mozilla plans to collect information from the extension for six months. The results will then be presented to users and to YouTube. “We believe they are committed to this problem,” Boyd said of YouTube. “We would love for them to learn from our research and make meaningful changes toward building more trustworthy content recommendation systems.”

