
Sorry Isn't Enough From Big Tech for Letting Hate Go Viral



Photo: Getty

Every dreadful moment amplified by social media follows the same basic script, and Friday's deadly terrorist attack in New Zealand, which killed at least 49 people and injured dozens, hit all the usual plot points as hate went viral online.

The gunman live streamed his massacre on Facebook, broadcasting for 17 minutes in all. Facebook eventually removed the stream after police flagged it. By then, however, the video had spread far beyond its original audience to YouTube, Reddit, Twitter, and other platforms, where it was re-uploaded countless times, pushing the violence and the shooter's hate-filled ideology in front of untold millions of people. The companies say, of course, that they are working furiously to take down the newly uploaded copies.

We have seen this again and again, and now we have to ask: Why are these big tech companies so bad at keeping the New Zealand mosque shooter's videos off their platforms? Why do they fail to keep radicalized, racist, and hateful content from saturating their sites? Why can't Silicon Valley police its platforms effectively? The problem goes far beyond one mass murder video, and it cuts straight to the wealth and power of Silicon Valley.

"The scale is the problem," said writer Antonio Garcia Martinez, a former Facebook advertising manager, on Twitter on Friday when was asked why .

For Silicon Valley, the scale of these platforms is what makes them worth billions. That unprecedented scale is a deliberate creation, built for unprecedented profit. For executives, shareholders, and engineers looking for a bonus and a raise, scale is everything.

Scale is always their excuse.

These companies are, in their own telling, too big to effectively moderate their platforms, especially in a case like Friday's rapidly spreading video of mass violence. If they are too big to solve this problem, maybe it is time to make them smaller.

"The rapid and widespread distribution of this hate-filled content – live streamed on Facebook, uploaded to YouTube, and boosted on Reddit – shows just how easily the biggest platforms can still be misused," Senator Mark Warner said in an email Gizmodo. "It's becoming increasingly clear that YouTube in particular has not yet grappled with the role it played in alleviating radicalization and recruitment."

When tech executives acknowledge, in their own roundabout way, that their businesses are too big, maybe we should listen to them.

The exact scale of these companies is difficult to pin down because of their lack of transparency. Facebook did not respond to a request for comment on the issue, but the company says it uses 30,000 workers plus artificial intelligence to remove hateful content. YouTube, which sees roughly 500 hours of video uploaded every minute, said in its latest quarterly report that it removed 49,618 videos and 16,596 accounts in the fourth quarter of 2018 for violating its policies on the promotion of violence or violent extremism. It also removed 253,683 videos that violated its graphic violence policies.

Last week, YouTube CEO Susan Wojcicki said much the same thing when journalist Kara Swisher asked her why neo-Nazis continue to thrive on YouTube and why the site's comments remain a notoriously hideous place, used, for example, by pedophiles for networking. Her answer: Scale is the problem.

In defense of her company, Wojcicki said that more than 500 hours of video are uploaded to YouTube every minute and that millions of bad comments are removed each quarter. "But you have to realize that our volume is very large, and we're working to give creators more tools to manage it," she said.

Wait, whose responsibility is it to manage the proliferation of Nazis and pedophiles on sites like YouTube? The users'? At least at YouTube, the answer seems to be "yes."

When YouTube's Twitter account announced on Friday that the company was "heartbroken" over the New Zealand killings, it triggered a backlash.

"I'm sorry, but no one cares if YouTube is heartbroken," said Jackie Luo, an engineer in the Silicon Valley tech industry. "Lives were lost, and more will be. YouTube is complicit – not so much because of yesterday's footage, but because of the enormous role it has played and continues to play in normalizing and spreading this kind of violent rhetoric. It's galling to see a company that profits enormously from sending ordinary people down rabbit holes that radicalize them into these beliefs then mourn on social media when that model has real, terrible consequences. It's a hard 'no' from me."

YouTube's notorious radicalization problem is fuel for the engagement engine that sets its standards. The goal is for more and more people to watch more videos, upload more videos, and consume more ads. That's the whole business. Radicalization is simply a particularly effective and profitable way to hit the growth goal.

"That's not because a coach of YouTube engineers intends to drive the world off the beaten track," wrote scientist Zeynep Tufekci last year. "One likely explanation relates to the link between artificial intelligence and Google's business model. (YouTube is owned by Google.) Despite its sublime rhetoric, Google is an advertising broker that sells our attention to companies that pay for it. The longer people stay on YouTube, the more money Google makes.

Nudging users toward the next, more radical video is part of the basic business: eyeballs plus hours equals ad dollars.

At a time when lawmakers in Washington are talking about breaking up big tech, it's striking to see a tech executive concede that her platform is too big to take responsibility for itself.

I've had my own small experience with YouTube's radicalization business model, albeit in a far more innocuous form. I'm a runner who has loved fast 5K races for the last few years. When I started using YouTube more to watch running videos, the recommendations and autoplay kept pushing me further: strangely, YouTube assumed that a 5K runner would want to run marathons, and then ultramarathons. I dutifully watched, and last month I ran a half-marathon.

If it gets us to watch more videos, YouTube would happily make ultramarathon radicals of us all, because radicalization means reliable, profitable engagement. On Friday, more than 12 hours after the attack, videos of the gunman shooting Muslims as they prayed in a mosque in Christchurch, New Zealand, were still circulating on sites like Facebook, YouTube, and Twitter, where right-wing white nationalists used them as propaganda tools.

"Scale is the main problem for both FB and YouTube," media scientist Siva Vaidhyanathan tweeted . "They would be harmless at <50 million [users]."

The reaction from abroad has been less forgiving.

"Failure to deal with this quickly and decisively represents a total abdication of responsibility by social media companies," said Tom Watson, deputy leader of the UK Labour Party. "This has happened too often. If these videos are not taken down immediately, and re-uploads prevented, it is a dereliction of decency."

The argument that scale is the problem, and that it can never be fully solved, comes implicitly packaged with the idea that this is how things must be. The excuse sits comfortably beside the fact that speed, growth, and scale are what make these already profitable businesses even more money.

"I'm sorry that the status quo that makes us wild is quite possible, so horrible god," they seem to say. "But what can we do?"

In Silicon Valley, scale is a Rorschach test. To a tech executive chasing profit, it's a goal. To a company spokesperson apologizing to the media for the company's failures, it's an excuse.

When tech executives acknowledge, in their own roundabout way, that their businesses are too big, maybe we should listen to them.

In Europe, regulators are seriously weighing fines for social media platforms that fail to remove extremist content within one hour. "One hour is the critical window in which the greatest damage is done," Jean-Claude Juncker said in his State of the Union speech to the European Parliament last year.

"We need strong and focused tools to win this online battle." "Justice Commissioner Vera Jourova said

The issue is gaining traction in the United States, too.

" We need strong and focused tools to win this online battle. "

American politicians increasingly want to check the scale and power of tech companies. Democratic presidential candidate Senator Elizabeth Warren recently proposed breaking up the big tech companies, arguing that separating America's largest technology firms would pressure the market to offer better products and services.

Silicon Valley companies" have a substantive moderation problem that is fundamentally beyond the scope they can handle, "Becca Lewis, a Stanford University researcher, said The Washington Post think tank says, "There are financial incentives to keep content first and, above all, monetization, and all responses to negative consequences are reactive."

At the moment, the incentives are lined up so that Silicon Valley will keep barely policing the platforms it builds, unless those incentives are changed. To put it in language Silicon Valley understands, there are ways to disrupt the social media industry's downward spiral. One of them is regulation.

