
The anatomy of a fake headline

When confrontations between Black Lives Matter protesters and police broke out across the country earlier this month, some Oregonians, mostly older ones, saw a Facebook ad featuring a headline about how a Republican politician “wants martial law to control the Obama Soros Antifa super soldiers.”

Needless to say, no army of left-wing “super soldiers” was marching through Oregon, nor were former President Barack Obama and billionaire George Soros known to be funding anything related to Antifa. And the politician in question didn’t actually say there were “super soldiers.” The headline, which originated at the often-sarcastic progressive blog Wonkette, was never meant to be read as straight news.

The whole thing was a mishap born of the modern news era. The headlines you see are decided not by a grizzled front-page editor but by a series of algorithms that choose which news to show and whom to show it to. That system can work well, but in this case it fed a whirlpool of misinformation that has already inspired some westerners to take up arms to protect their towns from a largely nonexistent threat of out-of-town Antifa troublemakers.

This was just one headline feeding a sense of paranoia amplified by rumors from many sources. But deconstructing exactly how it came to be offers insight into how easily a fringe conspiracy theory can accidentally slip into the mainstream online news ecosystem.

The trouble began when SmartNews picked up Wonkette’s mocking story. SmartNews is a news aggregation app that has placed nearly $1 million worth of ads on Facebook to drive users to install it, according to data published by Facebook. According to the startup’s mission statement, its “algorithms evaluate millions of articles, social signals and human interactions to deliver the top 0.01% of stories that matter most, now.”

The company, which says that “news should be impartial, trending and trustworthy,” typically chooses unremarkable local headlines for its Facebook ads – perhaps from your local television news station. Users who install the app are served headlines about their home area, plus topics of interest curated by SmartNews’ algorithms. This time, however, the headline was drawn from a Wonkette story ridiculing Jo Rae Perkins, Oregon’s Republican candidate for U.S. Senate, who has drawn controversy for her embrace of conspiracy theories.

In early June, as protests against police violence sprang up in cities and rural towns across the country, Perkins recorded a live Facebook video calling for “tough martial law” to “crush” the “Antifa thugs” rumored to be visiting various Oregon towns. She also baselessly linked the protesters to familiar right-wing targets: “Many, many people believe that George Soros pays them,” she said, “and this is the army that Obama assembled a few years ago.”

Perkins never said “super soldier” – the term appears to be a Twitterverse in-joke, which Wonkette added to its headline to mock Perkins’ apparent fear of the protesters. For someone familiar with its deadpan sardonic style, seeing the hyperbolic headline on Wonkette’s own website wouldn’t raise an eyebrow – regular readers would know Wonkette was mocking Perkins. But it’s 2020, and even a niche blog’s headlines can travel beyond its readers’ RSS feeds, venturing via social media into places where Wonkette isn’t widely known. SmartNews embodied this phenomenon when it automatically stripped the headline of its association with Wonkette and presented it straight in the ad.

SmartNews’ algorithms selected this headline for ads that appeared in people’s Facebook feeds in nearly every county in Oregon, each topped with a banner, like “Charles County News,” matching the name of the county where the ad was shown. The company uses this strategy thousands of times a day.

Rich Jaroslovsky, a vice president at SmartNews, said the algorithms did nothing wrong in this case by choosing the joke headline to show to existing readers. The problem, he said, was that the headline was shown to the wrong people.

SmartNews, he said, devotes “a lot of time, effort, and talent” to the algorithms that recommend news to users of its app. Those algorithms would have directed the Antifa story to “people who are believed to have a demonstrated interest in the things Wonkette specializes in.” For those readers, and in that context, the story isn’t problematic.

The problems arose “when it was taken out of that context and placed into another”: a Facebook ad targeted on no criterion other than geography. “Obviously, this shouldn’t have happened, and we are taking a number of steps to make sure we address the issues you’ve raised,” Jaroslovsky said.

Jaroslovsky said Wonkette stories will no longer be used in ads going forward.

SmartNews targets ads in specific geographic areas – in this case, 32 of Oregon’s 36 counties.

Facebook, however, had other ideas: according to data the company publishes about ads it deems political, the “Antifa super soldier” ads were shown mostly to people over 55. No doubt many of those viewers ignored the ad or weren’t fooled by it, but the demographic Facebook chose is, according to a recent study by New York University, more likely to spread misinformation on social media.

This choice by Facebook’s algorithms is a powerful one: academic research has shown that Facebook evaluates the content of ads and then sometimes delivers them disproportionately to users of a particular gender or race, or with a particular political perspective. (The paper didn’t examine age.)

Facebook also makes it impossible to know exactly how many people saw SmartNews’ Antifa super soldier ad. The company’s transparency portal says the ad was shown between 197 and 75,000 times, across its roughly 75 variations (split by Android versus iPhone and by county). Facebook declined to provide more specific data.

Facebook says it doesn’t believe the ads violated its rules. Ads are reviewed “primarily” by automated mechanisms, Facebook spokesperson Devon Kearns told The Markup, so it’s unlikely that a human at Facebook saw the ads before they ran. However, Kearns said, ads with satirical headlines that are taken out of context can be reviewed, and ads found to be false are removed. (Under Facebook’s fact-checking policies, satire and opinion in ads are typically exempt from being labeled “misinformation” unless they’re presented out of context.)

Rebecca Schoenkopf, Wonkette’s editor, told The Markup that she hadn’t known SmartNews was promoting her site’s content with Facebook ads but wasn’t necessarily opposed to it. In theory, at least, it could draw more readers to her site.

Ironically, Wonkette itself has only a limited presence on Facebook. In recent years, the reach of Wonkette’s posts on the platform has dwindled to almost nothing.

Following the chain of information from Perkins’ Facebook video to the SmartNews ad, it’s easy to see how a series of actors picked up and amplified the same original content – a video of Perkins spouting conspiracy theories – each for their own motives. Each link in the chain created an opportunity for misinformation, as the context needed to understand it could be stripped away.

Lindsay Schubiner, program director at the Western States Center in Portland, Oregon, which works to counter right-wing extremism in the Pacific Northwest, told The Markup that while social media has had a democratizing effect on information, it is also an ideal format for spreading misinformation.

“The SmartNews ad – accidentally or not – joined a chorus of false, misleading, and racist posts from white nationalists in response to the Black Lives Matter protests,” Schubiner wrote in an email. “Many of these posts trafficked in anti-Semitism, which has long been a touchstone for white nationalists attempting to explain their political losses.”

In this case, she said, the ad may have lent credence to the common right-wing trope that Soros, who is Jewish, is funding mass protests for left-wing ends.

“These bigoted conspiracy theories have helped fuel far-right activity and organizing,” she continued. “It’s quite possible that the ads contributed to far-right organizing in Oregon in response to false rumors of anti-fascist gatherings in small towns.”

These conspiracy theories had real-world consequences. Armed residents of nearby Washington State harassed a multiracial family camping in a converted school bus, trapping them with felled trees in the apparently mistaken belief that they were Antifa. The family escaped only with the help of some local high school students who, armed with chainsaws, cleared their way to freedom.

This article was originally published on The Markup by Jeremy B. Merrill and Aaron Sankin, and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Originally published on themarkup.org

