After carefully selecting a semi-automatic weapon, one of at least three in the trunk of his car, a 28-year-old Australian quietly walked to the front door of the Al Noor Mosque (Masjid Al Noor). He shot a man standing in the doorway before entering the mosque and opening fire on fleeing Muslims. He filmed everything with a head-mounted camera.
The video, 17 minutes of footage in total, shows a man on a mission. The shooter was calm, methodical, and businesslike in his work. He gunned down men, women, and children before returning to his car to reload, eventually making his way to a nearby mosque to repeat his actions. 49 people were killed. Dozens more were hospitalized.
The footage quickly found its way to Twitter, and then to Facebook. Later, the full unedited stream appeared on YouTube, and then Reddit. For three of the four, this is a violation of their terms of service: Twitter, Facebook, and YouTube all prohibit this type of content in its unedited form.
But on Reddit, the bloody footage found a home on a subreddit intended to house just that kind of content.
A Reddit spokesperson told TNW:
We are very clear in our Terms of Service: posting content that incites or glorifies violence is prohibited for Reddit's users and communities. Subreddits that fail to comply with these site-wide rules will be banned.
It's a great statement, in theory. In practice, it's not so simple.
Reddit has turned a blind eye to prohibited content in the past, including r/watchpeopledie. On the incitement front, it took months to ban the popular QAnon subreddit, whose users routinely called for attacks on political opponents. Ditto for one of Reddit's biggest subreddits, r/The_Donald – a pro-Donald Trump fan page that serves as a powder keg of young conservatives waiting for a spark.
Reddit only bans when people notice: r/creepshots vs. r/CandidFashionPolice, r/beatingwomen vs. r/beatingwomen2, r/jailbait vs. r/SexyAbortions
– Neetzan Zimmerman (@neetzan) September 7, 2014

On Reddit, it's become a running joke that the only way to get a subreddit banned is for it to do something so obviously egregious that the press picks it up and Condé Nast – Reddit's parent company – takes notice. Until that happens, posting images of underage girls in provocative clothing, pro-domestic-violence content, and "creepshot" photography seems to be A-OK.
And when Reddit does ban an offensive subreddit, it's often back up minutes later under a different, albeit similar, name.
Following the ban of r/creepshots – a subreddit dedicated to sexualized images of women taken without their knowledge – Reddit allowed its replacement, r/candidfashionpolice, to operate for three years, until 2015. The same goes for r/beatingwomen, which was banned only for r/beatingwomen2 to quickly take its place. The sequel was banned as well, though similar content and subreddits still exist across the platform. (You'll have to find them yourself; we refuse to name active subreddits promoting this type of content.)
To Reddit's credit, its ban of r/watchpeopledie mirrors its previous actions against other problematic subreddits. That is, the problem was largely ignored until ignoring it was no longer possible.
But if we're honest, r/watchpeopledie isn't the problem. In fact, on the spectrum of problematic content found on Reddit, r/watchpeopledie sits near the bottom. That's not to say graphic videos of real people's deaths should be encouraged, or that they're tasteful, but they aren't the root cause of mass shootings – though some might well argue that such footage desensitizes viewers to the cruelty it depicts.
What responsibility should these companies bear? On Reddit, one of the most popular websites on the internet, the video was shared in a forum called "watchpeopledie." More than an hour later, this was posted: pic.twitter.com/C8nmt7CZgh
– Drew Harwell (@drewharwell) March 15, 2019
The real problem with Reddit is that it works exactly as designed. It brings like-minded people together through curation, even when those shared interests center on dark, gory, or hateful material. It's a tightrope walk, balancing the internet's fringiest fascinations against the greater good of the platform and the web as a whole. Even subreddits that outrage the vast majority of the platform's users, such as r/beatingwomen, have defenders crying censorship when they're taken down.
However, Reddit's most problematic content isn't a forum where people watch others die. It's subreddits like r/QAnon, r/The_Donald, and r/beatingwomen that stifle free thought under the guise of open discussion. It's the normalization of upskirt photos on creepshot subreddits, or the lack of unease in openly discussing a 13-year-old's breasts on r/jailbait. It's the detachment from reality on r/incels.
Reddit's biggest problem isn't offensive content per se; it's the failure to recognize the types of content that lead to the spread of disinformation or radicalization.
Logic dictates that telling a young, disenfranchised man that he's ugly, that women laugh at him, and that he'll never marry or start a family will lead him to dark places. Yet this type of language isn't expressly prohibited on Reddit, which makes it incredibly difficult to ban a group founded to propagate it. And while this type of language may not be directly over the line, it was directly responsible for a Canadian man driving a van down a busy Toronto sidewalk, killing ten people and injuring more than a dozen.
That's problematic. Most Reddit users can plainly see how r/beatingwomen or r/rapingwomen – a subreddit offering tips on committing sexual violence against women – is bad. But it takes a more nuanced discussion to read between the lines of whether r/The_Donald or r/incels could be just as damaging.
And on Reddit, nuance has never been a strong suit.