
Will The Kyle Rittenhouse Verdict Change How Facebook Treats Gun Violence—Again?


When Kyle Rittenhouse fatally shot two men and wounded a third in August 2020, Facebook took relatively swift action. A day after the incident in Kenosha, Wisconsin, it removed his Facebook and Instagram accounts, took down posts praising him and blocked his name from the apps’ search function.

The moves came as part of a new Facebook policy on violence and mass shootings that the company said debuted that same week, though it’s unclear whether the policy took effect before or after Rittenhouse shot those men. And as part of its decision to reduce Rittenhouse’s profile on the platform, the company officially designated him a “mass shooter.”

But the steps immediately drew criticism from within the company. In a post to Facebook’s internal Workplace message board days later, one employee wrote: “If Kyle Rittenhouse had killed 1 person instead of 2, would it still qualify as a mass shooting? Can we really consistently and objectively differentiate between support (not allowed) and discussion of whether he is being treated justly (allowed)?”

The Workplace post went on: “Can we really handle this at scale in an objective way without making critical errors in both under and over enforcement?”

That comment captures the bind Facebook is in. The company has been caught up in a years-long reckoning over what type of content to regulate and how it should do so. It has been roundly criticized by liberals for not doing enough and by conservatives for doing too much. As a result, it is pulled in both directions, more often than not pleasing neither side.

Recently, it has been pressed to take a stronger stance against violent content and posts that might lead to violence, which you might think could draw universal support. It hasn’t. And on Friday, things became even more complicated for Facebook: A jury found Rittenhouse not guilty, reigniting outcries from right-wing pundits that Facebook had unfairly rushed to penalize him. (His lawyer had successfully convinced the jury that he acted in self-defense that August evening in Kenosha, a city then engulfed in protests over the police shooting of a 29-year-old Black man, Jacob Blake.)

Facebook has long been reluctant to make judgment calls about what belongs on its site—and when it has prohibited material such as violent content, it has not always been successful in keeping it from appearing on its platform. One of the most dramatic examples: the March 2019 Christchurch shooting in New Zealand, where the shooter livestreamed his onslaught on Facebook and YouTube. No one reported the video to Facebook until 29 minutes after it went live, and no part of the video triggered Facebook’s automated moderation software, according to an internal Facebook report on the Christchurch shooter. Facebook eventually shut off the feed, but it would spend the next day taking down 1.5 million copies of the video. In response, Facebook changed a number of its policies related to live videos, including speeding up how quickly its software moved to review new live videos. (Before Christchurch, a broadcast would typically have lasted for 5 minutes before the software noticed it; subsequent changes lowered that to about 20 seconds.)

As with many Facebook policy changes, these were reactive, and the more recent past has seen more of Facebook trying to keep up with current events unfolding on its platform. In August 2020, shortly after the Rittenhouse shootings, Facebook CEO Mark Zuckerberg acknowledged the company had erred in not taking down a Facebook event page that encouraged a militia to form in the same Wisconsin city where Rittenhouse shot the three men. Facebook users reported the militia group 455 times before Facebook removed it. And then in January, Facebook took measures against posts related to the U.S. Capitol riot only in the aftermath of the insurrection, even though sentiment delegitimizing the election had been allowed to blossom on Facebook in the months after Joe Biden’s victory, another internal Facebook report shows.

The Rittenhouse verdict raises a whole new set of questions. When should a “mass shooter” label get affixed to someone—before or after a trial? How exactly should Facebook tamp down on posts? Should it scrap the policy entirely?

Over the weekend, Facebook, which didn’t return requests for comment, was again backtracking. It lifted its block on searching for “Kyle Rittenhouse” on Facebook and Instagram, helping posts about Rittenhouse from right-wing media personalities like Ben Shapiro and Dan Bongino attract tens of thousands of comments, reshares and reaction emojis, the signals that boost posts further up users’ Facebook feeds. One Facebook group, American Patriots Appeal, is advertising a T-shirt that shows Rittenhouse crouched, G.I. Joe-style, holding a semi-automatic rifle. It costs $27.99 and comes emblazoned with this phrase: “Kyle Rittenhouse Did Nothing Wrong.”

The internal Facebook documents cited in this story come from the documents that Facebook whistle-blower Frances Haugen turned over to the SEC; redacted versions have gone to Congress and a consortium of news organizations, including Forbes. They’re popularly known as The Facebook Papers.  




