Now Social Media Grows a Conscience?
Facebook and Twitter are taking action. It’s too little, too late.
Propelled by the nation’s stunned reaction to last week’s violent siege of the U.S. Capitol, social media companies have sought to separate themselves from President Trump and lawmakers who were complicit in the riots.
Twitter banned Mr. Trump from its platform, while Facebook indefinitely suspended him and YouTube prevented new uploads for a week. Other tech companies stopped doing business with Parler, where would-be insurgents had found a comfortable home.
The actions, a long time coming, are sure to limit the appearance of some of the most inflammatory posts and tweets, particularly leading up to next week’s presidential inauguration. But until social media companies are willing to fundamentally change their sites by making them far less attractive to people seeking to post divisive content, deeply troubling posts will continue to spread quickly and broadly.
Facebook, Twitter and YouTube are trying to claim the mantle of champions of free speech and impartial loudspeakers for whoever has a deeply held conviction. The truth is that they are businesses, driven by quarterly results and Wall Street’s insatiable desire for ever greater sales and profits.
That’s the central tension at the root of the troubles with social media and civil society: The most divisive, misinformed content tends to keep users on the sites longer — which is essential for harvesting data that leads to highly targeted advertising.
There’s nothing wrong with making a buck, of course. But until Facebook, Twitter and the rest view their platforms as something more than just businesses, policing the sites will be a perpetual game of Whac-a-Mole.
Consider, for instance, that it was only on Monday that Facebook announced a purge of content promoting the false election fraud claims behind the campaign known as “stop the steal.” That’s been a rallying cry since Election Day, more than two months ago. Facebook’s co-founder and chief executive, Mark Zuckerberg, who is the controlling shareholder of the company, has said he believes that politicians should be allowed to knowingly lie on Facebook.
Twitter, for its part, rolled out a new five-strikes-and-you’re-out scheme for violations of its policies. How many other businesses permit a customer to break their house rules five times before kicking them out?
Sure, the social media companies already affix warning labels to posts containing misinformation and limit users’ ability to share certain tweets. They’ve given some accounts a timeout and changed their rules to account for emerging social norms, and they frequently, if less publicly, delete content or accounts that don’t align with their business interests. But these aren’t fundamentally corrective measures — they are Band-Aids.
The beating heart of social media companies is their vaunted algorithms, which surveil users and shuffle them into categories that dictate which content they see most prominently and most often. The companies’ incentive is the higher price marketers will pay for advertising that preys on users’ interests. That’s why Facebook moved to make it easier for the like-minded to form groups on its site — even the like-minded who would plan the storming of the Capitol or the kidnapping of Michigan’s governor.
These companies have consistently ignored warnings about how their very structure foments misinformation and division. A Facebook-ordered civil rights audit released in July effectively gave the company a failing grade. “The auditors do not believe that Facebook is sufficiently attuned to the depth of concern on the issue of polarization and the way the algorithms used by Facebook inadvertently fuel extreme and polarizing content,” the auditors wrote, after a two-year study.
In other words, Facebook and other social media companies very well could have seen the ransacking of the Capitol coming. And they ought to be horrified by the role their services played in the blitz.
“Without any impetus to serve the public good, these companies are going to keep amplifying extremist positions,” said Jesse Lehrich, a co-founder of the nonprofit organization Accountable Tech.
Mr. Lehrich said Facebook should make a chronological news feed the default, rather than an algorithm that shows users what it thinks is most relevant. And it ought not to thrust users unwittingly into groups or toward certain pages that align with what the software thinks will interest them. Users could still opt into those services.
There are other changes the platforms could make, like more closely monitoring — with human moderators — accounts that have the widest reach, particularly those run by politicians or other prominent individuals. They could delay posts pending closer scrutiny and strengthen their confusing and sometimes misleading warning labels.
With the shuttering of Mr. Trump’s accounts, some will point to Big Tech’s tremendous reach as well as concerns about curtailing free speech. But these companies have throttled speech for years, when it serves their purposes. “We’re not neutral,” Adam Mosseri, the head of Instagram, wrote on Twitter this week. “We try and be apolitical, but that’s increasingly difficult, particularly in the U.S. where people are more and more polarized.”
The companies aren’t likely to surrender the power they’ve accumulated any time soon — that’s why Facebook, which also owns Instagram, faces twin antitrust lawsuits from the Federal Trade Commission and 48 attorneys general. There are risks to shifting to an online world where polarization isn’t baked in. Users may threaten to leave. Advertisers may curtail their spending. But this summer’s advertising boycott over Facebook’s response to hate speech showed that companies don’t stay away for long: Facebook had a blockbuster third quarter. About the only other place for marketers to go is Google.
So for the improvement of society, we have to appeal to the better instincts of these tech executives. Has preying on consumers’ desires for more likes, shares and followers yielded a better world online or just a more profitable one?