Facebook Can’t Be Reformed - The New York Times

“We know we have more work to do.”

That was the line from numerous Facebook representatives last week in reaction to the #StopHateForProfit advertising boycott campaign. Intended to pressure the company to curb hate speech and misinformation, the boycott has been joined by several high-profile brands, including Unilever and Verizon, and could make a rare dent in Facebook’s ad revenue.

The campaign seems to be having an effect. On Friday, Facebook announced that it would add labels to content about voting and expand its hate speech policies. The company also added a “newsworthy” tag for hateful content from political figures that violates rules but is allowed because of its news value. Facebook stressed that all these moves were part of a continuing cleanup. “We know we have more work to do,” the statement read.

We Know We Have More Work to Do (let’s call it W.K.W.H.M.W.T.D. for short) is the definitive utterance of the social media era, trotted out by executives whenever their companies come in for a public shaming.

In just eight words, it encapsulates the defensive posture that Facebook has been crouched in ever since the 2016 election, when it became clear that its tolerance of hate-filled communities on its platforms turned them into witting vectors for disinformation and propaganda.

The phrase is both a promise and a deflection. It’s a plea for unearned trust — give us time, we are working toward progress. And it cuts off meaningful criticism — yes, we know this isn’t enough, but more is coming.

In Facebook’s case, what is most dangerous about W.K.W.H.M.W.T.D. is that it glosses over the fundamental structural flaws in the platform. The architecture of the social network — its algorithmic mandate of engagement over all else, the advantage it gives to divisive and emotionally manipulative content — will always produce more objectionable content at a dizzying scale.

Facebook frequently uses its unfathomable amount of content as an excuse for inaction. “We’ve made huge strides,” Nick Clegg, Facebook’s vice president for global affairs and communications, said on CNN last Sunday. “But, you know, on an average day, there are 115 billion, 115 billion messages sent on our services around the world, and the vast, vast, vast majority of that is positive.”

But Mr. Clegg’s defense is also an admission: Facebook is too big to govern responsibly. There will always be more work to do because Facebook’s design will always produce more hate than anyone could monitor. How do you reform that? You can’t.

Lately, my thoughts on Facebook have been influenced by two separate movements: prison abolition and the push to defund police. There are complex policy issues involved, but the central premise of these movements is elegant in its simplicity. The bloated and corrupt institutions that they critique are beyond reform. As Mariame Kaba wrote in a recent Times Op-Ed essay on defunding the police, “We need to change our demands.”

To be clear, there is no one-to-one comparison between Facebook and the police or the carceral state. Modern policing, as Ms. Kaba notes, has its origins in slave patrols. Facebook’s origins are obviously much different.

Still, the movements provide a helpful lens through which to view Facebook. Despite the exhausting debates around content moderation policies and constant incremental tweaks to its rules and policies, glaring problems persist. All signs point to a system beyond reform.

“You see lots of people putting forth a hopeful idea of a new, humane social media platform to rescue us — one that respects privacy or is less algorithmically coercive,” Siva Vaidhyanathan, a professor of media studies at the University of Virginia, told me recently. “But if we’re being honest, what they’re really proposing at that point is not really social media anymore.”

In other words, the architecture is the problem.

“I think social media have been bad for humans. And we shouldn’t keep trying to imagine we should either fix or reinvent what is fundamentally a bad idea,” he said.

Ifeoma Ozoma, who helped lead public policy and social impact at Pinterest and worked on public policy efforts at Facebook and Google, argues that Facebook’s flawed architecture and its leadership are inextricably linked.

“We’re not going to see real change if we’re just asking for edits on the edges,” she told me. “The platform will reflect the values of the people that make the decisions. If you have people working at the platforms that are bought into perpetuating a system of white supremacy or unwilling to reckon with it, then that’s what it’ll start looking like.”

Ms. Ozoma, who designed policy changes to limit the spread of medical misinformation and hate speech on social media platforms, speaks from her own experience in Silicon Valley. She has publicly criticized social media companies’ leaders for inequitable pay, and in one recent thread, said that her managers undermined the work of people of color with “racism, gaslighting and disrespect.”

“Even if you came up with a framework that reconfigured the platform structurally, you’re not going to see the reform or implementation,” she said. “You’re not going to see Facebook’s leaders admit what they did over the last decade is wrong and harmful.”

Few who know Facebook really believe that Mark Zuckerberg will dismantle his company or relax his grip on the board, placing conversations like this one more in the realm of thought experiment than in reality.

But for those of us who are at the whims of the company’s power, the status quo also seems untenable. Small reforms are crucial, but they also suggest that the current iteration can be saved — that there’s more work to do. Facebook cannot be reformed. We need to change our demands.

The #StopHateForProfit campaign is one such change, but there are others. Mr. Vaidhyanathan told me that he is thinking less about policing Facebook’s platforms; he is trying to imagine ways to help us live in a world dominated by Facebook.

“We probably have to start thinking more radically about what kind of information ecosystem we need to survive as a democratic republic,” he said. His ideas include what he described as “boring” but essential things like investing in libraries and public schools.

There are other ideas, like declaring “platform bankruptcy.” This would involve platforms resetting all of their user and group follower counts to zero and rebuilding communities from the ground up, with the platforms’ current rules in place. There’s no shortage of ideas on this subject, as my Opinion colleague Annalee Newitz wrote last year: “We need to stop handing off responsibility for maintaining public space to corporations and algorithms — and give it back to human beings.”

I put the question to my Twitter followers, asking for their best ideas to fix tech platforms, and received over 1,000 responses in a few hours.

Some were simple: “Design distribution around a different principle than virality.” Others were wonkish: “Cross-company/platform data and research collaborations between trust and safety teams.” Many were about fundamental transformation: “Ban algorithmic amplification; require proof of safety, efficacy, freedom from bias before product intro; classify personal data as a human right, not an asset.”

There were calls to get rid of metrics, for strict verification of real identities and for the companies to slow down the speed of information. There were privacy solutions and ideas for more tailored community networks.

Many were more blunt: Just shut it all down and start over.

Some of these ideas feel almost too utopian to type: too simple, too improbable. But there’s elegance in simplicity; these are visions of an internet we actually want to live on. Facebook sold us a utopian vision of a more connected world and left us with our current dystopia. Why can’t those of us who are left to clean up its mess have our own shot at utopia? Either way, we know we have more work to do.
