Why did it take so long for Reddit and Facebook to block racist groups?

03 July 2020

This week, in a matter of just 48 hours, several social media companies made major changes to how their platforms are and can be used. Reddit deleted a group, or “subreddit”, called “The Donald” that was known for encouraging targeted harassment and hate speech. YouTube banned videos from white supremacists like David Duke and Richard Spencer. And Facebook cracked down on a wide swath of dangerous content, including groups devoted to the “boogaloo” movement, which hopes to spark a race war in the United States.

These developments signal a significant shift in how these companies see their role and responsibility in the world. Until very recently, their leaders repeatedly declared that “free speech” was their primary value, one that trumped other values like safety, dignity and democracy.

Now, without admitting they had been wrong all along, these companies seem to have decided, at least on the surface, that it was time to deal with dangerous, extremist content differently. Why all this action, and why now?

The first half of 2020 was a perfect storm of factors that made many of these companies reconsider how they want to represent themselves to the world and how they want to treat their users. The flood of misinformation about the Covid-19 pandemic endangered lives. The bold movements for social justice that rose up in the wake of the martyrdom of George Floyd heightened sensitivity and awareness of the dangers of white supremacy in the US like nothing else in recent years. And the re-election effort of Donald Trump has grown increasingly dangerous, with the president and his followers frequently deriding public health efforts and celebrating state and vigilante violence against Black people and their allies.

In this environment, corporate leaders at Google, Twitter, Facebook, Reddit and other companies had to take much more seriously the question of how they influence the world. Of course, social media scholars have been calling for this level of attention for almost a decade. Since around 2017, many journalists have, as well. But it took more than scholarship and journalism to make a difference.

Companies as rich, powerful and ubiquitous as Facebook and Google have only two real soft spots. One is labor. Both Google, which owns YouTube, and Facebook face a constant shortage of highly qualified and experienced workers. Many people who work for these companies have employment options that most American workers will never enjoy. Technology workers command high salaries and have unusual flexibility in their career plans and life choices. They are also in constant communication with each other, meaning that Silicon Valley workers, when they choose to, have a lot of power in terms of collective message-making. Recent months have seen growing expressions of disgust among workers at major tech companies, frustrated at their employers’ refusal to respond more assertively to problems with how their platforms are used. The CEOs and COOs of these companies are now, belatedly, realizing they have to take these concerns seriously.

The other soft spot is advertising. Advertisers have even more power than workers. This week several major global advertisers, apparently led by Unilever, announced that they are suspending advertising on Facebook until the company has stronger protocols against the use of its platform for hate speech and disinformation.

Unilever may be one of the few institutions on earth that Facebook needs more than Unilever needs Facebook. In fact, the consumer products conglomerate has a fairly strong record on matters of social responsibility. Yet it has been somewhat surprising to see companies beyond the usual list of do-gooders step up to take a stand against racism and fascism. Advertisers as powerful and diverse as Clorox, Ford, HP and Adidas have decided that they do not want their products and logos associated with racism, calls to violence, or other trappings of emerging fascism in the United States. The number of boycotting companies has grown to more than 300, and even Prince Harry and Meghan Markle were reportedly phoning their corporate friends to encourage them to stop advertising on Facebook.

Why did it take so long, however, to get major tech companies to change their tune? Why weren’t the leaders of Reddit, YouTube and Facebook willing to take such steps in the aftermath of the 2017 genocide of the Rohingya minority in Myanmar? Why didn’t they act in 2016, when Facebook actively supported the Trump campaign as the failed businessman pledged to build a wall to keep Mexicans out of the US and ban Muslim immigration? Why did it take the visual evidence of the killing of George Floyd and all of the fallout within just one country, the US, to move advertisers, labor and management to take such action? Didn’t all the algorithmically stoked violence in Sri Lanka, India, Kenya, Myanmar and the Philippines matter?

We shouldn’t celebrate this moment. Let’s mark it, instead, as a potential turning point in the history of Silicon Valley and its relationship with movements for global social justice, and try to understand its limitations. We still don’t know to what extent these recent moves to de-platform extremism will make a difference over time.

It’s likely these advertisers will return to Facebook – still the best advertising platform ever created – after the US election in November. There is also a very good chance that the extremist actors who are pushing for violence and racism will just find a way to re-platform themselves under new aliases. Other social media services might help these movements promote themselves and generate not only wider audience participation, but also the sense of victimhood and indignation that is one of the chief drivers of fascism.

So these moves could backfire or yield only marginal results over time. Nonetheless, we should view the Great Deplatforming of 2020 as a potentially positive shift in the awareness of powerful people about the plight of the powerless and the rising forces of fascism in the United States.

If only these companies, and the democracies they operate in, took the global threats more seriously. We still have much work to do.

Siva Vaidhyanathan is a professor of Media Studies at the University of Virginia and the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy

The Guardian

