In February of last year, UK Prime Minister David Cameron praised social media at a speech in Kuwait amidst the events of the Arab Spring, calling online connections “a powerful tool in the hands of citizens, not a means of repression.”
Later that year, as riots fueled by discontent over taxes and racial issues broke out in London, Cameron’s sentiments toward these tools changed — he announced that he was considering blocking communication on sites like Facebook and Twitter that were alleged to have aided rioters. Though he never followed through, Cameron’s hypocrisy was widely noted, as was an emerging double standard in which oppressive censorship measures could somehow be justified in democratic nations but were condemned in non-democratic ones.
It turns out that Cameron’s decision not to censor social media sites was probably the right one — and not just from a free-speech perspective: new computer-modeled research concludes that censoring social media during times of civil unrest may actually prolong violence, even though social media is often a key tool for fostering that violence in the first place.
The study, to be published this month in the Bulletin of Sociological Methodology, came about when French sociologist Antonio Casilli and his colleague Paola Tubaro noticed that the same media said to convey democracy in North African countries was blamed for channeling anarchy in Westernized ones. Geopolitics and human-rights issues aside, they wondered, what effects did social media actually have on public violence?
Casilli and Tubaro used computer modeling to try to answer the question. The idea, basically: use real-world data from situations of civil unrest to create rules about how people in those scenarios behave, enter those rules into a computer, and then see how your computer populace behaves under conditions of your choosing. Casilli and Tubaro simulated the potential effects of censorship by playing with a variable called “vision” — awareness of social and spatial surroundings during a riot. High vision corresponds with low censorship, and low vision with high censorship. Think about vision geographically: if a citizen has a vision diameter of one mile, they can only see their immediate surroundings. But social media can drastically increase that diameter — citizens can be aware of what happens one, five or 50 miles from their location. During a riot, this can help people decide whether to converge on another neighborhood, to attack a target or to help others resist police.
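The article doesn’t describe the researchers’ actual code, but the setup resembles classic civil-violence agent models, in which citizens decide whether to turn “active” based on what they can see within a vision radius. A minimal sketch of that idea in Python — with every parameter, name and formula hypothetical, not taken from the study — might look like:

```python
import random

# Illustrative civil-violence-style sketch, NOT the authors' model.
# "vision" is the radius within which an agent perceives cops and other
# rioters; shrinking it stands in for heavier social media censorship.

GRID = 20          # side length of a wrapping grid (hypothetical)
N_AGENTS = 120     # citizen agents (hypothetical)
N_COPS = 15        # police agents (hypothetical)
THRESHOLD = 0.1    # net-grievance threshold for turning active (hypothetical)

def neighbors(pos, vision):
    """Cells within `vision` (Chebyshev distance) of pos, wrapping at edges."""
    x, y = pos
    return {((x + dx) % GRID, (y + dy) % GRID)
            for dx in range(-vision, vision + 1)
            for dy in range(-vision, vision + 1)}

def step(citizens, cops, vision):
    """One update: each citizen weighs grievance against perceived arrest risk."""
    cop_cells = set(cops)
    active_cells = {pos for pos, state in citizens.items() if state == "active"}
    for pos in list(citizens):
        seen = neighbors(pos, vision)
        cops_seen = len(seen & cop_cells)
        active_seen = len(seen & active_cells) + 1  # count self
        # Perceived arrest probability rises with the cop-to-rioter ratio in view.
        arrest_prob = 1 - 2.3 ** (-cops_seen / active_seen)
        grievance = random.random()  # simplification: grievance redrawn each step
        citizens[pos] = "active" if grievance - arrest_prob > THRESHOLD else "quiet"
    return sum(1 for s in citizens.values() if s == "active")

def run(vision, steps=20, seed=0):
    """Run a simulation and return the count of active rioters at each step."""
    random.seed(seed)
    cells = [(x, y) for x in range(GRID) for y in range(GRID)]
    picked = random.sample(cells, N_AGENTS + N_COPS)
    citizens = {pos: "quiet" for pos in picked[:N_AGENTS]}
    cops = picked[N_AGENTS:]
    return [step(citizens, cops, vision) for _ in range(steps)]
```

Sweeping `vision` from low (heavy censorship) to high (open social media) and comparing the resulting time series of active rioters is, in spirit, the kind of experiment the researchers ran — though their actual behavioral rules were calibrated to real-world riot data rather than chosen arbitrarily as here.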
The model showed that high-vision situations without censorship still featured sharp peaks of violence, but these were followed by longer periods of social calm than in low-vision scenarios. The researchers speculated that these calmer periods might arise because citizens who are more aware of their situation are more likely to initiate positive civic action than those who are left in the dark. “In the case of the London riots, people living in different areas of the city were able to get in touch and communicate [through social media],” Paola Tubaro told me. “That was both active protesters triggering others, but also people getting together to put things to rest, clean up the mess and bring the community back to normalcy.”
Like any model, this one is nowhere near a perfect picture of reality. The results are preliminary and highly theoretical, and the model isn’t meant so much to predict, the researchers say, as to spur further discussion about the role of government censorship of communication during violent periods. One of their study’s secondary conclusions seems particularly likely to provoke argument: the model showed that moderate censorship actually resulted in more stability than no censorship at all — but at the cost of having more citizens in jail. One interpretation of this finding, Casilli says, is that social media sites could hide specific content from specific people during violent periods, filtering by sex, age, political or religious creed. Then again, hearing that these types of surreptitious decisions are being made by politicians or social media sites is just the kind of thing that makes people want to riot.