They want you to watch.
Terrorism is a performance. Don't watch.
And, Facebook, stop making us.
By James Densley and Jillian Peterson
Like a car crash, violent performances such as the terror attack at two mosques in Christchurch, New Zealand, make us look before we look away. Some of us enjoy the show; we find it thrilling, like a horror movie. Most of us are sickened and saddened by it, particularly when we identify with the victims as if they were ourselves or our loved ones.
Driven by curiosity and a perfectly normal combination of sadomasochism and voyeurism, we are drawn to hurting or being hurt and to what is hidden and secret.
Media producers understand this well. "If it bleeds, it leads," wrote journalist Eric Pooley in 1989. Owing to the public's fear of, and fascination with, violence, stories of murder and mayhem commanded much higher ratings. In the digital age, however, there is a new breed of media producer that we should be worried about.
The mass shooter who killed at least 50 worshipers livestreamed his rampage to Facebook from a head-mounted camera. His show was low-budget but big box office. Propaganda by deed. It was his pièce de résistance. The shooter's coming-out party. A performance.
Such performances do more than put on a display. They are what sociologist Mark Juergensmeyer calls performative, meaning they deliberately try to change the social order by affecting viewers and altering their perceptions of the world.
If we watch, the terrorists win.
The problem is, in the digital age, it is hard to avoid watching when we are incessantly bombarded with the sights and sounds of mass shootings autoplaying in our feeds.
Our proximity to violence has mental health implications. Forced to watch unedited, unpredictable and disturbing violence on a continuous loop, or reminded of all the lives cut short by gunfire, we experience vicarious trauma, or a sense that our neighborhoods and communities are unsafe, that our odds of being a victim are much greater than they really are and that the world is a dangerous place. This is the fear that drives gun sales and gang recruitment, and the anxiety and cynicism that destabilize democracies. It is precisely what the terrorists want. We also risk becoming desensitized to such violent acts; or worse, some may learn to think of them as a viable solution to life's problems.
Some people are inspired by the violence they see online. Terrorists capitalize on this fixation, posting videotaped decapitations and choreographed GoPro footage from the scene of the action as propaganda for recruitment. This, and other violence online, can create a copycat or snowball effect, whereby people act out the violence they see as a shortcut to notoriety. If we share the video, we are part of this.
We're criminologists who have been studying cyberviolence for over a decade. We've interviewed teenagers about their experiences with cyberbullying, watched countless hours of YouTube rap videos to understand how gang members broadcast badness, and even built a database to track how mass public shooters leak their violent intent to their peers and the public online.
Our research confirms that terrorists are performing violence to go viral. We are their audience. And social media companies like Facebook, YouTube and Twitter prove once again that Shakespeare was eerily prophetic: all the world's a stage.
Silicon Valley needs to take a hard look at itself. By creating the platforms, these companies are not just enabling these violent performances; they are actively participating in them. If livestreaming cannot be sufficiently monitored and regulated, why subject us to livestreaming? If we can time-delay the Oscars to mute the profanity of celebrities, why can't we immediately "hash" the shootings to silence the persuasions of terrorists?
Every violent image leaves a digital fingerprint. In time, visual recognition algorithms will be able to flag violent imagery as it is uploaded and prevent it from being reposted. Much as Netflix can predict which movies you want to watch, deep-learning algorithms will be able to read posts for signs that people wish to do themselves or others grievous harm, then alert law enforcement or connect them with the services they need. But right now, the technology is too slow to react, and companies are too afraid to admit that they are complicit in the harm.
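For readers wondering what "hashing" footage actually involves, here is a minimal illustrative sketch in Python. It fingerprints an image frame with a simple average hash and checks new uploads against a blocklist of known fingerprints; the 8x8 hash, the 5-bit matching threshold and the file names are our illustrative assumptions, not a description of how Facebook or any other platform actually does this.

```python
# Minimal sketch of perceptual hashing for flagging known violent footage.
# Illustrative only; real platforms use far more robust fingerprinting.
from PIL import Image

def average_hash(path, size=8):
    """Reduce an image to a 64-bit fingerprint that survives re-encoding."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    # Each pixel contributes one bit: brighter than the average or not.
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means near-identical frames."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare an uploaded frame against a blocklist of hashes.
blocklist = {average_hash("known_violent_frame.png")}
upload = average_hash("new_upload_frame.png")
if any(hamming_distance(upload, h) <= 5 for h in blocklist):
    print("Near-duplicate of flagged footage; hold the upload for review.")
```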
Without social media, would 50 people have been fatally shot last week? Maybe. But maybe not. We can all pledge not to watch the footage from this horrific attack, but tech companies must also pledge to stop compelling us.
James Densley is a professor of criminal justice at Metropolitan State University. Jillian Peterson is a professor of criminal justice at Hamline University. They are researching mass public shootings for a project funded by the National Institute of Justice, the research arm of the U.S. Department of Justice.