Spark

Facebook violence and responsibility

The trouble with live video.
Recent brutally violent acts broadcast on Facebook Live have brought the company's responsibility into question.

Ever since Facebook launched its live video feature just over a year ago, several violent acts have been broadcast on the site.

The social network is currently facing a backlash after a Cleveland man uploaded a video of himself shooting someone, then followed it with a Facebook Live video in which he confessed to the alleged murder.

Just this past week, a Thai man allegedly killed his 11-month-old daughter on Facebook Live.

And here in Canada, two teen girls allegedly posted a Facebook video of a brutal beating linked to the death of Manitoba teen Serena McKay.

Is this all Facebook's fault?

Will Oremus is a senior technology writer at Slate. "Facebook's stance is that they do want to take some responsibility," Will says.

"But Facebook does not want to take the level of responsibility that would require it to prevent these acts from happening in the first place."

Facebook may not be directly to blame for someone's violent act, but in a number of these cases, it's clear that the power to broadcast live was part of what motivated people.

"Facebook's position is: we are offering this very powerful tool to the world, there's no possible way we can police everything in real time," Will says.

"People will misuse tools, that doesn't mean we should take away the tool all together."

Will believes fewer and fewer people are buying that reasoning.

"Maybe you need to rethink the nature of the tool...if you are potentially encouraging people to commit and broadcast violent acts," he says.

Facebook says it's now reviewing its practices.

"They are acknowledging a certain level of responsibility here, but as far as what they are going to do about it it's unclear," says Will.

Might there be a technical fix for this problem?

"I don't think it's completely impervious to technical solutions," says Will.

"In terms of algorithms being able to detect violence or other policy violations in real time in streaming video, that's a few years away at least."