How TikTok's design helps turn ordinary people into villains
"Toxicity is a direct outcome of how a space is managed," says researcher
Social media has become not only the new judge of character, but the entire court of law, and that has a lot to do with how these online spaces are designed.
Over the past couple of years, TikTok has quickly become a place for everyday people — more than a billion monthly users — to create and connect with others. People post cooking and DIY videos, as well as share personal stories. And unlike on other social media platforms, going viral on TikTok can happen to anyone, regardless of how many followers they have, thanks to its unique recommendation algorithm.
While sites like this can provide the kind of fellowship we get in face-to-face groups, it's not all positive. Harmful conspiracies and witch hunts have also become commonplace on TikTok. In recent months, a trans woman was wrongfully accused by viewers of being a serial killer after she posted a video of herself on the app dancing in her basement to Shania Twain.
The latest example of the unintended consequences of virality on TikTok is 'West Elm Caleb', a New York man who was identified by a number of women on the platform as someone they'd met on the dating app Hinge — and who they felt treated them badly.
"One of the interesting things about these situations is that now especially they tend to become really cross-platform, TikToks being shared on Reddit and Twitter, for example," Casey Fiesler, an assistant professor of information science at the University of Colorado Boulder, told Spark host Nora Young.
Fiesler, who researches technology ethics and online communities, said the case of 'West Elm Caleb' demonstrates how seemingly well-meaning attempts at accountability can escalate far beyond what anyone intended, and how the internet scales punishment.
The videos about the serial 'ghoster' went viral on the app, garnering the interest and anger of many beyond the women he had dated. He was eventually doxxed and lost his job.
The disproportionate reaction to stories like this, Fiesler noted, has to do with people forming parasocial relationships with the content creators they follow and the subjects of the videos they watch, while also engaging with the content as if it were fiction, without considering the real repercussions of resharing or chiming in.
"One of the issues with someone becoming a symbol is that then it's no longer about that person's individual actions. It's about everyone who has ever done this, and the fact that they have not been punished," said Fiesler.
In her recent book, Should You Believe Wikipedia? Online Communities and the Construction of Knowledge, social computing researcher Amy Bruckman looks at how the design of online spaces shapes how we engage and learn from one another.
"You go into one restaurant and there are wooden tables and sawdust on the floor and the hockey game on the TV. Compare that to another restaurant, where there are linen tablecloths and servers in formal wear and low lighting. You implicitly know how to behave in each of those settings," said Bruckman, professor in the School of Interactive Computing at the Georgia Institute of Technology,
She said online communities are no different. "As designers of online sites, we pick features and use them to try and create a mood. Maybe you want it to be more like a pub, or maybe you want it to be more like an elegant soirée. You pick features that help people to learn how to behave."
One of the problems facing TikTok is its content moderation system, which Fiesler described as opaque when it comes to what is and isn't taken down.
This, she said, becomes a larger concern in the case of hate speech, where there are a lot of false negatives and false positives. Because of these errors, some videos containing hate speech remain on the platform, while videos of people describing incidents of hate speech are taken down. "No matter how you calibrate the algorithm, more false positives or more false negatives, it's hurting the same people," said Fiesler.
Content moderation is also important to combat the spread of false information, Bruckman said.
"What you want to look at in a platform is how well things are reviewed, and what sources of support you have for the validity of a particular piece of information. So stuff on TikTok is a lot less reliable than stuff on Wikipedia, and a place like Reddit is somewhere in the middle," she said.
Failing to anticipate these harms is what Fiesler calls ethical debt, a nod to the software engineering idea of technical debt: the cost of not thinking about the harmful social implications of a technology while it's being built, and of assuming they can be addressed after they arise. "The problem with that is that the bugs are a very different kind of bug," she said.
"People who are really good at thinking about how technology might be used to harass people are people who get harassed a lot, which means women and queer people and people of colour, and these are all groups that are underrepresented in tech. So I think that can have an impact on the ability to imagine the harms of your technology," said Fiesler.
Bruckman argues that platforms need to move away from a purely capitalist motivation and design spaces that are intended to provide positive value for individuals and for communities.
"And that's why I love Wikipedia. Everything I know tells me it shouldn't work, and yet somehow it does. An old joke says Wikipedia can't work in theory, it only works in practice. So our theory needs to catch up with our practice a little bit," she said.
Written by Samraweet Yohannes. Produced by Samraweet Yohannes and Michelle Parise.