Spark

The trouble with advertising algorithms

The reliance on algorithms to manage unprecedented amounts of information has repeatedly had consequences for YouTube and other social media.
YouTube says there are no specific words its algorithm seeks out when stripping ads from videos, but that it does look for offensive content.

How much control do we have over the spaces we occupy online?

We put our lives on Instagram, Facebook, and Twitter. And they've become important places for debating political and social issues. But sometimes we get a pretty stark reminder that those platforms are owned by someone else.

This past week there were a few high-profile examples of people having their content removed, or their profiles suspended.

On October 11, Twitter suspended the account of actor Rose McGowan. Initially, it appeared to be because of tweets related to the allegations of sexual harassment and assault against Harvey Weinstein.

And rapper Lil B had his Facebook profile suspended, Facebook says, for violating its rules on hate speech after he made comments about gun violence.

It was initially unclear why exactly these posts had been flagged, or what rules they had broken. And even when answers were provided, many felt they were incomplete and unsatisfying. In both cases, it was unknown whether the posts were flagged by staff, by other users, or by the algorithms that power the sites.

Most of us know that when we are dealing with online publishing platforms, we are dealing with algorithms. But what prompts those algorithms to make their decisions is opaque: to the average user, it's just not clear what the algorithms are doing.

2017 has been the year YouTube went from popular video-sharing site to controversial newsmaker.

It all started in February, when a Super Bowl ad from Hyundai hailing U.S. troops reportedly ran ahead of a YouTube video supporting the militant group Hezbollah.

In March, The Times of London reported that ads for popular brands such as Pepsi were appearing alongside controversial videos, including ones posted by white supremacists.

There was intense criticism, and spooked advertisers began to pull their ads from the site.

YouTube responded by promising to make it easier for companies to control where their ads appear.

Which sounds great, right? Except, as the year went on and YouTube continued to make changes, video creators began to notice something strange.

The algorithm responsible for deciding which videos should, and shouldn't, have ads seemed to be flagging videos that weren't controversial or offensive at all.

That meant many individual content creators were suddenly left with videos that were no longer monetized.

Angelina Ledrew-Bonvarlez is a Toronto-based YouTuber. She discovered that a video of her wedding had been demonetized by the platform's algorithm.

Jamie Byrne, Director of YouTube Creators, affirmed that terms like LGBTQ do not trigger demonetization on YouTube.

"It's important to understand that our algorithms at YouTube do not have bias built into them, and they don't target individual groups or content or look for any specific keywords," he says.

Angelina appealed to YouTube to bring back advertising on her wedding video, and it was reinstated within 12 hours.

The ability for YouTubers to appeal when ads are removed from their videos is an important part of the puzzle. But videos with fewer than 1,000 views in a week don't qualify for appeal, which may make it difficult for new users, or creators posting on niche issues, to get their ads reinstated.

With 400 hours of video uploaded to YouTube every minute, the algorithm is bound to get things wrong sometimes.

It's an issue we're seeing all over social media.

Relying on algorithms to distinguish offensive content from innocuous content is a necessity, especially where advertisers are concerned.

But it isn't perfect yet.

Dan Olson, who is based in Calgary, runs the Folding Ideas YouTube channel. His channel has more than 150,000 subscribers, and his videos on filmmaking and storytelling frequently get hundreds of thousands, even millions, of views.

Dan has called YouTube's advertising algorithm one of the company's best, and worst, features.

Now, as Jamie Byrne said, YouTube maintains that the algorithm does not use keywords to make its decisions.

But the systemic abuse of marginalized people that happens in YouTube comment sections, and online generally, is a broader issue.

And ultimately, Dan says, it has a chilling effect on individual YouTube creators.

Many LGBTQ YouTubers feel as though they're being targeted when they see that ads have been removed from their videos and they aren't sure why.

"There are possibilities that an advertiser could exclude themselves from certain types of content. That is the advertiser's right. The advertisers do have the ability to decide, like they do in any media, to decide where they want their advertising to run," Jamie says.

The issues in this story go beyond YouTube. The reliance on algorithms to manage unprecedented amounts of information has had consequences for other social media as well.

Multiple media reports after the Las Vegas shooting, for instance, pointed out that results on Google, Facebook and, yes, YouTube included prominently placed rumours and outright falsehoods.

The Financial Times has reported (paywall) that YouTube changed its algorithm in response "...because so much misinformation was appearing high up in its search results."