This Researcher's Observation Shows The Uncomfortable Bias Of TikTok's Algorithm

TikTok's recommendation system can create a biased "feedback loop," this researcher said.

A little experiment by an artificial intelligence researcher is raising questions about how TikTok's recommendation algorithm suggests new creators to users.

Specifically, the question is whether that algorithm sorts suggestions based on the race of the creator, something TikTok denies it's doing intentionally. But it's another example of the need for more scrutiny of how the app and other social media platforms promote particular creators or content.

Marc Faddoul is a researcher at the University of California, Berkeley, School of Information who studies AI and disinformation. He was checking out TikTok to look for disinformation when he noticed something curious about how the app recommends new creators to follow.

In the app, when a person follows a new account, they can click an arrow that then recommends other accounts to follow. Faddoul noticed that when he did this, the recommended accounts tended to look just like whoever he'd just followed — right down to ethnicity and hair color.

"I saw this very clear pattern that was happening," he told BuzzFeed News. "When following an account, the suggestions are very similar-looking."

He made a fresh account to try it out again, and these were his results:

A TikTok novelty: FACE-BASED FILTER BUBBLES The AI-bias techlash seems to have had no impact on newer platforms. Follow a random profile, and TikTok will only recommend people who look almost the same. Let’s do the experiment from a fresh account: 1/6

Faddoul cautioned that this was a casual experiment, not actual research, but he said the results are still interesting.

BuzzFeed News tried a similar experiment with a new account and got similar results.

Following hijabi creator @jiggybush caused the app to recommend other women who wear a hijab.

And following @uwayeme, a black woman, prompted recommendations for other black women.

At first, Faddoul suspected TikTok was employing AI technology that analyzed people's profile photos when making recommendations. Other tech companies, like Netflix, use similar image analysis to determine which thumbnails a user is most likely to click on. That's why the same show may be promoted with different images on different people's Netflix homepages.

But, according to TikTok, there's a simple answer to the questions raised by Faddoul.

TikTok told BuzzFeed News that it uses what's known as collaborative filtering. Basically, the app recommends new accounts based on what other accounts a given creator's followers also follow.

"We haven't been able to replicate results similar to these claims," a TikTok spokesperson told BuzzFeed News.

"Our recommendation of accounts to follow is based on user behavior: users who follow account A also follow account B, so if you follow A you are likely to also want to follow B."

That also explains why, when following a big creator like Addison Rae, the app recommends other creators from the Hype House, rather than creators who look like Rae.

Or, in another example, following an artist returns recommendations for other artists.
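In rough terms, that "users who follow A also follow B" logic can be sketched as a simple co-follow count. The sketch below is a minimal, hypothetical illustration in Python, not TikTok's actual code; the follow data and the recommend helper are invented for the example.

```python
from collections import Counter

# Hypothetical follow lists; TikTok has not published its data or code.
follows = {
    "user1": ["account_a", "account_b", "account_c"],
    "user2": ["account_a", "account_b"],
    "user3": ["account_a", "account_c"],
    "user4": ["account_d"],
}

def recommend(just_followed, top_n=3):
    """Rank accounts by how often they co-occur with the one just followed."""
    co_follows = Counter()
    for accounts in follows.values():
        if just_followed in accounts:
            co_follows.update(a for a in accounts if a != just_followed)
    return [account for account, _ in co_follows.most_common(top_n)]

print(recommend("account_a"))  # -> ['account_b', 'account_c']
```

Notice that nothing in this sketch looks at a profile photo: if the people who follow one hijabi creator mostly follow other hijabi creators, the co-follow counts alone will reproduce that pattern.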

It's a classic system used by tech companies from YouTube to Netflix, but that doesn't mean it's free of problems, Faddoul said.

"Collaborative filtering may also reproduce whatever bias there is in people’s behavior," he said.

"People who tend to like blonde teens tend to like a whole lot of other blonde teens. In that sense, it's kind of expected."

What that means is TikTok recommended women who wear hijabs to BuzzFeed News' test account because people who follow one hijabi creator tend to follow other hijabis.

But, Faddoul said, this can create a feedback loop where people are only ever recommended a particular type of creator, leading to a lack of diversity in their feed.

A risk is to reinforce a 'coverage bias' with a feedback loop. If most popular influencers are, say, blond, it will be easier for a blond to get followers than for a member of an underrepresented minority. And the loop goes on...

For example, he said, if the most popular creators on a platform are white, and the app keeps recommending other white creators, it makes it hard for creators of color to gain followers and popularity — even if that's not the intention of the algorithm.

"Then it means it's easier for a white person to get recommended than someone from an underrepresented minority," he said. "So that’s something that can be happening, regardless of its facial feature or collaborative filtering."

Social media have been known to create filter bubbles for political opinions. TikTok seems like the first major platform to create such clear physiognomic bubbles.

Of course, this is hardly unique to TikTok. All social media platforms that use algorithms can create bubbles where people only see content that confirms their biases. Think of, for example, how a Facebook feed may be biased toward a particular political viewpoint.

"This is not a scientific research methodology, just anecdotal evidence that kind of highlights a phenomenon that seems pretty clear and distinct and encourages further investigation," Faddoul said.

