This is an excerpt from Please Like Me, BuzzFeed News’ newsletter about how influencers are battling for your attention. You can sign up here.
An MUA on TikTok broke down after accidentally deleting her footage, revealing the intricacies and work that go into one video or look
About two weeks ago, Jessica Hibert recorded over three hours of footage of herself carefully putting together a makeup look, which she planned to edit down to a 60-second TikTok. However, when Hibert tried to rerecord the last part, she accidentally deleted the entire file.
“I got really frustrated and almost cried,” she told me later. But Hibert tried to hold it together at the time. It was already midnight, and she felt the pressure to post new content for her 400,000-plus followers to maintain her brand and a high level of engagement, which meant she would have had to redo her tutorial from the beginning — including every line “explaining details, products, tips, and tricks” in the look.
But before she set out to rerecord the whole thing, Hibert decided to record a TikTok explaining to her followers why her video would be delayed. She was “overcome with emotions,” she said, and began tearing up on camera.
“This is what people don’t see,” she says in the TikTok. “This look literally took me three hours. It’s, like, 12 o’clock right now ... and ... I deleted the whole thing.”
“Please just come back tomorrow to watch this look again,” she says.
Even though Hibert eventually did post the makeup video that she intended, she said it was important to talk about the behind-the-scenes moments and real-life frustrations of her job “because some people only see the positive things” about being a makeup content creator.
“Your supporters expect you to post content, TikTok wants you to post content, there is pressure to constantly create new content, and that pressure increases with followers,” she said. “We — and when I say ‘we’ I mean all content creators, big and small — work really hard, especially makeup artists and beauty influencers.”
On top of the stress of consistent output, Hibert told me that on the day she recorded the original video, she had little to no “energy or motivation,” so mustering up the drive in the first place was a struggle. Fixing the damage done by one quick and careless mistake felt mountainous.
Her TikTok explaining the ordeal mostly received supportive and empathetic comments from fans. One person made the analogy of “typing a whole essay and then the power goes out.” However, Hibert told me she also received comments and messages from people who thought she was “whining,” and that her job creating and sharing beauty looks for views didn’t warrant the level of panic or distress she felt.
“Makeup is an art form. It is a way to express ourselves, like a musician or tattoo artist. Makeup artists create looks for many reasons, not just to look pretty,” Hibert said. “It takes on average three to four hours to create a 60-second makeup video. [That’s] not including daily routine, finding new palettes and products, etc.”
Hibert said she doesn’t consider herself an established influencer yet. That only adds to the pressure she, like other content creators, feels to make a name for herself in an already saturated market.
No matter how you feel about the meritocracy of the influencer job, or the beauty industry, hard work is hard work. Belittling someone’s hardship is not only unkind but also completely misses the point. In this hustle economy we live in — one that influencers sometimes romanticize in a problematic way — it’s refreshing to see someone be vulnerable and pull back the curtain on the stress of “making it.”
Instead of pointing fingers at people who are hurting and falling under the pressures of capitalism, maybe point the fingers at the people who perpetuate this unhealthy structure? (To those who ridiculed Hibert’s video, or anyone who wants to diminish it as “whining,” maybe...that’s you? Something to ponder.)
In fact, the more seamless a content creator makes their short videos look, the more work they’re actually putting in to do that. Something more to ponder.
Not breaking news, but I don’t think automated content moderation is working, like, effectively at all
This week, I published a story about the messiness of TikTok’s moderation process and how it’s affecting Black creators on the platform. I won’t rehash all the complicated details of that story — you can read it here.
Essentially, I spoke to three content creators who were growing their accounts when they were abruptly banned, leaving them extremely confused. The reasons TikTok gave them for the bans didn’t correspond to the things they were posting, so they felt like they were being punished for either mistakes or biases in the algorithm.
I’m not going to litigate whether their bans were intentional — I don’t know exactly how TikTok’s algo is written, and after speaking to reps there on background, I still wasn’t able to get a clear idea of the mechanics. “We’re committed to seeing that our policies and practices are fair and equitable,” they said in a public statement.
What we do know is that TikTok moderates content with some mix of automation and human support. I understand it’s difficult (costly) for TikTok, like other social media companies, to invest in a large team of people to monitor, assess, and make critical decisions about nuanced content on its platform, which is why these companies often use AI. But in the cases of Siete, Cahleb, and Minnie, the punishment they received arguably did not fit the violations. And they told me their bans made little to no sense given the context of what they had posted and the controversies it sparked.
That context is so important!!! And it takes actual people — ideally sensible and compassionate people — to be able to discern what’s going on when content is flagged and make sound decisions about whether it’s a punishable offense.
If a creator is left confused and frazzled about what happened, it usually means the moderation either was not communicated well, if at all, or wasn’t just. And the only way we can begin to build a better system is to prioritize moderation by humans who put ethics first.
*steps down from my soapbox*
Until next time,