This week Instagram is swarming with artsy illustrations of your friends suddenly looking like a watercolor painting from the Impressionist movement or a character from a new Marvel film. But as people share digital portraits of themselves created with the app Lensa, some experts have raised concerns about data privacy, artist rights, and how the app seems preoccupied with giving huge boobs to every woman who uses it.
The app, created by photo-editing tech company Prisma Labs, uses artificial intelligence and photos uploaded by the user to generate unique images based on their likeness. You’ve gotta pay up front — $7.99 for 50 pictures, up from $3.99 just a few days ago. But it’s not the novelty of having to pay to participate in a meme that has people worried.
Here are some of the reasons why people are concerned about the impact AI art generators might have on users and artists.
Uploading your photos to Lensa gives the company access to your face data
As with many of the photo-altering apps that have gone viral in recent years, users are concerned about how much data Lensa retains about them once they submit their photos.
Prisma’s terms and conditions state that users “retain all rights in and to your user content.” At the same time, using the app grants the company a “perpetual, revocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable, sub-licensable license to use, reproduce, modify, adapt, translate, create derivative works” with your photos.
In practice, that means that even though you technically retain ownership, the company can use, reproduce, and modify any digital artwork made from your selfies in the app.
A spokesperson from Prisma Labs told TechCrunch that the company deletes user photos from its cloud services after using them to train its AI. But to some, that training use is itself one of the most worrying parts of the agreement.
“The concerning part to me is actually that you are giving the right to use your image for training their AI,” one Twitter user wrote.
Users are not compensated when the company uses their images for AI training, per Lensa’s user agreement.
Artists say AI art technology uses their work without compensation
AI art may seem to appear magically from scratch, conjured by benevolent robots, but it’s more complicated than that. It’s not just Lensa, either. DALL-E, MyHeritage, and the AI Art TikTok filter have all gone viral for their ability to generate art based on user prompts.
To generate these seemingly unique photos of people, Lensa uses what’s called Stable Diffusion, a model “trained” on an online database of images called LAION-5B to learn visual patterns. Once the training is complete, the model no longer pulls from those images but uses the patterns to create more content. It then “learns” your facial features from the photos you upload and applies them to that art.
The learning isn’t the concern for artists. The source of the knowledge is. LAION-5B pulls publicly available images from all over the internet — Google Images, DeviantArt, Getty Images, Pinterest, and so on. Dozens of artists have spoken out about not getting paid or credited for work that appears in the database. Some have alleged outright theft.
“Artists dislike AI art because the programs are trained unethically using databases of art belonging to artists who have not given their consent,” Claire, a digital artist who didn’t want to use her last name to protect her career, told BuzzFeed News. “This is about theft.”
Prisma Labs CEO Andrey Usoltsev told BuzzFeed News that what Lensa generates “can never be described as exact replicas of any particular artwork.” He said the app creates images from scratch without borrowing existing pieces of art.
“As cinema didn’t kill theater and accounting software hasn’t eradicated the profession, AI won’t replace artists but can become a great assisting tool,” Prisma Labs wrote in a tweet. “We also believe that the growing accessibility of AI-powered tools would only make man-made art in its creative excellence more valued and appreciated, since any industrialization brings more value to handcrafted works.”
Sometimes artists use AI art tools to inspire them and to help them visualize their next creative project. Now they’re afraid it could put their careers in jeopardy. One artist tweeted that AI art generators are “fundamentally anti-creator.”
“These AI seem like harmless fun but they are predatory and intend to replace artists,” voice artist Jenny Yokobori wrote in a Twitter thread.
AI art has sexualized women and anglicized people of color
One of the most delightful things about art is the way its creators bring their own perspective and style to each piece. That doesn’t exist with the pattern-based art AI generates. Usoltsev told BuzzFeed News that AI “does not have the same level of attention and appreciation for art as a human being.”
You can see this when AI-generated art produces too many fingers or soulless eyes. But it can also be more sinister than that, with Lensa creating sexualized and racially inaccurate art.
Of the 50 images Lensa generated from my selfies, seven of them exaggerated my body to make me both curvier and thinner than I actually am. And by that I mean despite submitting photos that didn't show my body below the shoulder, it created cursed images with cartoonish cleavage and a tiny waist. And I know I’m not alone.
Activist Brandee Barker tweeted that it seemed to her that AI selfie-generator apps were “perpetuating misogyny.” Her results looked similar to mine, with sexualized poses and exaggerated features. According to Refinery29, some people have said the apps helped them experience gender euphoria, but they can also do the opposite by taking gendered features to the extreme.
“the fact that it depicts femme bodies completely naked with ZERO prompt is creepy and can absolutely be detrimental to you one day when someone claims the AI art is based on a n*de you sent them idk man,” one Twitter user wrote.
In a story for Wired, writer Olivia Snow said she was able to generate nude images of herself using her childhood photos. Women of color told her that Lensa whitened their skin and anglicized their features. Some said they felt “violated.”
Other people have said that the app failed to capture their facial features, making them appear whiter than they really are. Fat people noted that the app makes them look thin.
As more people give AI art apps consent to study their faces, it’s possible that the technology could become better at producing more accurate features.
It’s also possible that it could become much worse.