People Are Freaking Out That iPhones Categorize "Brassiere" Pics

Some women are creeped out by how Apple's image recognition AI works.

This week, someone noticed that if you search the term "brassiere" in your iPhone's photos, it will actually find all your bra pics.

ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?😱😱😱😱

When people tried it, they realized it did indeed find photos of them in a bra:

@ellieeewbu it could definitely be much worse

When I tested this, it did find photos of me wearing a bra (don't @ me, they were mostly photos of my stomach when I was pregnant). It also found some photos of people wearing tank tops and a dress with spaghetti straps, which makes sense: image recognition scanning for visual markers would plausibly tag those kinds of things as a "brassiere." There were a few other misfires: my friend in her strapless wedding dress, and a group of women in matching crop tops and shorts from a competitive wing-eating event. It also found a photo I took of my elbow when it got a weird infection:

Kinda looks like a boob?

If you start typing into your iPhone's photo search, you'll see it suggests all sorts of categories it uses to organize your photos. It categorizes photos by date and location, but also by odd terms like "fisticuffs" and "pooches." Not just dogs, pooches.

So it appears that Apple is teaching its AI to recognize and sort photos in a number of ways. If Apple is training its image recognition to scan as many types of photos as possible, it makes sense to teach it to recognize items of clothing. And the AI does suggest these types of clothing categories:

  • Jeans
  • Denim
  • Denims
  • Blue jeans
  • Dinner jacket
  • Jacket
  • Lab coat
  • Swimsuit
  • Shoe
  • Tennis shoe
  • Shawls
  • Fedora

But the list doesn't include any other "racy" types of clothing or underwear. The search didn't recognize terms like underwear, bikini, panties, nudes, nude, naked, shorts, bathing suit, penis, breasts, or vagina. For some reason, "brassiere" is an outlier.
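
Apple hasn't published how Photos builds this search index, but its public Vision framework ships a similar on-device image classifier, which gives a rough idea of what this kind of tagging looks like in code. Here's a minimal sketch, assuming a recent iOS or macOS SDK; the 0.3 confidence cutoff and the file path are placeholders for illustration, not anything Apple actually uses:

```swift
import Foundation
import Vision

// Run Apple's built-in on-device image classifier over one photo and
// return the labels it's reasonably confident about. This is the public
// Vision API, not whatever private pipeline Photos itself uses.
func classify(imageAt url: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    // results is a list of VNClassificationObservation, one per label
    // the model knows, sorted by confidence.
    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.3 }  // arbitrary cutoff for this sketch
        .map { (label: $0.identifier, confidence: $0.confidence) }
}

// Usage: print every label the classifier assigns to a photo.
// let tags = try classify(imageAt: URL(fileURLWithPath: "/path/to/photo.jpg"))
// tags.forEach { print($0.label, $0.confidence) }
```

The interesting question isn't the classifier itself but which of its labels Apple chose to surface in search, which is what Selman gets at below.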

According to Bart Selman, a professor of computer science at Cornell University who specializes in artificial intelligence, this IS in fact… pretty weird of Apple. "It does seem odd that Apple has a category for it because they don't seem to have many categories at all," Selman said. "I imagine this choice may be due to an overeager Apple coder. Although Apple uses machine learning techniques to identify objects in images, I believe the final choice of categories most likely still involved human approval."

It's worth mentioning that Apple isn't "saving" or stealing your nudes.

This is just a way of sorting the photos that exist only on your iPhone or in your iCloud. BUT, if you're really worried about the security of your racy pics, just remember to take some precautions with your account, like locking your phone and turning on two-factor authentication for your iCloud. Sext safe, my friends.
