BuzzFeed News Home Reporting To You


People Are Freaking Out That iPhones Categorize "Brassiere" Pics

Some women are creeped out by how Apple's image recognition AI works.

Posted on October 31, 2017, at 5:35 p.m. ET

This week, someone noticed that if you search the term "brassiere" in your iPhone's photos, it will actually find all your bra pics.

ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?😱😱😱😱

When people tried it, they realized it did indeed find photos of them in a bra:

@ellieeewbu it could definitely be much worse

When I tested this, it did find photos of me wearing a bra (don’t @ me, they were mostly all photos of my stomach when I was pregnant). It also found some photos of people wearing tank tops and a dress with spaghetti straps — I guess that makes sense; a robot scanning for markers would visually identify those kinds of things as a “brassiere.” There were a few other misfires: my friend in her strapless wedding dress, and a group of women in matching crop tops and shorts from a competitive wing-eating event. It also found a photo I took of my elbow when it got a weird infection:

Kinda looks like a boob?

If you start typing into your iPhone's photo search, you'll see it suggests all sorts of categories that it is using to organize your photos. It categorizes by date and location, but also by weird terms like "fisticuffs" and "pooches." Not just dogs, pooches.

So it appears that Apple is teaching its AI to recognize and sort photos in a number of ways. If Apple is teaching its image recognition to scan as many types of photos as possible, it makes sense to teach it to recognize items of clothing. And I did find the AI suggests these types of clothing categories:

  • Jeans
  • Denim
  • Denims
  • Blue jeans
  • Dinner jacket
  • Jacket
  • Lab coat
  • Swimsuit
  • Shoe
  • Tennis shoe
  • Shawls
  • Fedora

But there weren't any other "racy" types of clothing or underwear. It couldn't find terms like underwear, bikini, panties, nudes, nude, naked, shorts, bathing suit, penis, breasts, vagina. For some reason, "brassiere" is an outlier.

According to Bart Selman, a professor of computer science at Cornell University who specializes in artificial intelligence, this IS in fact… pretty weird of Apple. “It does seem odd that Apple has a category for it because they don’t seem to have many categories at all,” Selman said. “I imagine this choice may be due to an overeager Apple coder. Although Apple uses machine learning techniques to identify objects in images, I believe the final choice of categories most likely still involved human approval.”
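The pipeline Selman describes — a model scoring every photo against many labels, but only human-approved labels ever surfacing as search categories — can be sketched as a toy. This is a hypothetical illustration, not Apple's actual code: the photo IDs, labels, scores, and `classify` stub are all made up for the example.

```python
# Toy sketch of "ML labeling + human-approved category list."
# Everything here (photo IDs, labels, scores) is hypothetical.

def classify(photo_id: str) -> dict[str, float]:
    """Stand-in for a real on-device image model: photo -> {label: confidence}."""
    fake_scores = {
        "IMG_001": {"brassiere": 0.92, "tank top": 0.40},
        "IMG_002": {"dog": 0.97, "pooch": 0.88},
        "IMG_003": {"underwear": 0.90},  # scored by the model, but not approved below
    }
    return fake_scores.get(photo_id, {})

# Human-curated whitelist: only these labels become searchable categories,
# which would explain why "brassiere" appears but "underwear" does not.
APPROVED = {"brassiere", "jeans", "dog", "pooch", "fedora"}
THRESHOLD = 0.8  # minimum confidence before a label is indexed

def build_index(photo_ids: list[str]) -> dict[str, list[str]]:
    """Map each approved, high-confidence label to the photos that match it."""
    index: dict[str, list[str]] = {}
    for pid in photo_ids:
        for label, score in classify(pid).items():
            if label in APPROVED and score >= THRESHOLD:
                index.setdefault(label, []).append(pid)
    return index

index = build_index(["IMG_001", "IMG_002", "IMG_003"])
print(index)  # "underwear" never appears: the model scored it, but no human approved it
```

On this toy data, searching "brassiere" returns IMG_001 while "underwear" returns nothing — the filtering happens at the category list, not in the model itself, which matches Selman's guess that humans picked the final categories.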

It's worth mentioning that Apple isn't "saving" or stealing your nudes.

This is just a way of sorting the photos that exist only on your iPhone or in your iCloud. BUT, if you're really worried about the security of your racy pics, just remember to take some precautions with your account, like locking your phone and enabling two-factor authentication for your iCloud. Sext safe, my friends.

    Katie Notopoulos is a senior reporter for BuzzFeed News and is based in New York. Notopoulos writes about tech and internet culture and is cohost of the Internet Explorer podcast.

