When Amazon said it was offering a $25 gift card to people in New York City willing to let it make a 3D scan of their bodies for undisclosed “research” on an “internal project,” most people reacted with a firm no thanks. No one in their right mind is going to strip down to their skivvies and let Amazon hoover up their actual biometric data to do what — make a Terminator out of them?
Except me. I’ve only dreamed of two things: $25 and Jeff Bezos seeing my nudes. So I signed up for the first available 30-minute timeslot to get my body scanned. Unfortunately, my dreams would never come to pass: Amazon refused to scan me unless I signed a nondisclosure agreement (the dreaded NDA) saying I wouldn’t write about the experience of getting scanned. I couldn’t agree to that, but fortunately I found someone who did get scanned and was willing to talk to a reporter about it.
The online sign-up form didn’t say exactly what would happen during the scan.
The form didn’t reveal too many details about what the scan was for. “Participants’ comments and data will be used exclusively for internal product research and not for marketing purposes,” it said. A rep for Amazon declined to share information with BuzzFeed News about the project. Are they making a line of humanoid Terminators? A new RoboCop-type army based on our bodies as an add-on to the Ring doorbell? CLONES?!??!
Aside from an army of Bezobots3000s, clothing is the most reasonable guess. In 2017, Amazon acquired a startup called Body Labs (TechCrunch heard the price tag was around $70 million) that builds 3D models of human bodies, based on real humans, that can then be used for things like designing clothing or creating video games or animations. (Body Labs also has slightly dystopian origins in the early ’00s working with law enforcement to “accurately identify a human being through computer vision techniques.”) Speculation is that Body Labs fits Amazon’s ambitions in the fashion market. In 2017, Amazon launched the algorithm-powered Echo Look, a “style assistant” with an Alexa-connected camera. Maybe Amazon’s working on a custom clothing product, designing its own clothing line, or even creating a tool that helps people figure out what size to buy based on a photo.
I showed up to Amazon’s Body Labs office 15 minutes early, per the instructions. The office was small, in an unfancy building near Union Square. I checked in with a friendly receptionist. She pointed me to a table with several tablet devices. Another woman instructed me to enter my name and follow the instructions on the tablet.
I asked if I could skip the NDA, explaining I was a journalist planning to write an article about the experience. No dice. I asked if they could cut me a break if I skipped the $25 gift card. No dice. The woman at the front desk made a call to someone at another office to tell them there was a journalist there asking to skip the NDA. The person on the phone suggested I contact Amazon’s public relations team and get press-approved for a scan.
So I left their offices and emailed Amazon. A friendly press person told me I absolutely could not do the scan without signing an NDA.
Despite my pleas, Amazon was absolutely refusing to look at my nude bod. I could never be Mr. Bezos’s alive girl.
So I didn’t get the scan. Someone who did get the scan told me they had to fill out additional questions about their clothing size, height, and weight. Next, they were escorted to another room where someone actually measured their height and weight (there’s no lying to Amazon!) and had them step up on a special body fat scale, which sends an electric current through your body and calculates your body fat percentage.
Next, someone used a regular smartphone to take several photos of the volunteer in their street clothes. Then they went into a room with a 360-degree camera setup for more photos, still in street clothes. The volunteer said they were honestly unimpressed with the camera rig, which looked like standard equipment, not some cutting-edge new Amazon gizmo.
They were then given some fitted, skimpy garments to put on (which Amazon promised had been washed). More smartphone photos, then more photos in the 360-degree room. The volunteer was instructed to strike several poses: standing straight, standing with arms held slightly out from the body to make an “A” shape, and a “fun” pose. “They said some people have trouble coming up with a fun pose,” the volunteer said.
Then they put their clothes back on and received a $25 gift card. The whole thing took less than the 30 minutes allotted.
The volunteer wasn’t allowed to see the photos Amazon took, and Amazon didn’t share anything about what the scans would be used for, not even which model of Terminator would take the form of their body.
“From a research ethics standpoint, Amazon should be disclosing as much as it can about the purpose of the research and the intended uses of the data without compromising the integrity of the study, so that people understand what they’re signing up for,” Natasha Duarte, a policy analyst at the Center for Democracy & Technology, told BuzzFeed News.
I asked Amazon’s latest archived body owner whether they worried about privacy. “I’m not insecure about my body, so it’s whatever,” they told me. The volunteer figured all their data was already out there anyway, so what did it matter.
This sense of fatigue about managing our data privacy is understandable. Amazon already knows tons of stuff about me; what does it matter if it also knows the exact curvature of my body? Honestly, knowing what toothpaste brand I buy is probably still more valuable to Amazon than what my butt looks like. The amount of data Facebook and Google possess about us is so vast and unknowable that at some point it’s hard to even really care; there’s nothing we can do, so why bother fighting it? Just lie back and let the waves wash over you.
The volunteer’s reasoning, that a lack of body insecurity makes it easier not to worry about privacy, felt relatable to me too. Before my adventure in body scanning got canceled, I did feel a little weird about the prospect of being half nude in a room full of strangers. But exposing the kind of biometric data Amazon was collecting doesn’t feel emotional or tangible. There’s no visceral response to Amazon owning and storing a literal copy of my desexualized, nearly nude body, some anonymized rendering of ones and zeros that’s useful only in large quantities.
This is the paradox of digital privacy: It’s embarrassing if someone walks in on us in the bathroom, but we have no qualms about giving the daily details of our cervical mucus to our iPhones. The concept of a massive corporation having access to so much data about us is so unknowable and hard to wrap our heads around, we might as well just get (nearly) naked for it.