This week, Twitter has been inundated with pictures generated by ImageNet Roulette. The photos are typically of a person's face, overlaid with a green box and an automatically generated label describing, per the software's best guess, the social role that person occupies.
ImageNet Roulette is an art project created by Berlin-based developer Leif Ryge, working in collaboration with artist and researcher Trevor Paglen and AI researcher Kate Crawford. It's part of an exhibition called Training Humans, currently showing in the Osservatorio Fondazione Prada in Milan, which explores how datasets and automated systems represent and classify humans.
ImageNet Roulette, a web extension of the exhibition, is designed to demonstrate how these classification systems see people. But in practice it sucks, turning out mostly gibberish. Worse, when people of color feed in their own images, the app can spit back shockingly racist and vile labels based on their ethnicity.
Ryge and Paglen's studio have not yet responded to a request for comment.