We created real-time exception-spotting technology that scanned crowds, gathered inputs, and identified the anomalies and exceptions to the "norm."
In the week leading up to the activation on September 6, computer vision technology captured statistics about fashion choices of thousands of pedestrians in New York's Soho neighborhood. On the day of the event, this data was used to identify who the exceptions to the norm were.
For the activation, we set up a camera on a corner in Soho that continuously streamed video into an AI system that analyzed pedestrians' clothing.
The computer vision technology recognized details like patterns, styles, and accessories, as well as colors and shapes. Unlike editorial street photography, the technology removed human bias from the process of finding individuals who stood out from the crowd. During the real-time activation, the pedestrians identified as unique were broadcast on a large screen and given a pair of New Balance sneakers. The system processed each passer-by in five steps:
1. Detect human figures, and cut each detected person into a separate image.
2. Use face detection to determine whether the person was facing the camera. We didn't want people missing the experience as they passed by!
3. Locate the person's shoulders, hips, and knees, and use them to crop out body regions corresponding to different articles of clothing. For example, the region from the shoulders to the hips would be a person's top.
4. Send each region through a set of "style detection" models to determine color, pattern, shape, and more.
5. Compare these results with the data gathered during the week before the event to score how uniquely the person was dressed.
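Step 3 can be sketched as a small geometric helper. This is a minimal illustration, not the production code: it assumes a pose estimator has already produced (x, y) pixel keypoints for each detected person, and the `region_boxes` helper and region names are hypothetical.

```python
# Sketch of step 3: derive clothing-region crop boxes from body keypoints.
# Assumes keypoints are (x, y) pixel coordinates from an upstream pose
# estimator; the function and region names are illustrative.

def region_boxes(keypoints):
    """Return {region: (x_min, y_min, x_max, y_max)} crop boxes."""
    xs = lambda names: [keypoints[n][0] for n in names]
    ys = lambda names: [keypoints[n][1] for n in names]

    shoulders = ["left_shoulder", "right_shoulder"]
    hips = ["left_hip", "right_hip"]
    knees = ["left_knee", "right_knee"]

    # Top: shoulders down to hips. Bottom: hips down to knees.
    return {
        "top": (
            min(xs(shoulders + hips)), min(ys(shoulders)),
            max(xs(shoulders + hips)), max(ys(hips)),
        ),
        "bottom": (
            min(xs(hips + knees)), min(ys(hips)),
            max(xs(hips + knees)), max(ys(knees)),
        ),
    }

# Example with made-up keypoints (y grows downward, as in image space):
kp = {
    "left_shoulder": (40, 100), "right_shoulder": (90, 102),
    "left_hip": (45, 200), "right_hip": (85, 198),
    "left_knee": (48, 300), "right_knee": (82, 302),
}
boxes = region_boxes(kp)
```

Each box can then be cut from the frame and passed to the downstream style models.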
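Step 5 amounts to a rarity score. A minimal sketch, assuming each passer-by is reduced to a set of detected attributes and the week of Soho observations is reduced to per-attribute frequency counts; the averaged-rarity scoring rule and the attribute names here are assumptions for illustration.

```python
# Sketch of step 5: score how uniquely a person is dressed against
# baseline frequencies gathered before the event. The scoring rule
# (mean rarity of each attribute) is an assumption, not the real model.
from collections import Counter

def uniqueness(person_attrs, baseline, total):
    """Score in [0, 1]: higher means fewer people shared these attributes."""
    rarities = [1 - baseline[a] / total for a in person_attrs]
    return sum(rarities) / len(rarities)

# Made-up baseline: counts of attributes seen among 1,000 pedestrians.
week = Counter({"black_top": 700, "blue_jeans": 550,
                "plaid_top": 30, "yellow_pants": 5})
n = 1000

common = uniqueness(["black_top", "blue_jeans"], week, n)
rare = uniqueness(["plaid_top", "yellow_pants"], week, n)
```

With numbers like these, the commonly dressed pedestrian scores well below the rare outfit, which is the signal used to pick who appears on the screen.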