5 Ways Computer Vision Is Giving the Fashion Industry a Makeover

If you had any doubt about how fashionable computer vision has become as a marketing tool, just look to the fashion industry itself.

Over the past few years, fashion brands and retailers have been rapidly implementing computer-vision-powered solutions. And given that fashion is a $2.4 trillion global industry, those brands and retailers have the potential to, in turn, transform computer vision—because computer vision applications that work at scale in a sector as large as fashion, which is inherently visual to boot, are likely to gain traction across other sectors as well. Here are five cutting-edge examples of how computer vision is transforming fashion today:



Just in time for the fall fashion shows, the Tommy Hilfiger brand updated its TommyNow Snap app to combine image recognition and augmented reality (AR) to make for a smart and immersive mobile shopping experience. See a Tommy Hilfiger outfit or item in a store, in an ad or even in a runway photo? Snap it with the app and TommyNow lets you buy it online or save it to your own lookbook—and also use computer-vision-enabled AR to have your chosen avatar virtually model the look on your own virtual catwalk.

The marketing implications: If the rise of e-commerce sites meant you could shop anywhere, computer vision and AR mean the world around you will become shoppable and, ultimately, more and more store-like, even if you aren’t actually inside a physical store. Likewise for e-tailers: products no longer have to be “trapped” in two dimensions as product shots on a website—a new (virtual) reality with repercussions for marketers of every kind of product, not just the fashionable sort.



For millions of millennial males, sneakers are fashion. Over the summer, Nike launched a new computer-vision/AR capability in its iOS SNKRS app to allow users to “unlock” the opportunity to buy the Nike SB Dunk High “Momofuku”—a collaboration with celebrity restaurateur (and lifestyle icon) David Chang, whose fans are known as “Changheads.” As Complex reports, SNKRS app users were asked to point their phone’s camera at the menu at Chang’s Fuku restaurant in Manhattan or at Nike posters plastered outside of Momofuku restaurants in U.S. cities, including Washington, D.C. and Las Vegas, causing a 3D image of the sneaker to pop up, along with the opportunity to buy it.

The marketing implications: The AR-assisted “sneaker drop” not only added an experiential element to what could have otherwise been a standard e-commerce transaction, but it helped Nike get the Momofuku sneaks into the hands (and onto the feet) of actual fans rather than opportunistic resellers. (A major black market has sprung up around the resale of limited-edition “kicks.”)



How do you figure out what’s really happening in fashion—i.e., what people are really wearing? Fashion magazines, of course, offer only part of the picture. The explosion of street-style blogs and Instagram feeds is helpful, but ultimately overwhelming. Enter computer vision as applied by Cornell University researchers, who released a study titled “StreetStyle: Exploring world-wide clothing styles from millions of photos” this summer. A custom computer-vision neural network was trained on 15 million photographs uploaded to photo-sharing services and social media platforms, learning in the process to detect everything from the sleeve length of shirts to the color of pants. The result: the sort of insight, delivered in just a few hours, that even armies of fashionistas scouring The Sartorialist and its online ilk for months could not match.

The marketing implications: As lead researcher Kavita Bala, a Cornell professor of computer science, explains, “Using these detected attributes, we can then derive visual insight. For example, where in the world is wearing hats more common? At what time of the year? Which colors are more popular in summer vs. winter?” In other words, computer vision can be deployed to figure out, at scale, exactly what people are wearing and when—helping manufacturers calibrate supply against actual demand, optimize seasonal product release schedules and gather hard data about fashion-item life cycles.
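The kinds of questions Bala describes boil down to grouping per-photo attribute detections by place and time. Here is a minimal, standard-library-only sketch of that aggregation step; the records, field names and attribute values below are invented for illustration and are not taken from the Cornell dataset (in a StreetStyle-like pipeline, each record would come from the trained network):

```python
from collections import Counter, defaultdict

# Hypothetical per-photo detections: city, month and detected attributes.
# In a real pipeline these would be emitted by the neural network.
detections = [
    {"city": "Oslo",   "month": 1, "hat": True,  "shirt_color": "black"},
    {"city": "Oslo",   "month": 7, "hat": False, "shirt_color": "white"},
    {"city": "Manila", "month": 1, "hat": False, "shirt_color": "white"},
    {"city": "Manila", "month": 7, "hat": False, "shirt_color": "yellow"},
    {"city": "Oslo",   "month": 1, "hat": True,  "shirt_color": "gray"},
]

def hat_rate_by_city(records):
    """Fraction of photos per city in which a hat was detected."""
    totals, hats = Counter(), Counter()
    for r in records:
        totals[r["city"]] += 1
        hats[r["city"]] += r["hat"]
    return {city: hats[city] / totals[city] for city in totals}

def color_popularity_by_season(records):
    """Shirt-color counts split into winter (Nov-Apr) and summer (May-Oct)."""
    by_season = defaultdict(Counter)
    for r in records:
        season = "winter" if r["month"] in (11, 12, 1, 2, 3, 4) else "summer"
        by_season[season][r["shirt_color"]] += 1
    return dict(by_season)

print(hat_rate_by_city(detections))
print(color_popularity_by_season(detections))
```

The heavy lifting in the real study is the attribute detector itself; once every photo is reduced to a small record like the ones above, the trend questions become simple counting.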



London-based, fashion-focused retail-analytics company Edited also uses computer vision to “look” at clothing—but in a highly strategic, competitor-focused way. Edited deploys Google-style web spiders that crawl the e-commerce sites of apparel brands and retailers around the world daily, using machine learning to read visual information out of product pictures and to collect and analyze hundreds of data points on a specific item, everything from color and cut to release dates and competing products.

The marketing implications: Fashion-industry customers can then use the system to get a real-time read on the competitive landscape. For instance, they can instantly find out the average price point for virtually every red dress sold online in the U.S., how products such as those dresses evolve over time (e.g., when hemlines lengthen or shorten) and when new versions are introduced and discontinued. For fashion houses and beyond, that means the end of the guessing game about what’s actually available in the marketplace at any given time.
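The crawl-and-extract half of that pipeline can be sketched with the standard library alone. The markup, class names and snapshots below are invented for illustration (Edited's actual scrapers and image analysis are proprietary, and a real crawler would fetch pages over HTTP rather than from inline strings):

```python
import re
from html.parser import HTMLParser

# Two hypothetical daily snapshots of the same product page.
SNAPSHOTS = {
    "2017-09-01": '<div class="product"><span class="name">Red midi dress</span>'
                  '<span class="price">$89.00</span></div>',
    "2017-09-15": '<div class="product"><span class="name">Red midi dress</span>'
                  '<span class="price">$62.30</span></div>',
}

class ProductParser(HTMLParser):
    """Pull the text out of spans whose class is 'name' or 'price'."""
    def __init__(self):
        super().__init__()
        self.fields, self._current = {}, None
    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._current = cls
    def handle_data(self, data):
        if self._current:
            self.fields[self._current] = data
            self._current = None

def extract(html):
    """One structured record (two of Edited's 'hundreds of data points')."""
    p = ProductParser()
    p.feed(html)
    price = float(re.sub(r"[^\d.]", "", p.fields["price"]))
    return {"name": p.fields["name"], "price": price}

history = {day: extract(html) for day, html in sorted(SNAPSHOTS.items())}
prices = [rec["price"] for rec in history.values()]
discount = 1 - prices[-1] / prices[0]
print(f"{history['2017-09-15']['name']}: marked down {discount:.0%}")
```

Comparing the daily records is what turns a crawl into competitive intelligence: the same diffing logic that spots this markdown also flags new colorways, discontinued items and shifting price points across an entire market.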



Finally, at a more personal, one-on-one level, consider Amazon’s integration of computer vision into its line of Echo digital assistants in the form of the Echo Look. The device’s built-in camera turns it into a virtual stylist: after you let it take pictures of you, it offers a mix of algorithmic and actual human-stylist advice on which outfit to wear (and, of course, recommends new purchases based on your existing wardrobe).

The marketing implications: The jury is still out on the Echo Look—the device has been on the market less than six months, and Amazon doesn’t break out sales figures for its Echo line. It’s also not clear exactly how the mix of AI and human stylists works together, but the humans are likely part of the overall training process, correcting or reinforcing the AI when it makes a bad fashion call. Regardless, it’s easy to imagine computer-vision-powered shopping assistants like the Look soon being deployed across all kinds of product lines beyond apparel. The idea at play here is incredibly powerful: If brands can use computer vision to “see” and understand their customers one by one, then they’ll have the ability to truly customize marketing for, quite literally, individuals.
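One plausible reading of that human-in-the-loop training is simple correction-driven learning: the model scores an outfit, a stylist overrides bad calls, and each override becomes a labeled example. Here is a toy sketch under that assumption; the outfit features, update rule and threshold are all invented and have nothing to do with Amazon's actual system:

```python
# Toy human-in-the-loop learner: a linear scorer over invented outfit
# features, nudged toward the stylist's verdict whenever the two disagree.
FEATURES = ["color_clash", "pattern_mix", "fit_loose"]

weights = {f: 0.0 for f in FEATURES}

def score(outfit):
    """Score >= 0 means the model says 'wear it'."""
    return sum(weights[f] * outfit[f] for f in FEATURES)

def train_on_feedback(outfit, stylist_says_wear, lr=0.5):
    """Perceptron-style update: only learn when the model got it wrong."""
    model_says_wear = score(outfit) >= 0
    if model_says_wear != stylist_says_wear:
        sign = 1.0 if stylist_says_wear else -1.0
        for f in FEATURES:
            weights[f] += lr * sign * outfit[f]

# A stylist repeatedly vetoes clashing colors; the model picks that up.
clashing = {"color_clash": 1.0, "pattern_mix": 0.0, "fit_loose": 0.0}
matching = {"color_clash": 0.0, "pattern_mix": 0.0, "fit_loose": 1.0}
for _ in range(3):
    train_on_feedback(clashing, stylist_says_wear=False)
    train_on_feedback(matching, stylist_says_wear=True)

print(score(clashing) < 0, score(matching) >= 0)  # True True
```

The appeal for a product like the Look is that every stylist correction does double duty: it answers one customer's question today and makes the automated advice slightly better for everyone tomorrow.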

Illustrations by Tara Jacoby