Getting dressed in the morning can be a chore. So for a week, I outsourced the job to Alexa.
Millions of us use Alexa, the artificial intelligence technology behind Amazon’s Echo smart speakers, to get news updates, answer trivia questions, and play music. But the Echo Look, a $199 model released last month, adds a new twist: Alexa can “help you look your best.”
The Echo Look experience is eerily familiar to anyone who remembers the desktop computer program that helped Alicia Silverstone’s Cher Horowitz pick just the right outfit in the ’90s film classic “Clueless.” The six-inch device syncs up with your phone and Amazon account and has a hands-free camera to take full-body pictures or 360-degree videos of your outfits. Start voguing for the Look, and Alexa’s artificial intelligence kicks in, organizing your daily wardrobe and suggesting new pieces — for purchase on Amazon, of course.
A “Style Check” feature compares two outfits side-by-side and uses machine-learning algorithms to pick a winning look. You can even have your images scrutinized by Amazon’s team of fashion specialists or other users via Amazon’s Instagram-like social stream, Spark.
Amazon has been testing the phone app version of the Echo Look as one of its many inroads into the retail clothing industry. The company has more than 65 private-label clothing brands, and last month opened its subscription fashion box service, Prime Wardrobe, to all of its Prime members.
Kenlyn Jones, an assistant professor at the Massachusetts College of Art and Design, said that while Amazon has revolutionized retail, “they haven’t been able to transition into fashion” as easily. The Echo Look, she said, is helping the company break into the sector by capturing data about all of the clothing purchases we’ve been making elsewhere.
Intrigued, I asked the company for a review copy of the new gadget. I should note that right now, my style is limited in scope. At seven months pregnant, I wanted to see if the Echo Look was up for a challenge.
Setup is a snap: I download the Echo Look app, log on to Amazon, and connect the speaker to my home Wi-Fi. Within seconds, a friendly voice that sounds curiously like Rachel Zoe’s asks to take a picture of my outfit. (The celebrity stylist is a partner in the Echo Look’s launch).
I position the Look’s camera so that it gets me from head to toe.
“Alexa, take a picture,” I say. After a prolonged flash, it grabs my image.
Yikes. The photo is terrible: My hair isn’t dry, I have no makeup on, and the angle is awful (oh, so many chins). I reach for my hair dryer and realize: The Look has already made me far more self-conscious.
Primped and ready, I make a second attempt. Using Style Check, I try on a blue-and-white striped dress, and then a red-and-white striped dress (I like stripes). In a few seconds I learn that blue-and-white is the winner; “the colors work better together” and the “outfit shape works better” for me. A third outfit, a patterned blue-and-pink wrap dress, beats the blue-and-white number.
“Better colors,” the Look tells me through the app.
Sure enough, arriving at the office later that day, I’m immediately greeted with compliments from colleagues.
“Thanks,” I reply. “Amazon dressed me today.”
So what’s the AI doing, exactly? Jones says the images are probably being scanned to make an estimate about my measurements and body type, using the same proportion calculations that you’d use in traditional pattern making. Combined with the data it has gathered on fashion trends, it’s “figuring out what statistically works best for your body,” she said.
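To make Jones's description concrete, here is a purely hypothetical sketch of that kind of proportion math. Everything in it is invented for illustration (the ratios, thresholds, and function names are my guesses, not Amazon's system): measure a few body ratios of the sort used in traditional pattern making, then favor whichever outfit's silhouette sits closer to them.

```python
# Illustrative only: a toy "style check" based on proportion ratios.
# All numbers and rules are invented; Amazon's actual model is unknown.

def proportions(shoulder_px: float, waist_px: float, hip_px: float) -> dict:
    """Ratios of the kind used in traditional pattern making."""
    return {
        "shoulder_to_waist": shoulder_px / waist_px,
        "hip_to_waist": hip_px / waist_px,
    }

def style_check(outfit_a: dict, outfit_b: dict, body: dict) -> str:
    """Pick the outfit whose silhouette ratios sit closest to the body's."""
    def distance(outfit: dict) -> float:
        return sum(abs(outfit[k] - body[k]) for k in body)
    return "A" if distance(outfit_a) <= distance(outfit_b) else "B"

# Body measurements estimated (hypothetically) from a full-length photo,
# in pixels; only the ratios matter, not the absolute values.
body = proportions(shoulder_px=110, waist_px=80, hip_px=105)

# Two candidate outfits, described by the silhouette ratios they create.
outfit_a = {"shoulder_to_waist": 1.35, "hip_to_waist": 1.30}
outfit_b = {"shoulder_to_waist": 1.60, "hip_to_waist": 1.05}

print(style_check(outfit_a, outfit_b, body))  # → A
```

A real system would presumably learn these preferences from trend and purchase data rather than hand-coded ratios, but the basic move — reduce a photo to measurements, then score outfits against them statistically — is the one Jones describes.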
Throughout the week, as I snap a few outfits each morning, the Look builds a profile of my closet. I begin asking the Look for fashion advice.
“Denim-on-denim is here to stay. Try mixing light and dark washes,” the Look tells me.
I cringe. I’m against the “denim tuxedo” concept.
The next day, it recommends purchasing a quilted fanny pack. Um, just no.
On another day, I toggle to the Look app, which suggests that my black stretchy pants would work well with a military-green BCBGeneration wrap shirt. I click on the image, and it sends me directly to my Amazon app. One click, and for $59, it’s mine.
This is a major objective of the Echo Look, says John Cheney-Lippold, a professor of American culture at the University of Michigan who specializes in Internet studies.
Amazon may make us feel like it’s creating personalized offerings for us, but really, it’s just gathering data in hopes of getting all of us to buy more stuff. The profile that Amazon has created about me, based on my previous purchases of books, coffee pots, and garden tools, is easily matched to other women of my age and demographic, whose tastes, Amazon is betting, look similar to mine. Maybe I won’t buy the wrap shirt, but chances are someone else will.
“It’s a really dehumanizing way of understanding the world,” Cheney-Lippold jokes.
More disturbingly, my pregnancy is also probably of interest to Amazon, he adds; its algorithms may be able to spot that I’m a great candidate for their diaper subscription service. Amazon may also be scoping out my mid-century modern furniture and floral bedspread in the background of my photos, Cheney-Lippold says, in hopes of selling me matching pieces to fill out the room.
I tell Cheney-Lippold I’m feeling creeped out. He laughs. He isn’t done.
The device can get a read on my emotions and discern my gender, my ethnicity, and even my class, based on cues such as bedroom size or the amount of natural light in the photo, Cheney-Lippold tells me.
I raised these issues with a company rep, who assured me that “we do not identify items in your photos that are not related to your outfit.”
But Amazon hasn’t updated its privacy policies to make that explicit. And Cheney-Lippold said that as the technology advances, it might eventually want to do more with the images than we might realize.
Facebook saw a massive pushback from customers after the political research firm Cambridge Analytica gained access to the personal data of millions of the social network’s users.
Cheney-Lippold said individual privacy may be less of a concern with Amazon, because it has a business interest in keeping customers’ trust. But he said every time someone uses the Echo Look, he or she is helping Amazon improve its machine-learning capabilities — and no one quite knows how the company will put those capabilities to use in the future.
So would I use the device? It’s certainly helpful, and fun to play with. And if I were in desperate need of a closet update, as I might be in a few months, it would make placing outfit orders nearly effortless.
But do I trust the device to dress me? I don’t think so. Style, as I see it, is a combination of personal taste, life experience, and real-time emotion. And I don’t trust an algorithm to know me that well.