Last week Affectiva, a startup developing emotion recognition technology, announced it had raised $12 million in funding, a pretty hefty sum for a startup that’s not trying to be another Instagram of video or Groupon for dogs. The company spun out of research from the affective computing group at MIT, which has spent several years designing technology that can understand human emotion — which sounds creepier than it is.
Affectiva isn’t trying to peer into your innermost thoughts; it just wants to watch what happens to your face and skin when you’re watching an ad or using your phone, so that technology can sense when you’re frustrated, confused, bored or (maybe) even happy.
“Affective computing, when I envisioned the whole field originally, was to make people’s communication with technology more respected. Instead of ignoring these important signals, it should acknowledge when it’s frustrating for us and try to do better,” said Rosalind W. Picard, the pioneer of affective computing as well as chief scientist and co-founder of Affectiva.
There are two ways Affectiva’s trying to do this — with Affdex, webcam-based software that tracks the muscle movements of your face, and with Affectiva Q, a wearable sensor that measures subtle electrical changes on your skin. While they differ in what they measure (muscle movement vs. skin conductance), they’re both trying to answer the same question: how does what you’re seeing, or using, make you feel?
“It’s recognizing changes in your face and skin, looking at patterns of those changes. When we get lots and lots of data for certain facial expressions, we try to make an informed guess about what emotional states are most likely,” Picard told me. “It doesn’t directly read your emotions; it reads patterns, like people.”
But people are inherently different, and even for similar emotions, we might make very different facial expressions. A furrowed brow can mean confusion or skepticism, and frustration can even take the shape of a smile, says Picard. That’s why Affectiva’s trying to watch as many faces as possible, to gather data for every kind of emotional response. The more data they get, the more accurate the technology will eventually become at reading your mind. (Or, like, your face, anyway.)
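To make the idea concrete: the company isn’t disclosing its actual models, but the pattern-matching approach Picard describes — collect lots of labeled facial-feature data, then guess the most likely emotion for a new face by comparing it to examples you’ve already seen — can be sketched as a toy nearest-neighbor classifier. Everything here (the feature names, the data, the labels) is invented for illustration, not taken from Affectiva:

```python
import math
from collections import Counter

# Toy sketch, not Affectiva's method: describe each observed face as a small
# vector of hypothetical facial-action intensities, e.g.
# (brow_furrow, smile, lip_press), each in the range 0.0 to 1.0.

# Invented labeled observations: (feature vector, emotion label).
# Note the last row: frustration can include a smile, as Picard says.
TRAINING_DATA = [
    ((0.9, 0.1, 0.2), "confusion"),
    ((0.8, 0.0, 0.3), "confusion"),
    ((0.7, 0.2, 0.1), "skepticism"),
    ((0.1, 0.9, 0.0), "happiness"),
    ((0.2, 0.8, 0.1), "happiness"),
    ((0.6, 0.7, 0.5), "frustration"),
]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def guess_emotion(features, k=3):
    """Guess the emotion by majority vote among the k nearest labeled faces."""
    nearest = sorted(TRAINING_DATA, key=lambda item: distance(features, item[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

With data this sparse, a heavily furrowed brow like `guess_emotion((0.85, 0.05, 0.25))` lands on "confusion" — which is exactly why the real system needs as many faces as it can get: the more labeled examples, the better the informed guess.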
Shopper Sciences, a market research agency, used Affectiva’s technology to track the emotions of shoppers over Black Friday last year. They asked in-store shoppers to wear the Affectiva sensor wristband and online shoppers to allow a webcam to watch their faces, in an effort to understand how different emotions might affect spending. They found that people online and in-store experienced similar levels of excitement and stress while shopping, but those who browsed online before heading into stores were less stressed and more confident — and spent an average of $400 more.
That capability is exactly why marketing companies and investors are deeply interested in spending tons of money on something that seems ridiculously difficult and highly speculative: Billions of dollars are thrown away on advertising every year, and no one really knows how it makes most people feel, says Picard. “They don’t know if it’s boring, amusing, annoying or interesting you. They don’t know if you’re scowling, frowning or shaking your head — and that’s a pretty serious waste of money.”