Computers are good at straightforward tasks like crunching numbers and retrieving search results. However, deciphering the complexities of human emotions is still a work in progress. Google and others are developing tools in the field of affective artificial intelligence (AI) that analyze and respond to human emotions and facial expressions in real time.
Rana el Kaliouby, cofounder and CEO of the startup Affectiva, is on a mission to expand what is meant by “artificial intelligence” by creating intelligent machines that understand human emotions. Reviewing the evolution of our interaction with computers, she asks, “what’s the next more natural interface?” and her answer is “conversational and perceptual.”
El Kaliouby and her team are developing what they call “Emotion AI.” They are defining a new market by pursuing two goals: allowing machines to adapt to human emotions in real time and providing insights and analytics so organizations can understand how people engage emotionally in the digital world.
According to Richard Yonck, founder of Intelligent Future Consulting, "In the past number of years, we've moved toward increasingly natural user interfaces – touch, gesture, voice. Emotional awareness is the next natural progression from this." The objective is to create technology that interacts seamlessly with humans: a smartphone that detects when you're feeling down and suggests you get up and move around, a car that knows you're tired and recommends you pull over, or a digital assistant that responds appropriately to your sarcastic tone.
Commercial applications include determining whether consumers engage emotionally with marketing pitches. Hiring managers could more easily screen applicants for customer-facing positions by analyzing their behavior (better candidates tend to show a wider range of emotions). And mental health professionals could use this technology to track the facial and vocal cues of patients to better gauge whether they're depressed, and if so, how severely.
Affectiva, which said it has received $26 million in venture-capital funding, has a database of four billion images of people from around the world. The company uses machine learning to parse pictures of people’s smiles, frowns and grimaces for use in different products. One is a market-research service that films consumer panelists through their webcams—like a remote focus group—and analyzes their real-time reactions to ads and other digital content. More than 1,400 clients have used it, including Mars, Kellogg’s and CBS, the company said.
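The pipeline described above, reducing a face to a handful of expression measurements and mapping them to an emotion label, can be sketched in miniature. This is an illustrative toy, not Affectiva's actual system: the two features (mouth curvature, brow raise), the sample values, and the three emotion labels are all assumptions for demonstration. It uses a simple nearest-centroid classifier in place of the large-scale machine learning a commercial product would require.

```python
import math

# Toy training data: each face is reduced to two hypothetical features,
# (mouth_curvature, brow_raise), with values in [-1, 1]. Real systems
# extract dozens of such facial "action units" from video frames.
TRAINING = {
    "happy":     [(0.9, 0.2), (0.8, 0.3), (0.7, 0.1)],
    "sad":       [(-0.8, -0.3), (-0.7, -0.2), (-0.9, -0.4)],
    "surprised": [(0.1, 0.9), (0.0, 0.8), (0.2, 0.95)],
}

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features):
    """Return the emotion label whose centroid is nearest to `features`."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify((0.85, 0.25)))  # a broad smile -> "happy"
print(classify((-0.6, -0.1)))  # a downturned mouth -> "sad"
```

A production system would replace the hand-made feature vectors with measurements extracted by a face-tracking model from millions of labeled images, but the core idea, comparing a new expression against learned prototypes, is the same.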
- Soul Machines, a New Zealand-based startup, is putting a "human face" on AI with interactive artificial humans built from a combination of neural networks and brain models powered by IBM Watson and IBM Cloud. "Sarah" features artificially generated empathetic facial gestures and natural voice intonation that not only feel more humanlike but will eventually recognize nonverbal behavior in real time using face recognition.
- In January 2018, Annette Zimmermann, vice president of research at Gartner, predicted: "By 2022, your personal device will know more about your emotional state than your own family." Only two months later, a landmark study from Ohio State University claimed its algorithm was now better at detecting emotions than people are.
- Emotionally intelligent AI could be used for nefarious purposes, including identifying dissent in repressive regimes. China already uses AI-powered facial-recognition technology at subway stations, airports, border crossings and on the street to track and identify people who break the law.
Quests and Actions (Q&A)
- According to the U.S. National Institute of Mental Health, about 20% of American adults experience some mental illness, but fewer than half get treatment. Could artificially intelligent tools such as chatbots capable of simple text or voice conversations help fill the gap—augmenting but not replacing their human counterparts?
- If researchers succeed in teaching computers the language of emotions, could this make them better tools for improving our welfare? What are some examples?
- Could emotional AI help leaders be more emotionally in tune with their team members? At what point could this become too intrusive?