What did you think of the last advertisement you watched? Was it funny? Confusing? Would you buy the product? You may not remember, or may not know for certain how you felt, but increasingly, machines do. New artificial intelligence technologies are learning to recognize human emotions, and are using that knowledge to improve everything from marketing campaigns to health care.
These technologies are referred to as “emotion AI.” Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions. It’s also known as affective computing, or artificial emotional intelligence. The field dates back to at least 1995, when MIT Media Lab professor Rosalind Picard published “Affective Computing.”
Javier Hernandez, a researcher with the Affective Computing Group at the MIT Media Lab, explains emotion AI as a tool that allows for a much more natural interaction between humans and machines. “Think of the way you interact with other human beings; you look at their faces, you look at their body, and you change your interaction accordingly,” Hernandez said. “How can [a machine] effectively communicate information if it doesn’t know your emotional state, if it doesn’t know how you’re feeling, how you’re going to respond to specific content?”
While humans might currently have the upper hand on reading emotions, machines are gaining ground using their own strengths. Machines are very good at analyzing large amounts of data, said MIT Sloan professor Erik Brynjolfsson. They can listen to voice inflections and start to recognize when those inflections correlate with stress or anger. Machines can analyze images and pick up subtleties in micro-expressions on humans’ faces that might happen too fast for a person to recognize.
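The kind of voice-inflection analysis described above can be caricatured in a few lines. The sketch below is a toy illustration, not any company’s actual method: it treats the frame-to-frame variability of a signal’s loudness as a crude arousal proxy, with synthetic arrays standing in for real recordings.

```python
import numpy as np

def stress_score(samples: np.ndarray, frame: int = 400) -> float:
    """Crude arousal proxy: variability of short-time energy across frames.

    Agitated speech tends to swing between loud bursts and pauses more than
    calm speech, so higher frame-to-frame energy variability stands in here
    for vocal stress. Real systems model far richer acoustic features.
    """
    n = len(samples) // frame
    frames = samples[: n * frame].reshape(n, frame)
    energy = np.sqrt((frames ** 2).mean(axis=1))      # RMS energy per frame
    return float(energy.std() / (energy.mean() + 1e-9))  # coefficient of variation

# Synthetic stand-ins for calm vs. agitated speech (not real audio):
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)
calm = 0.5 * np.sin(2 * np.pi * 120 * t)                        # steady tone
agitated = np.sin(2 * np.pi * 180 * t) * (rng.random(8000) > 0.5)  # bursty tone

assert stress_score(agitated) > stress_score(calm)
```

Production systems train classifiers on labeled recordings rather than a single hand-picked statistic, but the principle is the same: map measurable signal properties onto emotional states.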
“We have a lot of neurons in our brain for social interaction. We’re born with some of those skills, and then we learn more. It makes sense to use technology to connect to our social brains, not just our analytical brains,” Brynjolfsson said. “Just as we can understand speech and machines can communicate in speech, we also understand and communicate with humor and other kinds of emotions. And machines that can speak that language, the language of emotions, are going to have better, more effective interactions with us. It’s great that we’ve made some progress; it’s just something that wasn’t an option 20 or 30 years ago, and now it’s on the table.”
Which industries are already using emotion AI?
Advertising – In 2009, Rana el Kaliouby, PhD ’06, and Picard founded Affectiva, an emotion AI company based in Boston that specializes in automotive AI and advertising research, the latter serving 25% of the Fortune Global 500 companies.
“Our technology captures these visceral, subconscious reactions, which we have found correlate strongly with actual consumer behavior, like sharing the ad or purchasing the product,” said el Kaliouby.
In the case of advertising research, once a client has been vetted and agrees to Affectiva’s terms of use (such as promising not to use the technology for surveillance or lie detection), the client gets access to Affectiva’s technology. With a viewer’s consent, the technology uses the camera on the person’s phone or computer to capture their reactions while they watch a particular ad.
Self-reporting – like feedback during a focus group – is useful, said el Kaliouby, but getting moment-by-moment responses lets marketers really tell whether a particular ad resonated with people, was offensive or confusing, or struck a chord.
Call centers – Technology from Cogito, a company co-founded in 2007 by MIT Sloan alumni, helps call center agents identify the moods of customers on the phone and adjust how they handle the conversation in real time. Cogito’s voice-analytics software is based on years of human behavior research to identify vocal patterns.
Mental health – In December 2018, Cogito launched a spinoff called CompanionMx, along with an accompanying mental health monitoring app. The Companion app listens to someone talking into their phone and analyzes the speaker’s voice and phone use for signs of anxiety and mood changes.
The app improves users’ self-awareness and can build coping skills, including steps for stress reduction. The company has worked with the Department of Veterans Affairs, Massachusetts General Hospital, and Brigham and Women’s Hospital in Boston.
Another mental-health-focused emotion AI technology is a wearable device developed at the MIT Media Lab that monitors a person’s heart rate to tell if they’re experiencing something like stress, pain, or frustration. The wearable then releases a scent to help the wearer cope with the negative emotion they’re feeling in that moment.
The BioEssence wearable detects stress or pain and releases a scent to help the wearer cope with the negative emotion.
Credit: BioEssence / Judith Amores
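The monitor-then-respond loop of a wearable like this can be sketched as a simple feedback rule. The following is purely illustrative logic with assumed thresholds, not the Media Lab’s actual algorithm: track a rolling heart-rate baseline and fire the scent release when a reading spikes well above it.

```python
from collections import deque

class ScentTrigger:
    """Toy model of a stress-responsive wearable loop (hypothetical logic):
    keep a rolling heart-rate baseline and signal a scent release when the
    current reading spikes well above that baseline."""

    def __init__(self, window: int = 5, spike_ratio: float = 1.25):
        self.history = deque(maxlen=window)  # recent beats-per-minute readings
        self.spike_ratio = spike_ratio       # how far above baseline counts as stress

    def update(self, bpm: float) -> bool:
        # Baseline is the mean of recent readings (or the reading itself at start).
        baseline = sum(self.history) / len(self.history) if self.history else bpm
        self.history.append(bpm)
        return bpm > baseline * self.spike_ratio  # True = release scent

trigger = ScentTrigger()
readings = [70, 72, 71, 73, 95]  # resting heart rate, then a sudden spike
fired = [trigger.update(r) for r in readings]
assert fired == [False, False, False, False, True]
```

A real device would smooth out motion artifacts and personalize the threshold per wearer; the sketch only shows the basic sense-and-respond shape of the idea.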
Media Lab researchers have also built an algorithm that uses phone data and a wearable device to predict varying degrees of depression.
Automotive – Hernandez, the Media Lab researcher, is currently working with a team that is putting emotion AI into vehicles.
While a lot of attention has been paid to safety in the environment outside a car, the inside of a car holds a range of distractions that can also affect safety. Consider a car that could tell when a driver was arguing with the passenger next to them, based on elevated blood pressure, and prompted the distracted operator to adjust their speed. Or a sensor that signaled the steering wheel to subtly maneuver the car back to the middle of the lane after a sleep-deprived driver unknowingly drifted toward the curb.
Affectiva has a similar automotive AI service of its own, which monitors the state of the driver and the experience of the occupants to improve road safety and the in-cabin experience.
Assistive technology – Some autistic people find it difficult to communicate emotionally. This is where emotion AI can serve as a kind of “assistive technology,” said Hernandez. Wearable monitors can pick up on signals from an autistic person, such as a rising pulse rate, that others may not be able to see.
Hernandez said there are also “communication prostheses” that help autistic people learn to read other people’s facial expressions. One example is a game in which the person uses the camera on a tablet to identify “smiling” or “frowning” faces on the people around them.
“It’s a way for them to engage with other people and learn how facial expressions work,” said Hernandez, adding that this video technology measuring moods through “smiling” or “frowning” faces could also gather customer feedback in crowded theme parks or hospital waiting rooms, or provide anonymous feedback to senior management in a large office.
Is emotion AI something to welcome or worry about?
Hernandez recommended that any business interested in applying this technology promote a healthy discussion, one that covers its benefits, what is possible with the technology, and how to use it in a privacy-conscious way.
“What I tell companies is to think about what aspects of emotional intelligence should play an essential role in your business,” said Hernandez. “If you were to have this emotional interaction, how would that change things? Can you use technology for that?”
El Kaliouby said she sees potential in expanding the technology to new use cases, for example using the call center technology to understand employees’ emotional well-being, or for other mental health applications. But fear of Big Brother-style surveillance is a legitimate concern, and one that will have to be continuously addressed as privacy and this technology intersect. At this stage, el Kaliouby said, Affectiva requires opt-in consent for all use cases of its technology.
Another thing to keep in mind is that the technology is only as good as its programmer.
Brynjolfsson cautioned that as these technologies are deployed, they must work appropriately for all people, not just be sensitive to the subset of the population they were trained on.
“For example, recognizing emotions on an African-American face can sometimes be difficult for a machine trained on Caucasian faces,” Brynjolfsson said. “And some gestures or voice inflections in one culture can mean something very different in another culture.”
Overall, what’s important to remember is that when the technology is used thoughtfully, its ultimate benefits can and should outweigh the costs, said Brynjolfsson, a sentiment echoed by el Kaliouby, who said it is possible to integrate the technology in a thoughtful way. “The paradigm is not human versus machine; it’s really machine augmenting human,” she said. “It’s a human-machine partnership.”
Illustration: Andrea Mongia
Meredith Somers