Affective computing systems automatically identify human emotions. The field is also called emotion detection, emotion AI, artificial emotional intelligence, or affective AI. This article covers affective computing in detail:
- What is affective computing?
- Can software really understand emotions? What is the theoretical foundation for emotion recognition?
- Why is affective computing relevant now?
- How does it work?
- What are affective computing use cases?
- What are alternatives/substitutes to emotion recognition?
- What are leading companies in emotion detection?
What is affective computing?
Affective computing systems automatically recognize emotions. A fuller definition:
Affective computing is the development of systems that can recognize, interpret, process, and simulate human feelings and emotions.
It may seem strange that machines can do something so inherently human. However, there is growing research showing that human emotions are recognizable from facial and verbal cues.
Understanding emotions is key, especially for companies selling complex products. For those not working in customer-facing functions such as sales, marketing, or customer service, it may not be obvious how affective computing is valuable for businesses. Emotions, guided by the unconscious mind, often drive complex decisions, and such gut-based decisions can outperform deliberate, conscious ones when the decision is complex.
Can software really understand emotions? What is the theoretical foundation for emotion recognition?
People express emotions in surprisingly similar ways across cultures, and machines can pick up the visual and verbal cues of emotions.
A large body of research since the 1970s demonstrates that even pre-literate cultures with minimal exposure to literate cultures can identify basic emotional expressions such as anger, happiness, or surprise. There is also newer, contradicting evidence supporting the theory that individuals express emotions in different ways, and this remains an ongoing topic of debate. However, despite these recent challenges, the theory that different people express emotions in similar ways is still widely accepted.
Machines are already at acceptable levels in identifying emotions from facial expressions. In a 2017 study cited >30 times, researchers achieved a classification accuracy of 73% across 7 emotional states with a relatively simple model based on the Facial Action Coding System (FACS) developed by Ekman, one of the pioneers of research on facial expressions and emotions. However, this was achieved under strictly controlled conditions using 3D Microsoft Kinect cameras. Additionally, experiment participants posed the facial expressions rather than producing them naturally. Despite these caveats, ~70% accuracy is a significant achievement.
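To make the FACS-based approach concrete, here is a minimal sketch of how action-unit (AU) intensities can be mapped to emotion labels. The AU prototypes and the nearest-prototype rule below are illustrative assumptions, not the model or values from the cited study, which used a more elaborate classifier.

```python
import math

# Hypothetical emotion prototypes over four FACS action-unit intensities:
# (AU4 brow lowerer, AU6 cheek raiser, AU12 lip corner puller, AU1+2 brow raiser).
# These numbers are assumptions for illustration only.
EMOTION_PROTOTYPES = {
    "anger":     [0.9, 0.1, 0.0, 0.1],
    "happiness": [0.0, 0.8, 0.9, 0.1],
    "surprise":  [0.0, 0.1, 0.1, 0.9],
}

def classify(au_vector):
    """Return the emotion whose AU prototype is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EMOTION_PROTOTYPES, key=lambda e: dist(au_vector, EMOTION_PROTOTYPES[e]))

print(classify([0.1, 0.7, 0.8, 0.0]))  # a smiling face -> "happiness"
```

The point of the sketch is that once facial movements are coded as AU intensities, emotion recognition reduces to a standard pattern-classification problem.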
Why is affective computing relevant now?
The increasing ubiquity of high-resolution cameras and high-speed internet, along with the growing capabilities of machine learning, especially deep learning, are enabling the rise of affective computing. Affective computing relies on:
- high resolution (ideally) dual cameras to capture videos
- fast broadband connections to communicate those videos
- machine learning models to identify emotions in those videos
All of these have improved greatly since the 2010s: almost all smartphone users now have high resolution cameras, and an increasing number of users have upload speeds capable of real-time video thanks to symmetric fiber-to-the-home (FTTH) installations. Finally, deep learning solutions, which require significant amounts of data and computing power, have become easier to deploy.
How does it work?
Most affective computing systems use labelled training data to train machine learning models which identify emotions in speech or videos. Since the performance of deep learning systems improves with more data, companies in this space are trying to expand their labelled data sets to improve their models.
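A minimal sketch of this supervised approach: learn a per-emotion centroid from labelled feature vectors, then label new inputs by the nearest centroid. The two-dimensional features (imagine, say, vocal pitch and energy) and the labels are made-up illustrative data, not from any real corpus; production systems use deep networks rather than centroids.

```python
import math
from collections import defaultdict

# Made-up labelled examples: ([feature_1, feature_2], emotion_label).
LABELED_DATA = [
    ([0.9, 0.8], "excited"),
    ([0.8, 0.9], "excited"),
    ([0.1, 0.2], "calm"),
    ([0.2, 0.1], "calm"),
]

def train(data):
    """Average the feature vectors of each label into a centroid."""
    groups = defaultdict(list)
    for features, label in data:
        groups[label].append(features)
    return {label: [sum(col) / len(col) for col in zip(*vecs)]
            for label, vecs in groups.items()}

def predict(centroids, features):
    """Assign the label of the nearest centroid."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

model = train(LABELED_DATA)
print(predict(model, [0.85, 0.85]))  # -> "excited"
```

More labelled data yields better centroid estimates here, which mirrors (in a much simpler setting) why companies compete to grow their labelled data sets.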
To normalize facial expressions, affective computing solutions working on images typically use these techniques:
- The face is extracted from the background
- Facial geometry (e.g. locations of eyes, nose, mouth) is estimated
- Based on the facial geometry, facial expressions are normalized, removing the impact of head rotations and other head movements
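The normalization step above can be sketched from just two landmarks. Assuming we already have pixel coordinates for the two eyes (the landmark detection itself is omitted), the in-plane rotation and scale needed to put the eyes on a horizontal line at a fixed distance can be computed as follows; the target eye distance of 60 pixels is an arbitrary assumption.

```python
import math

def normalize_by_eyes(left_eye, right_eye, target_eye_dist=60.0):
    """Return (roll_angle_degrees, scale) that would align a face so the
    eyes are horizontal and target_eye_dist pixels apart. A real system
    applies the resulting affine transform to the face image and uses
    full landmark models (dozens of points), not just two eyes."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = math.degrees(math.atan2(dy, dx))  # head roll to undo
    scale = target_eye_dist / math.hypot(dx, dy)
    return angle, scale

angle, scale = normalize_by_eyes((100, 120), (160, 150))
print(angle, scale)  # a head tilted ~27 degrees, slightly too large
```

After this correction, the same facial expression produces similar pixel patterns regardless of how the head was rotated, which is what makes downstream emotion classification tractable.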
What are affective computing use cases?
Affective computing is an AI tool that can be useful in a wide variety of use cases, including commercial functions and potentially even HR. For example, a department-wide employee engagement metric based on employees' facial expressions could inform the company on how recent developments are impacting morale. Current applications include:
- Marketing: There are numerous startups helping companies optimize marketing spend by allowing them to analyze emotions of viewers.
- Customer service: Both in contact centers and retail locations, startups are providing companies estimates of customer emotions. These estimates are used to guide customer service responses and measure the effectiveness of customer service.
- Healthcare industry: Wearables with the ability to detect emotions, such as Embrace by Empatica, have already been used by researchers to study stress, autism, epilepsy, and other disorders.
- Other: Emotion recognition can complement security and fraud identification efforts as well.
What are alternatives/substitutes to emotion recognition?
Depending on the specific use case, there are alternatives to affective computing. For example:
- Marketing: Instead of relying on the emotions of potential customers, companies more commonly run pilots to assess the likely success of their marketing campaigns.
- Customer service: Voice is a good predictor of emotion, which tends to correlate with customer satisfaction. Companies can also rely on customer satisfaction surveys to track satisfaction. However, surveys are completed at the end of the customer experience and unfortunately do not allow companies to make real-time adjustments or offer real-time guidance to their personnel.
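One reason voice works as a real-time signal is that simple acoustic features already track vocal arousal. The sketch below computes frame-level RMS energy from raw audio samples as a crude agitation proxy; the sample values and frame size are made up, and a real system would combine richer features (pitch, MFCCs) with a trained model.

```python
import math

def rms_energy(samples, frame_size=4):
    """Frame-level RMS energy of an audio signal. Rising energy is a crude
    proxy for vocal arousal; this is an illustrative sketch, not a
    production emotion detector."""
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    return [math.sqrt(sum(s * s for s in f) / len(f)) for f in frames]

quiet = [0.01, -0.02, 0.01, 0.0]    # soft speech (made-up samples)
loud = [0.5, -0.6, 0.55, -0.4]      # raised voice (made-up samples)
energies = rms_energy(quiet + loud)
print(energies)  # second frame has far higher energy than the first
```

Because such features are computed frame by frame, they can flag an escalating call while it is still in progress, which is exactly what end-of-experience surveys cannot do.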
What are leading companies in emotion detection?
We identified the leading companies, with their number of employees to give a sense of their market presence, and their specialization areas. While larger companies specialize in specific solutions like marketing optimization, almost all companies in the space offer APIs so other companies can integrate affective computing into their solutions.
| Company | Number of employees on LinkedIn | Example business applications |
|---|---|---|
| Cogito | 101-200 | customer service optimization, sales optimization, care optimization in healthcare |
| Affectiva | 101-200 | applications in automotive, marketing optimization |
| Real eyes | 51-100 | marketing optimization |
| Kairos | 51-100 | customer service optimization, access control |
| NVISIO | 11-50 | use cases in finance, automotive, healthcare and media |
| wearehuman.io | 1-10 | recruiting & employee retention, financial fraud detection, customer satisfaction analysis, sales prediction and more |
| Sky Biometry | 1-10 | marketing optimization, user authentication |
| CrowdEmotion | 1-10 | marketing optimization, content optimization |
Affective computing is an emerging field and this is our first article on the topic. Feel free to leave a comment or reach out if you have comments or suggestions.