Affective computing systems automatically identify human emotions. The field is also called emotion detection, emotion AI, artificial emotional intelligence, or affective AI. This article covers affective computing in detail:
- What is affective computing?
- Can software really understand emotions? What is the theoretical foundation for emotion recognition?
- Why is affective computing relevant now?
- How does it work?
- What are affective computing use cases?
- What are alternatives/substitutes to emotion recognition?
- What are leading companies in emotion detection?
What is affective computing?
Affective computing, also known as emotion AI, is an emerging technology that enables computers and systems to identify, process, and simulate human feelings and emotions. It is an interdisciplinary field that leverages computer science, psychology, and cognitive science.
While it may seem unusual that computers can do something so inherently human, research shows that they can recognize emotions from visual, textual, and auditory sources with acceptable accuracy. With the insights gained from emotion AI, businesses can offer better services to their customers and make better decisions in customer-facing processes like sales, marketing, or customer service.
Feel free to read more about what affective computing is in our related article.
Can software really understand emotions? What is the theoretical foundation for emotion recognition?
Although whether computers are capable of recognizing emotions remains a big question, recent research shows that AI can identify emotions through facial expressions and speech recognition. Because people express their feelings in surprisingly similar ways across different cultures, computers can detect these cues to classify emotions accurately. In studies conducted between 2003 and 2006, machines even outperformed humans, achieving an accuracy of 70 to 80% versus roughly 60% for humans. You can read more about this in the related section of our article.
Why is affective computing relevant now?
The increasing ubiquity of high-resolution cameras and high-speed internet, along with the capabilities of machine learning, especially deep learning, is enabling the rise of affective computing. Affective computing relies on:
- high-resolution (ideally dual) cameras to capture video
- fast broadband connections to transmit that video
- machine learning models to identify emotions in that video
All of these have improved greatly since the 2010s: almost all smartphone users have high-resolution cameras, and an increasing number of users enjoy upload speeds capable of real-time video thanks to symmetric fiber-to-the-home (FTTH) installations. Finally, deep learning solutions, which require significant amounts of data and computing power, have become easier to deploy.
How does it work?
Most affective computing systems use labelled training data to train machine learning models that identify emotions in speech or videos. Since the performance of deep learning systems improves with more data, companies in this space are trying to expand their labelled datasets to improve their models.
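As a simplified illustration of this supervised approach (not any specific vendor's pipeline), the sketch below trains a nearest-centroid classifier on toy labelled feature vectors. The feature values and labels are invented for illustration; real systems would extract features from facial landmarks or audio and use far larger models and datasets.

```python
import numpy as np

# Toy labelled training data: each row is a feature vector (e.g. derived
# from facial landmarks), paired with an emotion label. Values are invented.
X_train = np.array([
    [0.9, 0.1],  # e.g. wide smile, relaxed brows
    [0.8, 0.2],
    [0.1, 0.9],  # e.g. flat mouth, furrowed brows
    [0.2, 0.8],
])
y_train = np.array(["happy", "angry"])[[0, 0, 1, 1]]

def fit_centroids(X, y):
    """Compute one mean feature vector (centroid) per emotion label."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(centroids, x):
    """Assign the label of the centroid closest to feature vector x."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

centroids = fit_centroids(X_train, y_train)
print(predict(centroids, np.array([0.85, 0.15])))  # happy
print(predict(centroids, np.array([0.15, 0.85])))  # angry
```

More labelled examples shift the centroids toward the true class means, which is one intuition for why larger labelled datasets improve such models.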
To normalize facial expressions, affective computing solutions working on images use these techniques:
- The face is extracted from the background.
- Facial geometry (e.g., the locations of the eyes, nose, and mouth) is estimated.
- Based on the facial geometry, facial expressions are normalized, removing the impact of head rotation and other head movements.
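The normalization step can be sketched with a small, self-contained example. Assuming a landmark estimator has already located the two eyes (the coordinates below are made up), a similarity transform can rotate and scale the face so the eyes sit level at a fixed distance apart, removing head roll before emotion features are extracted:

```python
import numpy as np

def eye_alignment_transform(left_eye, right_eye, canonical_dist=60.0):
    """Build a 2x3 affine matrix that rotates and scales the image so the
    eyes end up on a horizontal line a fixed distance apart. In a real
    pipeline the eye positions would come from a facial-geometry estimator."""
    left_eye = np.asarray(left_eye, dtype=float)
    right_eye = np.asarray(right_eye, dtype=float)
    delta = right_eye - left_eye
    angle = np.arctan2(delta[1], delta[0])          # head roll angle
    scale = canonical_dist / np.linalg.norm(delta)  # normalize eye distance
    c = scale * np.cos(-angle)
    s = scale * np.sin(-angle)
    rot = np.array([[c, -s], [s, c]])
    # Rotate about the midpoint between the eyes so it stays fixed.
    center = (left_eye + right_eye) / 2.0
    translation = center - rot @ center
    return np.hstack([rot, translation[:, None]])

def apply(matrix, point):
    """Apply a 2x3 affine matrix to a 2D point."""
    p = np.asarray(point, dtype=float)
    return matrix[:, :2] @ p + matrix[:, 2]

# A face tilted 45 degrees: after alignment both eyes are level (same y)
# and exactly 60 pixels apart.
M = eye_alignment_transform(left_eye=(100, 100), right_eye=(130, 130))
print(apply(M, (100, 100)), apply(M, (130, 130)))
```

In practice, libraries such as OpenCV apply an equivalent matrix to the whole image (e.g. via `cv2.warpAffine`), but the geometry is the same as this sketch.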
What are affective computing use cases?
Affective computing is an AI tool that can be useful in a wide variety of use cases, including commercial functions and potentially even HR. For example, a department-wide employee engagement metric based on employees' facial expressions could inform the company on how recent developments are impacting morale. Current applications include:
- Marketing: There are numerous startups helping companies optimize marketing spend by allowing them to analyze emotions of viewers.
- Customer service: Both in contact centers and retail locations, startups are providing companies with estimates of customer emotions. These estimates are used to guide customer service responses and measure the effectiveness of customer service.
- Healthcare industry: Wearables with the ability to detect emotions, such as Embrace by Empatica, have already been used by researchers to study stress, autism, epilepsy, and other disorders.
- Other: Emotion recognition can complement security and fraud identification efforts as well.
For more details, feel free to visit our affective computing applications guide with more than 20 use cases.
What are alternatives/substitutes to emotion recognition?
Depending on the specific use case, there are alternatives to affective computing. For example:
- Marketing: Instead of relying on the emotions of potential customers, companies more commonly run pilots to assess the success of their potential marketing campaigns.
- Customer service: Voice is a good predictor of emotion, which tends to correlate with customer satisfaction. Companies can also rely on customer satisfaction surveys to track satisfaction. However, surveys are completed at the end of the customer experience and unfortunately do not allow companies to make real-time adjustments or offer real-time guidance to their personnel.
What are leading companies in emotion detection?
We identified the leading companies and their number of employees to give a sense of their market presence and their specialization areas. While larger companies specialize in specific solutions like marketing optimization, almost all companies in the space offer APIs for other companies to integrate affective computing into their solutions.
|Company|Number of employees on LinkedIn|Example business applications|
|---|---|---|
|Cogito|101-200|customer service optimization, sales optimization, care optimization in healthcare|
|Affectiva|101-200|applications in automotive, marketing optimization|
|Realeyes|51-100|marketing optimization|
|Kairos|51-100|customer service optimization, access control|
|NVISO|11-50|use cases in finance, automotive, healthcare, and media|
|wearehuman.io|1-10|recruiting & employee retention, financial fraud detection, customer satisfaction analysis, sales prediction, and more|
|SkyBiometry|1-10|marketing optimization, user authentication|
|CrowdEmotion|1-10|marketing optimization, content optimization|
Affective computing is an emerging field and this is our first article on the topic. Feel free to leave a comment or reach out if you have comments or suggestions.
How can we do better?
Your feedback is valuable. We will do our best to improve our work based on it.