Affective Computing: In-Depth Guide to Emotion AI in 2024

Affective computing systems automatically identify human emotions. The field is also called emotion detection, emotion AI, artificial emotional intelligence, or affective AI. This guide explains affective computing in detail:

What is affective computing?

Affective computing, also known as emotion AI, is an emerging technology that enables computers and systems to identify, process, and simulate human feelings and emotions. It is an interdisciplinary field that leverages computer science, psychology, and cognitive science.

While it may seem unusual that computers can do something so inherently human, research shows that they achieve acceptable accuracy in recognizing emotions from visual, textual, and auditory sources. With the insights gained from emotion AI, businesses can offer better services to their customers and make better decisions in customer-facing processes like sales, marketing, and customer service.

Feel free to read our related article for more detail on what affective computing is.

Can the software understand emotions? What is the theoretical foundation for emotion recognition?

Although whether computers can truly recognize emotions remains an open question, recent research shows that AI can identify emotions from facial expressions and speech. Because people express their feelings in surprisingly similar ways across different cultures, computers can detect these cues and classify emotions accurately. In research conducted between 2003 and 2006, they even outperformed humans, reaching 70 to 80% accuracy compared with roughly 60% for human judges. You can read more about this in the related section of our article.
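To give a concrete sense of the speech side, the sketch below extracts acoustic features (MFCCs plus simple energy and zero-crossing statistics) of the kind that speech emotion classifiers are commonly trained on. It assumes the librosa library and a local audio file; the feature set and file name are illustrative choices, not tied to the studies mentioned above.

```python
# Minimal sketch: extracting acoustic features commonly used as inputs to
# speech emotion classifiers. Assumes librosa is installed; the audio file
# name is a placeholder.
import numpy as np
import librosa

def speech_emotion_features(path):
    # Load the audio at 16 kHz mono.
    y, sr = librosa.load(path, sr=16000, mono=True)

    # Mel-frequency cepstral coefficients capture the spectral envelope.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    # Energy (RMS) and zero-crossing rate are simple prosodic cues.
    rms = librosa.feature.rms(y=y)
    zcr = librosa.feature.zero_crossing_rate(y)

    # Summarize each feature over time into a fixed-length vector that a
    # downstream classifier (SVM, neural network, etc.) can consume.
    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        [rms.mean(), rms.std(), zcr.mean(), zcr.std()],
    ])

# Usage: features = speech_emotion_features("call_snippet.wav")
```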

Why is affective computing relevant now?

The increasing ubiquity of high-resolution cameras, high-speed internet, and machine learning capabilities, especially deep learning, has enabled the rise of affective computing. Affective computing relies on

  • high-resolution (ideally dual) cameras to capture video
  • fast broadband connections to communicate those videos
  • machine learning models to identify emotions in those videos

All of these have improved greatly since the 2010s: almost all smartphone users now carry high-resolution cameras, and a growing number of users have upload speeds capable of real-time video thanks to symmetric fiber-to-the-home (FTTH) installations. Finally, deep learning solutions, which require significant amounts of data and computing power, have become easier to deploy.

How does affective computing work?

Most affective computing systems use labeled training data to train machine learning models that identify emotions in speech or video. Since the performance of deep learning systems improves with more data, companies in this space are trying to expand their labeled data sets to improve their models.
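As a concrete illustration of this supervised approach, here is a minimal training sketch assuming TensorFlow/Keras: a small convolutional network is fit on labeled 48x48 grayscale face crops. The random placeholder arrays stand in for a real labeled dataset, and the seven emotion classes are a common but not universal labeling scheme.

```python
# Minimal sketch: training an emotion classifier on labeled face crops.
# The data loading is stubbed out with random arrays purely for illustration.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 7  # e.g., anger, disgust, fear, happiness, sadness, surprise, neutral

# Placeholder data: replace with a real labeled dataset of face crops.
x_train = np.random.rand(1000, 48, 48, 1).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(48, 48, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# More labeled data generally improves these models, which is why vendors
# keep expanding their training sets.
model.fit(x_train, y_train, epochs=5, batch_size=32, validation_split=0.1)
```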

To normalize facial expressions, affective computing solutions working on images typically follow these steps:

  1. The face is extracted from the background.
  2. Facial geometry (e.g., the locations of the eyes, nose, and mouth) is estimated.
  3. Based on the facial geometry, facial expressions are normalized, removing the impact of head rotations and other head movements.
[Figure: Facial geometry identification example. Source: Imperial College London Intelligent Behavior Understanding Group]
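A minimal sketch of these three steps, assuming OpenCV's bundled Haar cascades: the face is cropped from the background, the two eye positions serve as a rough stand-in for facial geometry, and the crop is rotated so the eyes are level, removing in-plane head rotation. The cascades, thresholds, and output size are illustrative defaults, not what any particular vendor uses.

```python
# Minimal sketch: face extraction, rough facial geometry, and rotation
# normalization with OpenCV's bundled Haar cascades (illustrative only).
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def normalized_face(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Step 1: extract the face from the background.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]

    # Step 2: estimate rough facial geometry (here, just the two eyes).
    eyes = eye_cascade.detectMultiScale(face, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return cv2.resize(face, (48, 48))
    (ex1, ey1, ew1, eh1), (ex2, ey2, ew2, eh2) = sorted(eyes, key=lambda e: e[0])[:2]
    left_eye = (ex1 + ew1 / 2, ey1 + eh1 / 2)
    right_eye = (ex2 + ew2 / 2, ey2 + eh2 / 2)

    # Step 3: rotate so the eyes are level, removing in-plane head rotation.
    angle = np.degrees(np.arctan2(right_eye[1] - left_eye[1],
                                  right_eye[0] - left_eye[0]))
    center = (face.shape[1] / 2, face.shape[0] / 2)
    rotation = cv2.getRotationMatrix2D(center, angle, 1.0)
    aligned = cv2.warpAffine(face, rotation, (face.shape[1], face.shape[0]))
    return cv2.resize(aligned, (48, 48))

# Usage: frame = cv2.imread("face.jpg"); crop = normalized_face(frame)
```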

If you wish to work with an AI data service to obtain labeled training data, our related articles on data labeling services can help.

What are affective computing use cases?

Affective computing is an AI technology that can be useful in a wide variety of use cases, including commercial functions and potentially even HR. For example, a department-wide employee engagement metric based on employees' facial expressions could show the company how recent developments are affecting morale. Current applications include:

  • Marketing: Numerous startups help companies optimize marketing spend by letting them analyze the emotions of viewers.
  • Customer service: Both in contact centers and retail locations, startups provide companies with estimates of customer emotions. These estimates are used to guide customer service responses and to measure the effectiveness of customer service.
  • Healthcare industry: Wearables that can detect emotions, such as Embrace by Empatica, have already been used by researchers to study stress, autism, epilepsy, and other disorders.
  • Other: Emotion recognition can also complement security and fraud identification efforts.

For more details, feel free to visit our affective computing applications guide with more than 20 use cases.

What are alternatives/substitutes to emotion recognition?

Depending on the specific use case, there are alternatives to affective computing. For example:

  • Marketing: Instead of relying on the emotions of potential customers, companies more commonly run pilots to assess the likely success of their marketing campaigns.
  • Customer service: Voice is a good predictor of emotions, which tend to correlate with customer satisfaction. Companies can also rely on customer satisfaction surveys to track satisfaction. However, surveys are completed at the end of the customer experience and unfortunately do not allow companies to make real-time adjustments or offer real-time guidance to their personnel.

What are leading companies in emotion detection?

We identified the leading companies, along with their number of employees, to give a sense of their market presence and specialization areas. While larger companies specialize in specific solutions like marketing optimization, almost all companies in the space offer APIs that let other companies integrate affective computing into their own solutions.
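To show how such an API is typically consumed, the sketch below posts an image to a hypothetical REST endpoint and reads back per-emotion scores. The URL, authentication header, and response fields are invented for illustration; each vendor's actual API differs, so consult their documentation.

```python
# Minimal sketch: calling a (hypothetical) emotion recognition REST API.
# The endpoint, authentication header, and response schema below are
# placeholders, not any specific vendor's interface.
import requests

API_URL = "https://api.example-emotion-vendor.com/v1/analyze"  # hypothetical
API_KEY = "YOUR_API_KEY"

def analyze_image(path):
    with open(path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()  # e.g., {"happiness": 0.71, "surprise": 0.12, ...}

if __name__ == "__main__":
    scores = analyze_image("customer_frame.jpg")
    dominant = max(scores, key=scores.get)
    print(f"Dominant emotion: {dominant} ({scores[dominant]:.2f})")
```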

Company | Number of employees on LinkedIn | Example business applications
Cogito | 101-200 | Customer service optimization, sales optimization, care optimization in healthcare
Emotibot | 101-200 | -
Affectiva | 101-200 | Automotive applications, marketing optimization
Real eyes | 51-100 | Marketing optimization
Kairos | 51-100 | Customer service optimization, access control
NVISIO | 11-50 | Use cases in finance, automotive, healthcare, and media
Nuralogix | 11-50 | -
sightcorp | 11-50 | -
MtechLab | 1-50 | -
wearehuman.io | 1-10 | Recruiting & employee retention, financial fraud detection, customer satisfaction analysis, sales prediction, and more
Sky Biometry | 1-10 | Marketing optimization, user authentication
CrowdEmotion | 1-10 | Marketing optimization, content optimization
releyeble | 1-10 | Retail analytics

Affective computing is an emerging field, and this is our first article on the topic. Feel free to leave a comment or reach out if you have comments or suggestions.

Cem Dilmegani
Principal Analyst

Cem has been the principal analyst at AIMultiple since 2017. AIMultiple informs hundreds of thousands of businesses (as per similarWeb) including 60% of Fortune 500 every month.

Cem's work has been cited by leading global publications including Business Insider, Forbes, Washington Post, global firms like Deloitte, HPE, NGOs like World Economic Forum and supranational organizations like European Commission. You can see more reputable companies and media that referenced AIMultiple.

Throughout his career, Cem served as a tech consultant, tech buyer and tech entrepreneur. He advised businesses on their enterprise software, automation, cloud, AI / ML and other technology related decisions at McKinsey & Company and Altman Solon for more than a decade. He also published a McKinsey report on digitalization.

He led technology strategy and procurement of a telco while reporting to the CEO. He has also led commercial growth of deep tech company Hypatos that reached a 7 digit annual recurring revenue and a 9 digit valuation from 0 within 2 years. Cem's work in Hypatos was covered by leading technology publications like TechCrunch and Business Insider.

Cem regularly speaks at international technology conferences. He graduated from Bogazici University as a computer engineer and holds an MBA from Columbia Business School.
