Understanding speech and its context is a problem that must be solved to build more capable conversational AI. Natural Language Understanding (NLU) is a field that focuses on the meaning of text or speech beyond its individual components: it asks what the speaker means and intends. We have answered all your NLU-related questions:
What is Natural Language Understanding?
Natural language understanding (NLU) is an AI-powered approach to recognizing patterns in human language. It enables conversational AI applications to accurately identify the user's intent, respond to it, and direct the user to the relevant solution. Since the critical point in conversational AI is understanding what the user says, or wants to say, in both speech and writing, these business applications usually rely on NLU.
NLU is a subset of natural language processing (NLP) and conversational AI; it helps computers understand human language by analyzing and interpreting its basic parts separately. NLU can be considered the first step toward conversational AI: a machine must first understand what the user says before it can respond.
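As a concrete illustration of the intent-detection step described above, here is a minimal, hypothetical sketch: a keyword-overlap classifier with made-up intent names. Real NLU systems use trained statistical models rather than keyword rules; this only shows the input-output shape of the task.

```python
# Hypothetical sketch of intent detection: map an utterance to one of a
# few example intents by keyword overlap. The intents and keywords are
# illustrative assumptions, not from any real NLU product.

INTENT_KEYWORDS = {
    "check_balance": {"balance", "account"},
    "book_flight": {"flight", "book", "fly"},
    "weather": {"weather", "rain", "forecast"},
}

def detect_intent(utterance: str) -> str:
    """Return the intent whose keywords overlap most with the utterance."""
    tokens = set(utterance.lower().split())
    best_intent, best_overlap = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        overlap = len(tokens & keywords)
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    return best_intent

print(detect_intent("What is the weather forecast for tomorrow?"))  # weather
print(detect_intent("I want to book a flight to Berlin"))           # book_flight
```

A production system would replace the keyword sets with a classifier trained on labeled utterances, but the contract is the same: raw text in, an intent label out.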
NLU has a history dating back to the 1960s, when it was limited to pattern matching with small rule sets. However, developments in AI and big data have opened a promising path for NLU, and today there are NLU applications in many areas. Some examples are:
- Chatbots: In case you have questions about them, feel free to read our most comprehensive article on the topic.
- Voice-driven assistants
- Search in natural language: searching for information using everyday language.
- Web-scale information extraction: finding the most relevant information within the gigantic volume of data on the web.
- Legal discovery: A search for policies and laws in natural language.
- Content summarization
Why is it important now?
The main reasons why NLU is important today can be summarized in the following points:
NLU supports the analysis of unstructured text. People can express themselves in various ways, and phrasing often varies from person to person. For personal assistants in particular, correctly understanding the user is essential to success. NLU transforms the complex structure of language into a machine-readable form, enabling text analysis and allowing machines to respond to human queries.
The amount of unstructured text that needs to be analyzed is increasing. Computers can perform language-based analysis 24/7 in a consistent and unbiased manner. Considering the amount of raw data produced every day, NLU, and hence NLP, are critical for efficient analysis of this data. A well-developed NLU-based application can read, listen to, and analyze this data.
Industry analysts also see significant growth potential in NLU and NLP:
According to Markets Insider's 2019 research, the global natural language processing (NLP) market is expected to be worth $35 billion by 2025, growing at a 22% CAGR over the 2020-2025 period. The main reason underlying this growth is the shift from product-centric to customer-oriented experiences. The increasing demand for smart devices and IoT also contributes to the widespread use of NLU.
For example, a recent Gartner report points out the importance of NLU in healthcare. NLU helps to improve the quality of clinical care by improving decision support systems and the measurement of patient outcomes.
What are the things to pay attention to while choosing Natural Language Understanding solutions?
- Language support: The NLU platform should support the language of the input data. Currently, NLU quality in some non-English languages is lower due to the smaller commercial potential of those languages; however, with increased research interest, this is changing.
- Result quality: A good NLU solution should be able to recognize linguistic entities, extract the relationships between them and use semantic software to understand the content, no matter how it’s expressed. Continuous learning powered by machine learning can improve result quality over time.
- Speed: Understanding the language is only one part of the process in conversational AI applications; other parts include generating a response or acting on the query. Therefore, perceiving and interpreting the language should be fast. However, there may be a trade-off between result quality and computation speed, so a selection should be made depending on the application area.
- Flexibility: Adaptability to different solution areas is important. This is achieved by the training and continuous learning capabilities of the NLU solution.
- Usability: The solution should be easy to use for both non-technical and technical employees. A solution with different interfaces can be considered so that a non-technical employee (a customer service representative, for example) can actively improve the system with feedback. Since chatbots can be built by non-technical employees, the usability of the software and the ease of the user interface are important.
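One capability listed above under result quality, recognizing linguistic entities, can be sketched with a toy example. The entity types and regular-expression patterns below are illustrative assumptions; production NLU solutions use trained entity recognizers rather than regex rules.

```python
# Illustrative sketch of entity recognition: extract simple entities
# (emails and ISO dates) from text with regular expressions. A real NLU
# service recognizes far richer entities with trained models.
import re

ENTITY_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
}

def extract_entities(text: str):
    """Return a list of (entity_type, matched_text) pairs found in text."""
    found = []
    for entity_type, pattern in ENTITY_PATTERNS.items():
        for match in pattern.finditer(text):
            found.append((entity_type, match.group()))
    return found

print(extract_entities("Email support@example.com before 2024-01-31."))
```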
How to evaluate the accuracy of NLU solutions?
NLU models can perform well on a single, specific task, but different tasks can lower their accuracy and precision. It is best to compare the performance of different solutions using objective metrics. For example, a benchmark can be created by building a test set with a significant number of examples (e.g. >100) from customer service data. Since most services are available as easy-to-register APIs, checking this benchmark against API responses should be straightforward.
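The benchmarking idea above can be sketched in a few lines: computing precision, recall, and F1 for one intent by comparing gold labels against a service's predictions. The intent names and data here are made up for illustration.

```python
# Sketch of per-intent benchmarking: precision, recall, and F1 from
# gold labels vs. predicted labels. Labels and data are illustrative.

def f1_for_intent(gold, predicted, intent):
    """F1 score for a single intent, treating it as the positive class."""
    tp = sum(1 for g, p in zip(gold, predicted) if g == intent and p == intent)
    fp = sum(1 for g, p in zip(gold, predicted) if g != intent and p == intent)
    fn = sum(1 for g, p in zip(gold, predicted) if g == intent and p != intent)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold      = ["book_flight", "weather", "book_flight", "weather", "book_flight"]
predicted = ["book_flight", "weather", "weather",     "weather", "book_flight"]
print(round(f1_for_intent(gold, predicted, "book_flight"), 2))  # 0.8
```

Averaging this score over all intents in the test set gives a single number for comparing services, which is essentially what published NLU benchmarks report.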
Here is a benchmark article by SnipsAI, an AI voice platform, comparing the F1 scores (a measure of accuracy) of different conversational AI providers. SnipsAI compares its system with Google's API.ai, Facebook's Wit.ai, Microsoft's Luis.ai, and Amazon's Alexa using its open-source dataset on GitHub. The dataset includes 2,400 queries for each of the 7 user intents tested.
GLUE and its successor SuperGLUE are the most widely used benchmarks for evaluating a model's performance on a collection of tasks rather than a single task, maintaining a general view of NLU performance. GLUE consists of nine sentence- or sentence-pair language understanding tasks, covering single-sentence, similarity and paraphrase, and inference tasks. GLUE also provides a leaderboard for the benchmark.
What are the leading NLU companies?
As in many emerging areas, technology giants play a big role in NLU, and some startups as well as open-source APIs are also part of the ecosystem.
- Microsoft Knowledge Exploration Service
- Microsoft Language Understanding Intelligent Service (LUIS)
- Microsoft Azure Translator APIs
- Google Translate API
- Google Cloud Natural Language API
- IBM Watson Conversation Service
- IBM Watson Tone Analyzer
Free and open-source alternatives are:
- Facebook’s Wit.ai
- Rasa NLU
- Stanford CoreNLP
Feel free to learn more about conversational AI from our research articles.
If you have questions about how natural language understanding can help your business, feel free to ask us.