Agentic AI allows natural language interaction with industrial systems, enabling users to query data and receive actionable insights. We outline a reference architecture designed for industrial environments, describe how task-specific agents and tools can be orchestrated, and explore the current state of natural language interfaces (NLIs) in industrial systems.
Why do industrial systems need agentic AI?
The industrial data challenge & the agency gap: Industrial organizations have heavily invested in IoT technologies, deploying SCADA systems, PLCs, sensors, and connected assets.
These systems generate vast amounts of data, but much of it remains underutilized, locked in silos, and difficult to interpret. The result is a paradox: manufacturers are data-rich but insight-poor.
The agency gap arises because existing systems only collect and report data without autonomously interpreting or acting on it. This leaves companies relying on human expertise to analyze data and make decisions.

How agentic AI bridges the gap
By combining natural language interfaces with reasoning capabilities, agentic AI enables operators to bypass these technical barriers. An operator can ask a query such as “How did Boiler 3 perform last week compared to its baseline efficiency?” and the system will:
- Identify the relevant asset (Boiler 3) across SCADA or PLC systems.
- Retrieve the appropriate time-series sensor data.
- Align and filter the data by the requested time window.
- Apply statistical reasoning to determine the trend in efficiency.
- Deliver a clear output, such as “Boiler 3 operated at 92% of baseline efficiency last week.”
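A minimal sketch of this pipeline is shown below, using an in-memory stand-in for the SCADA historian. The tag registry, readings, and baseline figure are illustrative assumptions, not values from a real system.

```python
from datetime import date, timedelta
from statistics import mean

# Hypothetical mapping of plain-language asset names to historian tags.
ASSET_TAGS = {"Boiler 3": "PLANTX.BLR3.EFF"}

# Stand-in for a SCADA historian: daily efficiency readings keyed by (tag, day).
HISTORIAN = {
    ("PLANTX.BLR3.EFF", date(2025, 7, 14) + timedelta(days=i)): 0.80 + 0.003 * i
    for i in range(7)
}

def efficiency_vs_baseline(asset: str, start: date, days: int, baseline: float) -> str:
    tag = ASSET_TAGS[asset]                                    # 1. ground the asset name
    window = [start + timedelta(days=i) for i in range(days)]  # 2-3. retrieve and filter by time window
    readings = [HISTORIAN[(tag, d)] for d in window if (tag, d) in HISTORIAN]
    ratio = mean(readings) / baseline                          # 4. statistical comparison
    return f"{asset} operated at {ratio:.0%} of baseline efficiency last week."  # 5. clear answer

print(efficiency_vs_baseline("Boiler 3", date(2025, 7, 14), 7, baseline=0.88))
```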
The diagram illustrates a reference architecture for agentic AI in industrial systems:
Reference architecture

Core capabilities of this architecture
1. User queries

At its core, the system allows operators to issue natural language queries, such as:
- “What was the energy consumption of Boiler 3 at Plant X on 15th July 2025?”
- “List all the safety parameters tracked for Cooling Tower 2 in Facility Y.”
2. IoT integrations (Data sources)

The system ingests real-time and historical data from operational sources, including:
- SCADA systems (control data)
- PLC connectors (equipment controllers)
- Smart sensors (temperature, vibration, energy, pressure, etc.)
- Plant layouts (zones, equipment hierarchies, site mapping)
This provides the raw signals needed for reasoning and decision-making.
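The exact connectors vary by plant, but a common first step is normalizing readings from these different sources into a single schema before they reach the reasoning layer. The sketch below assumes a hypothetical SCADA payload format; the field names are illustrative only.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    asset_id: str      # e.g. "PLANTX.BLR3"
    metric: str        # e.g. "energy_kwh", "temperature_c"
    value: float
    timestamp: datetime
    source: str        # "scada" | "plc" | "sensor"

def from_scada(payload: dict) -> Reading:
    # Assumed SCADA payload shape: {"tag": "SITE.ASSET.METRIC", "val": ..., "ts": unix_seconds}
    site_asset, metric = payload["tag"].rsplit(".", 1)
    return Reading(
        asset_id=site_asset,
        metric=metric.lower(),
        value=float(payload["val"]),
        timestamp=datetime.fromtimestamp(payload["ts"], tz=timezone.utc),
        source="scada",
    )

print(from_scada({"tag": "PLANTX.BLR3.ENERGY_MWH", "val": 4.7, "ts": 1752537600}))
```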
3. Reasoning layer (LLM)

At the core of this layer, a large language model (LLM) interprets the user’s query so that domain-specific requests can be understood and executed. It:
- Breaks the query into subtasks (identify Boiler 3 → filter by Plant X → constrain to 15 July 2025 → retrieve energy readings).
- Grounds entities: matches “Boiler 3” to the correct sensor ID in SCADA.
- Selects the right tools: e.g., the SCADA API for raw sensor data.
The LLM then passes the request to the orchestration layer, which manages execution.
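A minimal sketch of this interpretation step is shown below, assuming a generic `llm_complete` callable in place of a specific model provider. The plan schema (intent, asset, site, date, tools) is an assumption for illustration, not a prescribed format.

```python
import json

PLANNER_PROMPT = """Decompose the operator query into a JSON plan with keys:
intent, asset, site, date, tools. Ground asset names against this tag registry:
{registry}
Query: {query}
Return JSON only."""

def plan_query(query: str, registry: dict, llm_complete) -> dict:
    prompt = PLANNER_PROMPT.format(registry=json.dumps(registry), query=query)
    raw = llm_complete(prompt)   # call out to the model (assumed interface)
    return json.loads(raw)       # e.g. {"intent": "energy_consumption", "asset": "PLANTX.BLR3", ...}

# Example with a canned response standing in for a real model call:
fake_llm = lambda _: (
    '{"intent": "energy_consumption", "asset": "PLANTX.BLR3", '
    '"site": "Plant X", "date": "2025-07-15", "tools": ["scada_api"]}'
)
print(plan_query("What was the energy consumption of Boiler 3 at Plant X on 15th July 2025?",
                 {"Boiler 3": "PLANTX.BLR3"}, fake_llm))
```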
4. Action / orchestration layer

This layer manages execution of the subtasks:
- Workflow coordination: ensures tasks are carried out in the correct order (get raw data first, then filter).
- Persistent memory: keeps track of context such as the asset in question (Boiler 3), so a follow-up like “What about Boiler 4?” can be resolved (see the sketch after this list).
- Orchestration engine: routes queries to the right tools or agents.
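As a rough illustration, the sketch below shows how an orchestrator with persistent memory could resolve the follow-up “What about Boiler 4?” by reusing the previous turn’s intent and date. The tool registry and plan fields are hypothetical.

```python
class Orchestrator:
    def __init__(self, tools: dict):
        self.tools = tools   # tool name -> callable
        self.memory = {}     # persists across turns (asset, intent, date, ...)

    def run(self, plan: dict) -> str:
        # Fill gaps in a follow-up query from memory, then remember the merged plan.
        merged = {**self.memory, **{k: v for k, v in plan.items() if v is not None}}
        self.memory = merged
        # Workflow coordination: retrieve raw data first, then post-process it.
        raw = self.tools["scada_api"](merged["asset"], merged["date"])
        return self.tools["summarize"](merged, raw)

orchestrator = Orchestrator({
    "scada_api": lambda asset, date: {"energy_mwh": 4.7},   # stubbed data-retrieval tool
    "summarize": lambda plan, raw: (
        f"{plan['asset']} consumed {raw['energy_mwh']} MWh on {plan['date']}"
    ),
})
print(orchestrator.run({"intent": "energy", "asset": "Boiler 3", "date": "2025-07-15"}))
print(orchestrator.run({"intent": None, "asset": "Boiler 4", "date": None}))  # follow-up turn
```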
5. Tool / retrieval layer

This layer provides access to external and internal resources, including:
- APIs and SaaS platforms (e.g., maintenance logs, ERP systems)
- Vector databases (to search historical or unstructured records)
- Knowledge bases (safety manuals, compliance data)
- Business logic modules
- User interactions (clarifications or feedback loops)
The retrieval layer ensures that both structured (sensor logs) and unstructured (documentation) data can be used in reasoning.
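The vector search side of this layer can be illustrated with a toy cosine-similarity lookup over pre-computed embeddings; in practice a vector database and a real embedding model would fill this role, and the document snippets below are invented for the example.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hypothetical document store: (embedding, text) pairs from safety manuals.
DOC_STORE = [
    ([0.9, 0.1, 0.0], "Boiler 3 maximum operating pressure: 16 bar."),
    ([0.1, 0.8, 0.2], "Cooling Tower 2 monthly legionella inspection checklist."),
]

def retrieve(query_embedding, top_k=1):
    ranked = sorted(DOC_STORE, key=lambda d: cosine(query_embedding, d[0]), reverse=True)
    return [text for _, text in ranked[:top_k]]

# Structured reading (from the historian) plus unstructured context (from documents):
sensor_value = {"asset": "Boiler 3", "pressure_bar": 15.2}
context = retrieve([0.85, 0.15, 0.05])
print(sensor_value, context)
```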
6. Industrial AI provider integration

The architecture connects to external industrial AI agents (e.g., Synera, Aitomatic, Retrocausal) for domain-specific capabilities such as engineering workflow automation, knowledge-driven modeling, and factory-floor guidance.
The system then produces actionable insights, which can take different forms (a short example follows the list):
- Operational metrics (e.g., “Boiler 3 consumed 4.7 MWh on 15th July 2025”)
- Efficiency trends (performance baselines vs. actual results)
- Downtime analytics (causes and frequency of equipment disruptions)
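As a small example of the third insight type, the sketch below aggregates hypothetical downtime events into a cause-frequency summary; the event records are invented for illustration.

```python
from collections import Counter

downtime_events = [
    {"asset": "Boiler 3", "cause": "feedwater pump trip", "minutes": 42},
    {"asset": "Boiler 3", "cause": "burner flameout", "minutes": 18},
    {"asset": "Boiler 3", "cause": "feedwater pump trip", "minutes": 37},
]

by_cause = Counter(e["cause"] for e in downtime_events)       # frequency of each cause
total_minutes = sum(e["minutes"] for e in downtime_events)    # total disruption
print(f"Boiler 3: {total_minutes} min downtime; most frequent cause: {by_cause.most_common(1)[0][0]}")
```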
Current state of natural language interfaces (NLIs) in industrial systems
While the integration of NLIs and LLMs into industrial systems is still emerging, several initiatives are underway to bridge this gap:
- Foxconn’s “FoxBrain”: Foxconn has developed its own large language model, “FoxBrain,” designed to enhance manufacturing and supply chain operations.2
- LLM-BRAIn: This research project, trained on 8,500 instruction-following demonstrations, enables robots to interpret natural language commands and generate appropriate behavior trees.3
Real-world industrial AI applications
In this section, we explore how AI is being applied in real-world industrial environments across manufacturing, robotics, and logistics.
While some of these systems use task-specific AI to automate and optimize processes, they do not yet integrate the full agentic AI stack involving natural language interfaces (NLIs) or reasoning capabilities.
Manufacturing: Schaeffler’s AI factory
Schaeffler has integrated AI-powered diagnostic agents into its Hamburg plant. The system analyzes production data in real time to support operators with troubleshooting.
While the primary interface is not explicitly natural language-based, the underlying AI reasoning capabilities enable more intuitive interactions.4
Robotics: RoboBallet (Industrial robotics coordination)
Developed by UCL, DeepMind, and Intrinsic, RoboBallet demonstrates the use of reinforcement learning to coordinate industrial robots. Eight robot arms were able to complete 40 tasks, compared with traditional systems that typically manage five arms across ten tasks.
Operators can issue high-level commands in natural language, which the system interprets to control robot actions.5
Logistics: Brightpick’s agentic AI-powered warehouse
Brightpick automated a 35,000 sq. ft. warehouse handling ~50,000 daily picks across 6,000 SKUs. Its Autopicker robots, managed by the Brightpick Intuition platform, dynamically orchestrate in-aisle picking, pallet handling, goods-to-person stations, and overnight order buffering.
External Links
- 1. How Intelligent Agents in AI Can Work Alone | Gartner
- 2. https://www.reuters.com/technology/foxconn-unveils-first-large-language-model-2025-03-10/
- 3. https://arxiv.org/abs/2305.19352
- 4. https://www.schaeffler.com/en/media/stories/digitalization-stories/smart-factory/
- 5. https://www.ft.com/content/31d959cf-a037-4330-8ea9-8d064c9ac613
- 6. https://www.youtube.com/watch?v=GRZp8MZ6piA&t=95s