Autonomous things (often shortened to AuT) are physical devices, such as vehicles, robots, and drones, that use onboard sensors, connectivity, and AI to perceive the physical world and autonomously complete tasks with little or no human direction.
Explore what autonomous things are and how they operate, their most common use cases with real-life examples, and the challenges businesses should look out for when investing in AuT.
Definition and core characteristics of autonomous things
Autonomous things (AuT) are systems that can make decisions and take actions on their own, with little or no human supervision. These devices are self-sufficient, performing intelligent tasks with both energy autonomy and decision-making autonomy. Three key traits set them apart from traditional automated systems:
- Flexible Action: AuTs can adapt their behavior to changing conditions in real-time. Unlike fixed automated systems, they can adjust their actions when faced with new challenges or unexpected situations, allowing them to work effectively in dynamic environments.
- Decision-Making Autonomy: These systems can choose actions based on sensory input and set goals. They process multiple data streams, evaluate options, and select the best course of action, allowing them to handle situations not specifically programmed into their software.
- Self-Governance: AuTs operate with minimal human intervention. While they may rely on human input for high-level goals or safety checks, they make most operational decisions independently. This independence makes it possible to use AI and computers in settings where constant human oversight would be impractical.
Types of autonomous things
The autonomous things landscape encompasses four primary categories, each serving different operational domains and use cases.
Autonomous vehicles
Autonomous vehicles represent the most visible category, spanning from self-driving cars operating on public roads to unmanned aerial vehicles conducting surveillance missions. The Society of Automotive Engineers has established a classification system ranging from Level 0 (manual driving) to Level 5 (full automation), with current deployments primarily focused on Levels 2 and 3.1
Here is the three-layer architecture of autonomous vehicles:2
- Mission Planning Layer serves as the highest-level decision maker, handling route selection and task assignment. This layer uses graph search algorithms like Dijkstra’s or A* to navigate road networks, representing a fundamental component of AuT technology that allows vehicles to plan long-distance journeys autonomously.
- Behavioral Planning Layer acts as the critical decision-making component that ensures vehicles follow traffic rules and interact safely with other road users. This layer addresses one of the main existing safety issues by implementing Finite State Machines (FSMs) that can predict dangerous situations and respond appropriately, helping overcome concerns about autonomous vehicle safety.
- Motion Planning Layer generates real-time collision-free paths and trajectories. This component must demonstrate acceptable safety while operating in dynamic environments, using advanced algorithms like RRT (Rapidly-exploring Random Trees) and sampling-based methods to ensure vehicles can autonomously complete tasks while avoiding obstacles.
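As a concrete illustration of the mission planning layer, here is a minimal A* route search over a toy road network. The graph, coordinates, and edge costs below are hypothetical; a production planner would search a full high-definition road map.

```python
import heapq

def a_star(graph, coords, start, goal):
    """A* search over a road network.

    graph:  dict node -> list of (neighbor, edge_cost)
    coords: dict node -> (x, y), used for the straight-line heuristic
    """
    def h(n):  # admissible heuristic: straight-line distance to the goal
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    frontier = [(h(start), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        for nbr, edge in graph[node]:
            new_cost = cost + edge
            if new_cost < best.get(nbr, float("inf")):
                best[nbr] = new_cost
                heapq.heappush(
                    frontier, (new_cost + h(nbr), new_cost, nbr, path + [nbr])
                )
    return float("inf"), []

# Hypothetical mini road network: intersections A-D
coords = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (2, 1)}
roads = {
    "A": [("B", 1.0), ("C", 2.0)],
    "B": [("C", 1.0), ("D", 2.5)],
    "C": [("D", 1.0)],
    "D": [],
}
cost, route = a_star(roads, coords, "A", "D")
```

Setting the heuristic to zero turns this into plain Dijkstra's algorithm, the other graph search the mission planning layer commonly uses.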
Core competency modules
- Perception Module represents the sensory foundation of autonomous systems, utilizing LIDAR, cameras, and roadside sensor data to understand the environment. Technological developments in sensor fusion have improved environmental awareness, while reduced sensor costs have made these technologies more accessible to autonomous vehicle companies and researchers.
- The perception system processes point clouds, performs object detection, and enables localization, all of which are critical for autonomous entities to navigate safely without human direction. Modern deep learning approaches have transformed object recognition capabilities, though critical challenges remain in handling diverse weather conditions and complex urban scenarios.
- Planning Module encompasses the decision-making hierarchy that transforms perception data into actionable plans. This module must handle the open-ended nature of autonomous navigation, balancing efficiency with safety while operating in the physical world. The planning system is the core autonomous software that enables vehicles to make complex decisions in real time.
- Mission planning uses graph search over road networks, while behavioral planning implements rule-following logic through FSMs. Motion planning generates safe trajectories using advanced algorithms that can adapt to dynamic environments, representing AuT technologies that bring computers into direct interaction with the physical environment.
- The Control Module executes planned trajectories through precise vehicle control systems. This component uses Model Predictive Control (MPC) and geometric tracking algorithms to translate high-level plans into steering, throttle, and brake commands. The control system must maintain acceptable safety standards while ensuring smooth vehicle operation.
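Geometric tracking in the control module can be illustrated with pure pursuit, a classic path-tracking law that steers toward a lookahead point on the planned trajectory. This is a simplified sketch: the wheelbase value and poses are assumed, and a full MPC formulation is omitted.

```python
import math

def pure_pursuit_steer(x, y, heading, target, wheelbase=2.8):
    """Compute a front-wheel steering angle that drives the vehicle
    toward a lookahead point on the planned path.

    (x, y, heading): current rear-axle pose, heading in radians
    target: (tx, ty) lookahead point on the trajectory
    wheelbase: axle-to-axle distance in metres (assumed value)
    """
    tx, ty = target
    # Angle from the vehicle's heading to the lookahead point
    alpha = math.atan2(ty - y, tx - x) - heading
    ld = math.hypot(tx - x, ty - y)  # lookahead distance
    # Pure-pursuit law: steer = atan(2 * L * sin(alpha) / ld)
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

# Target straight ahead -> essentially zero steering
straight = pure_pursuit_steer(0.0, 0.0, 0.0, (10.0, 0.0))
# Target to the left -> positive (leftward) steering angle
left = pure_pursuit_steer(0.0, 0.0, 0.0, (10.0, 2.0))
```

The resulting angle would feed the steering actuator, while separate longitudinal control handles throttle and brake.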
Autonomous robots
Autonomous robots are another important category, including systems used in industrial manufacturing, healthcare, hospitality, and search-and-rescue operations.
These robots vary in their level of autonomy, from factory robots that follow set paths to service robots that navigate unpredictable environments and interact with humans.
Agricultural robots are a notable success, with autonomous tractors and harvesting machines helping to address labor shortages while increasing precision and efficiency.
Autonomous smart home devices
Autonomous smart home devices represent a growing segment that includes IoT devices with embedded artificial intelligence, autonomous sensors for environmental monitoring, and intelligent infrastructure systems.
These devices typically operate within defined spaces but make independent decisions about energy management, security responses, and environmental controls based on sensor data and learned user preferences.
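A minimal sketch of this kind of independent decision logic, assuming a hypothetical smart thermostat with a learned set-point (the preference value, comfort band, and action names are illustrative):

```python
def hvac_action(temp_c, occupied, preferred_c=21.0, band=1.0):
    """Decide an HVAC action from sensor data and a learned preference.

    preferred_c: the user's learned set-point (hypothetical value)
    band: comfort band in degrees Celsius
    """
    if not occupied:
        return "eco"            # save energy when nobody is home
    if temp_c < preferred_c - band:
        return "heat"
    if temp_c > preferred_c + band:
        return "cool"
    return "hold"

action = hvac_action(18.5, occupied=True)  # cold room, someone home
```

A real device would update `preferred_c` over time from occupancy and override history rather than hard-coding it.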
Multi-agent systems
Multi-agent systems feature coordinated groups of autonomous entities working together to accomplish complex tasks. Robot swarms can perform collective operations like search and rescue missions, while autonomous sensor networks enable distributed environmental monitoring and threat detection.
These systems are not social networks in the traditional sense, but they create sophisticated communication networks among physical devices.
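Coordination in such multi-agent systems is often built on distributed consensus, where each agent repeatedly averages its estimate with its neighbors' until the group agrees. Below is a minimal average-consensus sketch for a hypothetical four-agent ring network; the topology and initial values are illustrative.

```python
def consensus_step(values, neighbors):
    """One round of average consensus: each agent moves its estimate
    toward the mean of its own value and its neighbors' values."""
    return [
        sum([values[i]] + [values[j] for j in neighbors[i]])
        / (1 + len(neighbors[i]))
        for i in range(len(values))
    ]

# Hypothetical swarm of 4 agents connected in a ring
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = [0.0, 4.0, 8.0, 12.0]
for _ in range(50):
    values = consensus_step(values, neighbors)
# All agents converge toward the global average (6.0)
```

The same pattern underlies swarm behaviors such as rendezvous and distributed sensor averaging, with no central coordinator required.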
AuT architecture and technical components
Autonomous things typically employ a three-layer hierarchical architecture that enables real-time decision-making while maintaining safety and efficiency.
The reaction layer
The reaction layer handles low-level sensor data processing and immediate responses to environmental stimuli, operating with millisecond response times to ensure safety in dynamic situations.
This layer includes continuous control systems such as autopilots and motor control units that maintain basic operational parameters.
The rules layer
The rules layer manages middle-level decision-making through rule-based systems that ensure protocol compliance and safety standards. This layer interprets safety regulations, traffic laws, and operational procedures, translating them into specific behaviors and actions.
For autonomous vehicles, this layer processes traffic signals, road signs, and right-of-way rules to ensure legal and safe operation.
The principles layer
The principles layer coordinates high-level strategic decision-making and goal management, incorporating ethical considerations and long-term planning into operational choices. This layer manages route optimization, task prioritization, and resource allocation while considering broader operational objectives and constraints.
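The three layers described above can be sketched as a priority cascade, where the fast reaction layer overrides the rules layer, which in turn overrides strategic goals. This is a simplified illustration; the sensor fields, thresholds, and action names are hypothetical.

```python
def reaction_layer(sensors):
    """Millisecond-scale reflexes: immediate responses to stimuli."""
    if sensors["obstacle_distance_m"] < 2.0:
        return "emergency_brake"
    return None

def rules_layer(sensors):
    """Rule-based compliance: traffic signals, protocols, procedures."""
    if sensors["traffic_light"] == "red":
        return "stop_at_line"
    return None

def principles_layer(sensors, goal):
    """Strategic decisions: routing, task priorities, resource use."""
    return f"proceed_toward_{goal}"

def decide(sensors, goal):
    """Lower layers pre-empt higher ones: safety reflexes first,
    then rule compliance, then strategic goals."""
    for layer in (reaction_layer, rules_layer):
        action = layer(sensors)
        if action is not None:
            return action
    return principles_layer(sensors, goal)

sensors = {"obstacle_distance_m": 1.5, "traffic_light": "green"}
action = decide(sensors, "depot")  # reflex wins: "emergency_brake"
```

In a real system each layer runs at its own rate on its own hardware; the cascade here only shows the override ordering.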
The processing infrastructure
The processing infrastructure relies on specialized AI chips and processors designed for machine learning inference, combined with edge computing modules that reduce latency by processing data locally rather than relying on cloud connections.
Multi-core systems enable parallel processing for real-time operations, while FPGA and ASIC implementations handle specific autonomous functions that require deterministic timing and high reliability.
Sensor systems
Sensor systems form the foundation of environmental perception, typically combining multiple technologies to achieve reliable performance across varying conditions.
Vision sensors, including cameras and stereo vision systems, provide detailed environmental information, while LiDAR systems use laser-based ranging to create precise three-dimensional maps of surroundings.
Radar systems complement visual sensors by providing reliable distance and velocity measurements in adverse weather conditions, and inertial measurement units track motion and orientation for navigation purposes.
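One common way to combine such complementary sensors is inverse-variance weighted fusion, in which more precise sensors receive proportionally more weight. A minimal sketch, with illustrative readings and variances:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent range estimates.

    measurements: list of (value, variance) pairs from different sensors
    Returns the fused estimate and its (smaller) variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / total
    return fused, 1.0 / total

# Hypothetical range readings to the same object: LiDAR is precise,
# radar is noisier, a monocular camera estimate is noisier still
lidar = (25.2, 0.01)   # metres, variance
radar = (24.8, 0.25)
camera = (25.5, 1.0)
estimate, variance = fuse([lidar, radar, camera])
```

The fused variance is lower than any single sensor's, which is the statistical payoff of redundancy; full pipelines typically do this recursively with a Kalman filter.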
The technology behind AuT
The artificial intelligence and machine learning components enable autonomous decision-making through deep neural networks for pattern recognition and feature extraction.
Computer vision algorithms handle object detection, recognition, and tracking, while decision-making frameworks process multiple inputs to select optimal actions. Natural language processing capabilities enable human interaction where required, allowing operators to communicate with autonomous systems using voice commands or text instructions.
AuT use cases with examples
Transportation
The NHTSA estimates 39,345 traffic fatalities in 2024, a 4% decrease from 40,901 in 2023. Around 30% of 2023 fatalities were due to impaired driving, which remains a leading cause of accidents.3 AuT systems are expected to match or eventually surpass human driving performance, improving transportation safety.
The transportation sector is now the main testing ground for autonomous vehicle technology. Commercial robotaxi services are already operating in several cities, generating significant revenue. Leading autonomous vehicle companies have moved past pilot programs and are scaling up operations, changing the way urban mobility challenges are addressed.
Real-life example:
Aurora Innovation has launched commercial driverless trucking operations in Texas, becoming the first company to operate a self-driving Class 8 trucking service on public roads with regular round-trip deliveries between Dallas and Houston. The Aurora Driver has completed over 1,200 miles without a human driver, with launch customers including Uber Freight and Hirschbach Motor Lines.
Before operations, Aurora closed its safety case and released a comprehensive safety report to demonstrate the technology’s readiness for public roads, with the system equipped with advanced sensors that can see beyond four football fields and redundant safety systems, including braking, steering, and computing.4
Real-life example:
Tesla Robotaxi is an autonomous ride-hailing service that uses Tesla vehicles, primarily the Model Y, equipped with the company’s Full Self-Driving (FSD) software. These vehicles are designed to operate without a driver under certain conditions, initially with a human safety monitor on board.
The system relies on a combination of cameras, sensors, and AI-powered neural networks to navigate roads, follow traffic rules, and autonomously complete tasks like picking up and dropping off passengers.
It works by leveraging Tesla’s FSD software, which processes real-time data from the vehicle’s surroundings to make driving decisions. Riders can hail a Robotaxi via an app, and the car autonomously drives to the pickup point and transports the passenger to their destination.
Tesla plans to enhance this with features like automatic Supercharger parking and eventually move toward vehicles without steering wheels or pedals, like the upcoming Cybercab.
This service represents a significant step toward fully autonomous transportation. It could reduce traffic accidents caused by human error, lower transportation costs, increase mobility access for non-drivers, and shift the landscape of urban transportation. It also plays a vital role in the broader autonomous vehicles market and reflects ongoing technological developments in AI-driven mobility.5
Military
Military autonomous systems are the fastest-growing segment in the global drone market, driven by rapid increases in defense spending as nations recognize the strategic value of unmanned systems. These systems focus on force protection and operational efficiency, meeting the unique demands of modern warfare.
The use of autonomous systems in the military has accelerated, with the US Pentagon’s Replicator Initiative, announced in August 2023, aiming to deploy thousands of autonomous weapons within 18-24 months.6 Additionally, the Air Force’s Collaborative Combat Aircraft program has allocated $392 million in its FY2024 budget, with $5.4 billion projected over four years for developing and deploying these systems.7
Real-life example:
The US Navy is deploying two cost-effective anti-drone interceptors on destroyers protecting the USS Gerald R. Ford carrier strike group. These systems include Anduril’s Roadrunner-M ($500,000) and Raytheon’s Coyote Block 2 ($125,000).
The deployment addresses the “cost-curve problem” of using multimillion-dollar missiles against inexpensive enemy drones. These turbojet-powered loitering munitions can operate autonomously for up to one hour and target drones at ranges approaching 10 miles.
They substantially reduce defense costs compared to traditional interceptors such as the $4+ million SM-6 missiles used against Houthi threats in the Red Sea. The Roadrunner-M provides vertical takeoff and landing capability and can return to the ship for reuse if not expended.
Both systems have demonstrated effectiveness in land-based operations with the Army and Special Operations Command before being adapted for maritime deployment.8
Retail
Retail environments have embraced autonomous systems for inventory management and warehouse operations, deploying robots to address labor shortages and improve operational precision. These implementations demonstrate how autonomous entities can bring computers into physical retail spaces while maintaining customer service quality.
Real-life example:
Simbe offers Store Intelligence technology that uses autonomous robots to improve retail inventory management.
The company’s Tally robots provide real-time shelf scanning and inventory visibility to store managers. Key benefits include 98% on-shelf availability, 90% improved price integrity, and 50 hours per week of staff time redirected to customer service.
The system replaces manual stock audits with automated scans and integrates with ordering systems for precise replenishment.
Store managers report significant reductions in out-of-stock rates, with some locations achieving 50-60% decreases in stockouts. The technology helps eliminate phantom inventory issues and provides prioritized alerts for store associates to address problems quickly.9
Figure 1: Simbe’s retail robot Tally in a grocery store.10
Real-life example:
Walmart is expanding autonomous forklift technology across its distribution centers following a successful 16-month pilot program.
The company is deploying 19 Fox Robotics FoxBot autonomous forklifts across four high-tech distribution centers. These AI-powered forklifts use machine vision and dynamic planning to unload trucks and transport pallets to automated storage systems safely.
These technological developments complement existing warehouse automation rather than replacing workers. Associates are being retrained to supervise and direct the robotic systems, with some achieving three times their previous manual output.
The company emphasizes that automation benefits both business operations and employee development by reducing physical strain and creating opportunities for skill advancement.
Figure 2: Walmart’s autonomous forklift example.11
Construction
Construction is one of the most hazardous industries, and autonomous systems directly address its worker safety risks while improving project efficiency. These deployments focus on removing humans from high-risk situations while maintaining construction quality and timelines.
Real-life example:
Built Robotics has introduced autonomous dozers, excavators, and track loaders with AI guidance systems for road construction projects, significantly improving worker safety by removing humans from dangerous operations.12
Real-life example:
Boston Dynamics’ Spot robot has been deployed across multiple construction sites for scanning, progress tracking, and safety inspections. The robot can access areas too dangerous for human workers and provide detailed documentation of project progress.
Figure 3: Boston Dynamics’ Spot in construction sites.13
Healthcare robotics deployment
Healthcare facilities have embraced autonomous systems to address critical staffing challenges while improving patient care quality through precision and consistency. These implementations focus on using machines to support medical professionals rather than on replacing human interaction in patient care.
Real-life example:
Aethon provides autonomous mobile robots designed to automate routine logistics and delivery tasks in healthcare facilities. The company offers two main models: the T3, for transporting carts, and the Zena RX, for securely delivering pharmacy, laboratory, and clinical materials. These robots serve various hospital departments, including laboratory, pharmacy, environmental services, food services, linens, and general supplies.
Aethon’s solutions also include 24/7 connected support, integration with elevators and building systems, and ongoing service and maintenance.
The company emphasizes that its automation allows healthcare staff to focus on patient care, improving efficiency and service levels across medical facilities.14
Security
Autonomous things (AuT) in security perform tasks such as patrolling, surveillance, and threat detection in environments like airports, public spaces, and industrial sites.
These systems collect and process data with AI to respond to suspicious activities in real time, without direct human involvement.
Key technologies include:
- Ground-based autonomous machines
- Drone systems integrated with autonomous software
- Video analytics tools supported by mobile software
Real-life example:
Knightscope manufactures the K5 Autonomous Security Robot, a fully autonomous security solution designed to enhance safety at commercial properties, campuses, and public spaces. The K5 operates 24/7 with AI-driven threat detection, autonomous patrolling capabilities, and real-time monitoring.
The robot provides physical deterrence, continuous surveillance, and connects to Knightscope’s Security Operations Center for remote monitoring and live alerts.
The company reports significant security improvements, including a 46% reduction in crime reports, a 27% increase in arrests, and a 68% reduction in citations at client locations.
Knightscope also offers additional security products, including emergency communication devices, gunshot detection systems, and monitoring software to create comprehensive automated security solutions.15
Figure 4: The K5 Autonomous Security Robot example.16
Weather forecasting
Autonomous things are critical in weather prediction, improving the scale, frequency, and accuracy of environmental data collection. Autonomous drones and unmanned aerial vehicles (UAVs), for example, can operate independently to gather atmospheric data from remote or hazardous areas, such as hurricanes or volcanic regions, where human intervention would be risky or impossible.
Autonomous systems are also used on satellites and high-altitude balloons to collect long-term climate data. These systems can measure factors like humidity, wind speed, and temperature without manual control.
By integrating these data into larger autonomous software platforms, meteorological agencies can create more accurate forecasts and disaster models. These are crucial for emergency preparedness and infrastructure protection, where quick responses and safety are key.
The adoption of these autonomous technologies is driven by advancements in analytics and sensor miniaturization, which reduce costs and enable broader coverage.
Real-life example:
Meteomatics develops weather drones called Meteodrones that collect atmospheric data from the Earth’s boundary layer to improve weather forecasting accuracy.
The Meteodrone MM-670 can fly up to 6 kilometers in altitude and operates in challenging weather conditions, including temperatures as low as -45°C. These drones carry sensors to measure temperature, humidity, air pressure, and wind speed.
The system includes a Ground Control Station for real-time monitoring and an optional Meteobase that serves as an autonomous launch, landing, and charging station. Meteodrones offer advantages over traditional radiosondes by being reusable, controllable, and capable of hovering over specific locations.17
Agriculture
Autonomous things in agriculture handle precision seeding, pesticide spraying, and soil analysis, tasks now managed by autonomous mobile robots and drones without human direction. These smart robots can detect crop health variations using computer vision and AI algorithms, allowing timely, localized interventions.
A notable example is the use of autonomous drones for crop monitoring. These UAVs capture multispectral imagery of fields, which is then analyzed by autonomous software to assess plant health, water levels, and pest presence.
These technologies help increase crop yields, reduce chemical use, and improve irrigation efficiency, aligning with the global push for sustainable agriculture.
Additionally, autonomous tractors and self-driving harvesting vehicles are becoming key players in the autonomous vehicle market.
Supported by companies entering the agri-tech sector, these machines, once trained, operate independently across various terrains, optimizing routes and adapting to obstacles in their environment.
Real-life example:
DJI Agriculture promotes drone technology as a transformative solution for tea farming, addressing challenges faced by traditional cultivation methods.
DJI’s agricultural drones, particularly the Agras T50, offer significant advantages, including enhanced precision through advanced weighing sensors, improved efficiency with 50% faster completion times, and environmental sustainability through battery-powered operation.
For example, a 912-hectare plantation requiring 50 workers and 45 days can be completed by seven Agras T50 drones in just 21 days. The drones apply fertilizers at rates of 300-450 kg per hectare with precise control over application parameters.
The technology enhances worker safety by eliminating direct chemical exposure and operates effectively in various weather conditions, including muddy terrain after rainfall. DJI drones support the complete tea plant lifecycle through fertilizer application, pest control, and data collection for optimization.18
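The scale of this example can be sanity-checked with quick arithmetic using the figures quoted above:

```python
hectares = 912
rate_min, rate_max = 300, 450   # kg of fertilizer per hectare (from the source)

total_min = hectares * rate_min  # 273,600 kg of fertilizer at the low rate
total_max = hectares * rate_max  # 410,400 kg at the high rate

# Labor comparison from the same example
worker_days = 50 * 45   # 2,250 worker-days with manual crews
drone_days = 7 * 21     # 147 drone-days with Agras T50s
```

Even allowing for drone operators and battery logistics, the plantation moves from thousands of worker-days to roughly 150 drone-days per application cycle.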
The challenges of AuT technologies
Safety and reliability
Ensuring that autonomous things demonstrate acceptable safety in real-world environments is a significant concern. Autonomous systems must make accurate decisions in complex, dynamic, and sometimes unpredictable settings.
For instance, self-driving cars may face sudden obstacles or changing traffic conditions. Any misjudgment by the system can lead to motor vehicle crashes or harm to people, particularly when autonomous drones or robots operate in crowded or hazardous areas.
Solution:
Improving the safety of autonomous things requires multiple strategies. First, sensor redundancy and real-time data fusion can help the system detect objects and changes in the physical environment more accurately.
Second, testing in both simulations and real-world conditions must become standard, particularly under edge cases. AI algorithms should be designed to learn from past failures and environmental data to improve future performance.
Also, regulatory bodies need to define clear safety benchmarks and oversee testing protocols before wide-scale deployment.
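Sensor redundancy is often paired with simple voting logic so that a single faulty sensor can neither trigger a false response nor mask a real hazard. A minimal majority-vote sketch, with hypothetical thresholds and readings:

```python
def redundant_detect(readings, threshold_m=2.0, quorum=2):
    """Triple-modular-style redundancy: declare an obstacle only if at
    least `quorum` of the independent sensors agree, masking a single
    faulty reading.

    readings: distance reports (metres) from redundant sensors
    """
    votes = sum(1 for r in readings if r < threshold_m)
    return votes >= quorum

# One sensor has failed and reports a huge distance; the other two agree
obstacle = redundant_detect([1.4, 1.5, 99.0])  # True: 2 of 3 vote "obstacle"
```

Production systems combine this kind of voting with the data-fusion and simulation testing strategies described above, rather than relying on any single mechanism.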
Human trust and interaction
Many users hesitate to rely on autonomous devices due to unclear decision-making processes and limited transparency. Whether it involves personal robots, autonomous smart home devices, or self-driving vehicles, human trust is undermined when systems behave unpredictably or lack clear interfaces.
This challenge becomes more critical in systems that require close human interaction or handovers, such as autonomous driving in mixed traffic environments.
Solution:
Designing systems that communicate their intentions and status clearly can help build user confidence.
Explainable AI methods can allow users to understand why a system made a particular decision. Moreover, introducing autonomy in stages (starting with partial autonomy under human direction and moving toward full autonomy) allows users to build trust gradually. This approach creates a sustainable feedback cycle between user experience and system development.
Cybersecurity and data protection
Autonomous systems that rely on connectivity are vulnerable to cybersecurity threats. A compromised autonomous vehicle or drone robot could be redirected, disabled, or used maliciously. Unauthorized access to sensor data or control signals can lead to physical and reputational damage.
These risks grow as autonomous systems become integrated into critical infrastructure or supply chain networks.
Solution:
Developers must implement cybersecurity measures from the design phase. This includes secure communication protocols, advanced authentication, intrusion detection systems, and encrypted storage.
Real-time threat monitoring and anomaly detection systems should be built into autonomous software. In sensitive environments, fail-safe modes and human override mechanisms must be available to prevent autonomous operation in case of a breach.
Regulatory uncertainty and legal responsibility
The rapid growth of autonomous technologies has outpaced legal and regulatory frameworks. Key questions remain unanswered:
- Who is liable when an autonomous car causes an accident?
- What standards apply to delivery vehicles operating on sidewalks?
- How do Federal Aviation Administration regulations apply to autonomous drones flying over private property?
The absence of clear laws hinders investment and deployment, particularly in regions with complex legal environments.
Solution:
Governments, industry leaders, and researchers must collaborate to develop adaptive and sector-specific regulations. These regulations should define accountability models for autonomous operation, set testing and certification standards, and support international alignment where cross-border systems are involved.
In high-risk sectors, regulatory sandboxes can allow testing of new autonomous entities under controlled conditions. Prompt regulatory developments are critical to ensure safety, clarify responsibilities, and support innovation.
Environmental adaptability
Many autonomous things perform well in structured environments but struggle in variable or unstructured conditions. For example, roadside sensor data may become unreliable during heavy rain or snow, reducing the accuracy of autonomous vehicles.
Similarly, autonomous mobile robots used in agriculture may encounter unpredictable terrain, weather, or obstacles that require complex navigation.
Solution:
To improve adaptability, developers are integrating advanced perception systems and machine learning models that adjust behavior based on environmental context.
Swarm intelligence and multi-agent coordination can allow groups of robots or drones to handle tasks cooperatively and more flexibly. Designing systems with physical and contextual awareness enables them to perform specific tasks autonomously, even under non-ideal conditions.
High costs and infrastructure needs
The development, deployment, and maintenance of autonomous machines can be expensive. While reduced sensor costs lead to lower hardware prices, many systems still require specialized infrastructure such as high-definition maps, reliable 5G connectivity, or remote pilot control centers.
These requirements can be a barrier in regions without advanced infrastructure or in companies with limited budgets.
Solution:
Cost barriers can be addressed through scalable system designs, modular hardware, and open-source software frameworks that reduce development time. Public-private partnerships can support infrastructure investments, especially in logistics and transport.
Integrating autonomous things into existing infrastructure rather than replacing it also helps reduce costs. For example, using existing vehicles or roadside units with upgraded sensors can support autonomous driving trials without building entirely new environments.