AIMultiple Research
We follow ethical norms & our process for objectivity.
This research is not funded by any sponsors.
Updated on Aug 14, 2025

Autonomous Things: Use Cases with Examples

Cem Dilmegani

Autonomous things (often shortened to AuT) are physical devices, such as vehicles, robots, and drones, that use onboard sensors, connectivity, and AI to perceive the physical world and autonomously complete tasks with little or no human direction.

Explore what autonomous things are and how they operate, their most common use cases with real-life examples, and the challenges businesses should look out for when investing in AuT.

Definition and core characteristics of autonomous things

Autonomous things (AuT) refer to systems that can make decisions and take actions independently, with minimal or no human supervision or control. These autonomous entities are self-sufficient devices capable of performing intelligent computations that combine both energy autonomy and inference autonomy. They possess three fundamental characteristics that distinguish them from traditional automated systems:

  1. The first characteristic involves flexible autonomous action capability, allowing these systems to adapt their behavior to environmental changes in real-time. Unlike pre-programmed automated systems that follow fixed routines, autonomous machines can modify their actions based on new circumstances or unexpected obstacles. This adaptability enables them to operate effectively in dynamic environments where conditions frequently change.
  2. The second defining trait is decision-making autonomy, which represents the capacity to choose actions based on sensory input and predetermined objectives. These systems analyze multiple data streams simultaneously, weigh different options, and select the most appropriate course of action. This capability enables autonomous systems to handle situations not explicitly programmed into their original software.
  3. Self-governance is the third key characteristic, enabling operation with little to no human interaction. While these systems may have human direction for high-level goals or safety oversight, they execute most operational decisions independently. This independence allows them to bring computers and artificial intelligence into physical environments where continuous human control would be impractical or impossible.

Types of autonomous things

The autonomous things landscape encompasses four primary categories, each serving different operational domains and use cases.

Autonomous vehicles

Autonomous vehicles represent the most visible category, spanning from self-driving cars operating on public roads to unmanned aerial vehicles conducting surveillance missions. The Society of Automotive Engineers has established a classification system ranging from Level 0 (manual driving) to Level 5 (full automation), with current deployments primarily focused on Levels 2 and 3.1

Here is the three-layer architecture of autonomous vehicles:2

  • Mission Planning Layer serves as the highest-level decision maker, handling route selection and task assignment. This layer uses graph search algorithms like Dijkstra’s or A* to navigate road networks, representing a fundamental component of AuT technology that allows vehicles to plan long-distance journeys autonomously.
  • Behavioral Planning Layer acts as the critical decision-making component that ensures vehicles follow traffic rules and interact safely with other road users. This layer addresses one of the main existing safety issues by implementing Finite State Machines (FSMs) that can predict dangerous situations and respond appropriately, helping overcome concerns about autonomous vehicle safety.
  • Motion Planning Layer generates real-time collision-free paths and trajectories. This component must demonstrate acceptable safety while operating in dynamic environments, using advanced algorithms like RRT (Rapidly-exploring Random Trees) and sampling-based methods to ensure vehicles can autonomously complete tasks while avoiding obstacles.
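To make the mission-planning step concrete, here is a minimal A* search over a toy road graph. The network, coordinates, and edge costs below are invented for illustration, not drawn from any real deployment:

```python
import heapq

def a_star(graph, coords, start, goal):
    """A* shortest-path search over a road network.
    graph: {node: [(neighbor, edge_cost), ...]}
    coords: {node: (x, y)} used for the straight-line heuristic."""
    def h(n):  # admissible heuristic: Euclidean distance to the goal
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    frontier = [(h(start), 0.0, start, [start])]
    best = {}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in best and best[node] <= g:
            continue  # already reached this node more cheaply
        best[node] = g
        for nxt, cost in graph.get(node, []):
            heapq.heappush(frontier, (g + cost + h(nxt), g + cost, nxt, path + [nxt]))
    return None, float("inf")

# Toy road network: intersections A-D with edge lengths.
roads = {"A": [("B", 1.0), ("C", 4.0)], "B": [("C", 1.0), ("D", 5.0)], "C": [("D", 1.0)]}
points = {"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (3, 0)}
route, dist = a_star(roads, points, "A", "D")  # route A -> B -> C -> D, cost 3.0
```

Production route planners search far larger graphs with traffic-aware costs, but the priority-queue structure is the same.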

Core competency modules

Perception Module represents the sensory foundation of autonomous systems, utilizing LIDAR, cameras, and roadside sensor data to understand the environment. Technological developments in sensor fusion have improved environmental awareness, while reduced sensor costs have made these technologies more accessible to autonomous vehicle companies and researchers.

The perception system processes point clouds, performs object detection, and enables localization, all of which are critical for autonomous entities to navigate safely without human direction. Modern deep learning approaches have transformed object recognition capabilities, though critical challenges remain in handling diverse weather conditions and complex urban scenarios.

Planning Module encompasses the decision-making hierarchy that transforms perception data into actionable plans. This module must handle situations that cannot be fully anticipated in advance, balancing efficiency with safety while operating in the physical world. The planning system represents the core autonomous software that enables vehicles to make complex decisions in real-time.

Mission planning uses graph search over road networks, while behavioral planning implements rule-following logic through FSMs. Motion planning generates safe trajectories using advanced algorithms that can adapt to dynamic environments, representing AuT technologies that bring computers into direct interaction with the physical environment.
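The FSM-based behavioral logic can be sketched as a simple transition table. The states and events below (`lane_keep`, `gap_clear`, and so on) are hypothetical names chosen for illustration, not taken from any production stack:

```python
# Minimal behavioral-planning FSM: (state, event) -> next state.
TRANSITIONS = {
    ("lane_keep", "slow_leader"): "prepare_lane_change",
    ("lane_keep", "red_light"): "stop",
    ("prepare_lane_change", "gap_clear"): "lane_change",
    ("prepare_lane_change", "gap_blocked"): "lane_keep",
    ("lane_change", "change_done"): "lane_keep",
    ("stop", "green_light"): "lane_keep",
}

def step(state, event):
    """Return the next behavior state; stay put on unrecognized events."""
    return TRANSITIONS.get((state, event), state)

state = "lane_keep"
for event in ["red_light", "green_light", "slow_leader", "gap_clear", "change_done"]:
    state = step(state, event)
# state ends as "lane_keep" after the stop and the completed lane change
```

Real behavioral planners layer prediction and cost functions on top of such rule tables, but the explicit state/event structure is what makes the vehicle's rule-following auditable.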

The Control Module executes planned trajectories through precise vehicle control systems. This component uses Model Predictive Control (MPC) and geometric tracking algorithms to translate high-level plans into steering, throttle, and brake commands. The control system must maintain acceptable safety standards while ensuring smooth vehicle operation.
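As a simplified stand-in for the MPC and geometric tracking algorithms mentioned above, here is a minimal pure pursuit steering computation, a classic geometric tracker. The vehicle pose and lookahead point are invented values:

```python
import math

def pure_pursuit_steer(x, y, yaw, wheelbase, target):
    """Geometric path tracking: steering angle that arcs the rear axle
    toward a lookahead point on the planned trajectory."""
    dx, dy = target[0] - x, target[1] - y
    lookahead = math.hypot(dx, dy)
    alpha = math.atan2(dy, dx) - yaw  # heading error to the lookahead point
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# Vehicle at the origin facing +x; lookahead point straight ahead -> zero steering.
straight = pure_pursuit_steer(0.0, 0.0, 0.0, wheelbase=2.7, target=(10.0, 0.0))
# Lookahead point offset to the left -> positive (leftward) steering command.
left = pure_pursuit_steer(0.0, 0.0, 0.0, wheelbase=2.7, target=(10.0, 2.0))
```

MPC goes further by optimizing steering, throttle, and brake over a prediction horizon, but this one-line geometric law already shows how a high-level trajectory becomes a concrete actuator command.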

Autonomous robots

Autonomous robots constitute another major category, including industrial manufacturing systems, service robots for healthcare and hospitality, and specialized units for search and rescue operations.

These systems demonstrate varying degrees of autonomy, from factory robots following predetermined paths to service robots that navigate unpredictable environments while interacting with humans. Agricultural robots have emerged as a particularly successful application, with autonomous tractors and harvesting equipment addressing labor shortages while improving precision and efficiency.

Autonomous smart home devices

Autonomous smart home devices represent a growing segment that includes IoT devices with embedded artificial intelligence, autonomous sensors for environmental monitoring, and intelligent infrastructure systems.

These devices typically operate within defined spaces but make independent decisions about energy management, security responses, and environmental controls based on sensor data and learned user preferences.

Multi-agent systems

Multi-agent systems feature coordinated groups of autonomous entities working together to accomplish complex tasks. Robot swarms can perform collective operations like search and rescue missions, while autonomous sensor networks enable distributed environmental monitoring and threat detection.

These systems are not social networks in the traditional sense; instead, they form sophisticated communication networks among physical devices.

AuT architecture and technical components

Autonomous things typically employ a three-layer hierarchical architecture that enables real-time decision-making while maintaining safety and efficiency.

The reactions layer

The reactions layer handles low-level sensor data processing and immediate responses to environmental stimuli, operating with millisecond response times to ensure safety in dynamic situations.

This layer includes continuous control systems such as autopilots and motor control units that maintain basic operational parameters.

The rules layer

The rules layer manages middle-level decision-making through rule-based systems that ensure protocol compliance and safety standards. This layer interprets safety regulations, traffic laws, and operational procedures, translating them into specific behaviors and actions.

For autonomous vehicles, this layer processes traffic signals, road signs, and right-of-way rules to ensure legal and safe operation.

The principles layer

The principles layer coordinates high-level strategic decision-making and goal management, incorporating ethical considerations and long-term planning into operational choices. This layer manages route optimization, task prioritization, and resource allocation while considering broader operational objectives and constraints.
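A toy priority dispatch can illustrate how the three layers interact, with the reactions layer overriding the rules layer and the rules layer overriding the principles layer. The function and argument names below are hypothetical:

```python
def decide(obstacle_close, signal, route_plan):
    """Hypothetical priority dispatch across the three layers:
    reactions override rules, and rules override principles."""
    if obstacle_close:       # reactions layer: immediate, millisecond-scale safety response
        return "emergency_brake"
    if signal == "red":      # rules layer: protocol and traffic-law compliance
        return "stop_at_line"
    return route_plan[0]     # principles layer: follow the strategic plan
```

In a deployed system each layer runs at its own rate on its own hardware, but the safety argument rests on exactly this kind of strict override ordering.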

The processing infrastructure

The processing infrastructure relies on specialized AI chips and processors designed for machine learning inference, combined with edge computing modules that reduce latency by processing data locally rather than relying on cloud connections.

Multi-core systems enable parallel processing for real-time operations, while FPGA and ASIC implementations handle specific autonomous functions that require deterministic timing and high reliability.

Sensor systems

Sensor systems form the foundation of environmental perception, typically combining multiple technologies to achieve reliable performance across varying conditions.

Vision sensors, including cameras and stereo vision systems, provide detailed environmental information, while LiDAR systems use laser-based ranging to create precise three-dimensional maps of surroundings.

Radar systems complement visual sensors by providing reliable distance and velocity measurements in adverse weather conditions, and inertial measurement units track motion and orientation for navigation purposes.
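One common way to combine complementary sensors like these is inverse-variance weighting, sketched below with made-up camera and radar range estimates:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent range estimates.
    measurements: list of (value, variance) pairs."""
    weights = [1.0 / var for _, var in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    variance = 1.0 / sum(weights)  # fused estimate is more certain than either input
    return value, variance

# Camera: 20.0 m with variance 4.0; radar: 21.0 m with variance 1.0.
fused, var = fuse([(20.0, 4.0), (21.0, 1.0)])
# The fused estimate (20.8 m) sits closer to the more certain radar reading.
```

This is the static core of a Kalman filter update; full perception stacks add motion models and per-condition variance estimates (e.g. inflating camera variance in fog).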

The technology behind AuT

The artificial intelligence and machine learning components enable autonomous decision-making through deep neural networks for pattern recognition and feature extraction.

Computer vision algorithms handle object detection, recognition, and tracking, while decision-making frameworks process multiple inputs to select optimal actions. Natural language processing capabilities enable human interaction where required, allowing operators to communicate with autonomous systems using voice commands or text instructions.

AuT use cases with examples

Transportation

NHTSA estimates 39,345 traffic fatalities in 2024, a 4% decrease from 40,901 fatalities in 2023. Roughly 30% of 2023 fatalities involved impaired driving, one of the most dangerous factors in car accidents.3 AuT systems are expected to meet or exceed human performance in transportation safety.

The transportation sector has emerged as the primary testing ground for autonomous vehicle technology, with commercial robotaxi services now operating in multiple cities and generating substantial revenue streams. Major autonomous vehicle companies have moved beyond pilot programs to achieve operational scale, fundamentally changing how the autonomous vehicles market approaches urban mobility challenges.

Real-life example:

Aurora Innovation has launched commercial driverless trucking operations in Texas, becoming the first company to operate a self-driving Class 8 trucking service on public roads with regular round-trip deliveries between Dallas and Houston. The Aurora Driver has completed over 1,200 miles without a human driver, with launch customers including Uber Freight and Hirschbach Motor Lines.

Before operations, Aurora closed its safety case and released a comprehensive safety report to demonstrate the technology’s readiness for public roads, with the system equipped with advanced sensors that can see beyond four football fields and redundant safety systems, including braking, steering, and computing.4

Real-life example:

Tesla Robotaxi is an autonomous ride-hailing service that uses Tesla vehicles, primarily the Model Y, equipped with the company’s Full Self-Driving (FSD) software. These vehicles are designed to operate without a driver under certain conditions, initially with a human safety monitor on board. The system relies on a combination of cameras, sensors, and AI-powered neural networks to navigate roads, follow traffic rules, and autonomously complete tasks like picking up and dropping off passengers.

It works by leveraging Tesla’s FSD software, which processes real-time data from the vehicle’s surroundings to make driving decisions. Riders can hail a Robotaxi via an app, and the car autonomously drives to the pickup point and transports the passenger to their destination. Tesla plans to enhance this with features like automatic Supercharger parking and eventually move toward vehicles without steering wheels or pedals, like the upcoming Cybercab.

This service represents a significant step toward fully autonomous transportation. It could reduce traffic accidents caused by human error, lower transportation costs, increase mobility access for non-drivers, and shift the landscape of urban transportation. It also plays a vital role in the broader autonomous vehicles market and reflects ongoing technological developments in AI-driven mobility.5

Military

Military autonomous systems represent the fastest-growing segment of the global enterprise drone market, with defense spending accelerating rapidly as nations recognize the strategic advantages of unmanned systems. These deployments prioritize force protection and operational efficiency while addressing the rapidly evolving requirements of modern warfare.

Military applications of autonomous systems have accelerated significantly, with the US Pentagon’s Replicator Initiative announced in August 2023 planning to deploy multiple thousands of autonomous weapons systems within 18-24 months.6 The Air Force Collaborative Combat Aircraft program has allocated $392 million in FY2024 budget, with $5.4 billion projected over four years for the development and deployment of autonomous military systems.7

Real-life example:

The US Navy is deploying two cost-effective anti-drone interceptors on destroyers protecting the USS Gerald R. Ford carrier strike group. These systems include Anduril’s Roadrunner-M ($500,000) and Raytheon’s Coyote Block 2 ($125,000).

The deployment addresses the “cost-curve problem” of using multimillion-dollar missiles against inexpensive enemy drones. These turbojet-powered loitering munitions can operate autonomously for up to one hour and target drones at ranges approaching 10 miles.

They substantially reduce defense costs compared to traditional interceptors such as the $4+ million SM-6 missiles used against Houthi threats in the Red Sea. The Roadrunner-M provides vertical takeoff and landing capability and can return to the ship for reuse if not expended.

Both systems have demonstrated effectiveness in land-based operations with the Army and Special Operations Command before being adapted for maritime deployment.8

Retail

Retail environments have embraced autonomous systems for inventory management and warehouse operations, deploying robots to address labor shortages and improve operational precision. These implementations demonstrate how autonomous entities can bring computers into physical retail spaces while maintaining customer service quality.

Real-life example:

Simbe offers Store Intelligence technology that uses autonomous robots to improve retail inventory management.

The company’s Tally robots provide real-time shelf scanning and inventory visibility to store managers. Key benefits include 98% on-shelf availability, 90% improved price integrity, and 50 hours per week of staff time redirected to customer service.

The system replaces manual stock audits with automated scans and integrates with ordering systems for precise replenishment. Major retail clients include Wakefern Food Corp, SpartanNash, and BJ’s Wholesale Club. Store managers report significant reductions in out-of-stock rates, with some locations achieving 50-60% decreases in stockouts. The technology helps eliminate phantom inventory issues and provides prioritized alerts for store associates to address problems quickly.9


Figure 1: Simbe’s retail robot Tally in a grocery store.10

Real-life example:

Walmart is expanding autonomous forklift technology across its distribution centers following a successful 16-month pilot program.

The company is deploying 19 Fox Robotics FoxBot autonomous forklifts across four high-tech distribution centers. These AI-powered forklifts use machine vision and dynamic planning to unload trucks and transport pallets to automated storage systems safely.

These technological developments complement existing warehouse automation rather than replacing workers. Associates are being retrained to supervise and direct the robotic systems, with some achieving three times their previous manual output. The company emphasizes that automation benefits both business operations and employee development by reducing physical strain and creating opportunities for skill advancement.


Figure 2: Walmart’s autonomous forklift example.11

Construction

Construction represents one of the most hazardous industries where autonomous systems directly address the dangerous elements of worker safety while improving project efficiency. These deployments focus on removing humans from high-risk situations while maintaining construction quality and timeline requirements.

Real-life example:

Built Robotics has introduced autonomous dozers, excavators, and track loaders with AI guidance systems for road construction projects, significantly improving worker safety by removing humans from dangerous operations.12

Real-life example:

Boston Dynamics’ Spot robot has been deployed across multiple construction sites for scanning, progress tracking, and safety inspections. The robot can access areas too dangerous for human workers and provide detailed documentation of project progress.


Figure 3: Boston Dynamics’ Spot in construction sites.13

Healthcare robotics deployment

Healthcare facilities have embraced autonomous systems to address critical staffing challenges while improving patient care quality through precision and consistency. These implementations focus on enhancing machines’ capabilities to support medical professionals rather than replacing human interaction in patient care scenarios.

Real-life example:

Aethon provides autonomous mobile robots for healthcare facilities to automate routine logistics and delivery tasks. The company offers two main robot models: the T3 for cart transportation and the Zena RX for secure delivery of pharmacy, laboratory, and clinical materials. These hospital robots serve multiple departments, including laboratory, pharmacy, environmental services, food services, linens, and general supplies.

Aethon’s solutions include 24/7 connected support, elevator and building system integration, and ongoing service and maintenance.

Aethon emphasizes that their automation allows healthcare staff to focus on patient care rather than transportation tasks, leading to improved efficiency and service levels throughout medical facilities.14

Security

Autonomous things (AuT) in the security domain autonomously complete tasks such as patrolling, surveillance, and threat detection in physical environments like airports, public squares, and industrial sites. These autonomous entities collect and process data using AI, enabling real-time responses to suspicious activity without direct human interaction.

Technologies used:

  • Ground-based autonomous machines
  • Drone systems integrated with autonomous software
  • Video analytics tools supported by mobile software

Real-life example:

Knightscope manufactures the K5 Autonomous Security Robot, a fully autonomous security solution designed to enhance safety at commercial properties, campuses, and public spaces. The K5 operates 24/7 with AI-driven threat detection, autonomous patrolling capabilities, and real-time monitoring.

The robot provides physical deterrence, continuous surveillance, and connects to Knightscope’s Security Operations Center for remote monitoring and live alerts.

The company reports significant security improvements, including a 46% reduction in crime reports, a 27% increase in arrests, and a 68% reduction in citations at client locations. Knightscope also offers additional security products, including emergency communication devices, gunshot detection systems, and monitoring software to create comprehensive automated security solutions.15


Figure 4: The K5 Autonomous Security Robot example.16

Weather forecasting

Autonomous things play an increasingly central role in weather prediction by enhancing the scale, frequency, and accuracy of data collection in the physical environment. Autonomous drones and unmanned aerial vehicles (UAVs), for instance, can operate autonomously to gather atmospheric data from remote or hazardous locations, such as hurricanes or volcanic regions, where human intervention is risky or impossible.

Autonomous systems are also employed on satellites and high-altitude balloons to collect long-term climate observations. These systems can autonomously complete tasks such as measuring humidity, wind speed, and temperature without manual control.

By integrating their outputs into larger autonomous software platforms, meteorological institutions can produce more accurate forecasts and disaster models. This is particularly important for emergency preparedness and infrastructure protection, where prompt responses are required and demonstrating acceptable safety becomes vital.

The deployment of such autonomous technologies is supported by advancements in analytical capabilities and sensor miniaturization, which reduces sensor costs and leads to broader coverage.

Real-life example:

Meteomatics develops weather drones called Meteodrones that collect atmospheric data from the Earth’s boundary layer to improve weather forecasting accuracy.

The Meteodrone MM-670 can fly up to 6 kilometers in altitude and operates in challenging weather conditions, including temperatures as low as -45°C. These drones carry sensors to measure temperature, humidity, air pressure, and wind speed.

The system includes a Ground Control Station for real-time monitoring and an optional Meteobase that serves as an autonomous launch, landing, and charging station. Meteodrones offer advantages over traditional radiosondes by being reusable, controllable, and capable of hovering over specific locations.17

Agriculture

Autonomous things in agriculture include precision seeding, pesticide spraying, and soil analysis, which are now managed by autonomous mobile robots and drone robots without the need for human direction. These smart robots can detect crop health variations using computer vision and AI algorithms, allowing for timely and localized interventions.

A key example is the use of autonomous drones in crop monitoring. These UAVs scan fields to collect multispectral imagery, which is then analyzed by autonomous software to evaluate plant health, water levels, and pest presence.

These technologies contribute to higher yields, reduced chemical usage, and more efficient irrigation, goals aligned with the global demand for sustainable agriculture.

Autonomous tractors and self-driving vehicles used in harvesting are part of the autonomous vehicles market, increasingly supported by autonomous vehicles companies entering the agri-tech space. These machines, once trained, operate autonomously across diverse terrains, optimizing routes and adjusting behavior in response to other objects or obstacles in the physical environment.

Real-life example:

DJI Agriculture promotes drone technology as a transformative solution for tea farming, addressing challenges faced by traditional cultivation methods.

DJI’s agricultural drones, particularly the Agras T50, offer significant advantages, including enhanced precision through advanced weighing sensors, improved efficiency with 50% faster completion times, and environmental sustainability through battery-powered operation. For example, a 912-hectare plantation requiring 50 workers and 45 days can be completed by seven Agras T50 drones in just 21 days. The drones apply fertilizers at rates of 300-450 kg per hectare with precise control over application parameters.

The technology enhances worker safety by eliminating direct chemical exposure and operates effectively in various weather conditions, including muddy terrain after rainfall. DJI drones support the complete tea plant lifecycle through fertilizer application, pest control, and data collection for optimization.18

Leading companies in the AuT landscape

Updated at 08-14-2025
Category | Company | Product(s)
Autonomous Ride-hailing | Waymo | Waymo One, Waymo Driver
Autonomous Ride-hailing | Tesla | Autopilot, Full Self-Driving
Autonomous Ride-hailing | General Motors | Super Cruise, Cruise AVs
Manufacturing Automation | FANUC | 6-axis robots, SCARA robots
Manufacturing Automation | ABB Robotics | IRB series, collaborative robots
Manufacturing Automation | Universal Robots | UR series collaborative robots
Military and Defense | General Atomics | MQ-9 Reaper, Gray Eagle
Military and Defense | Northrop Grumman | RQ-4 Global Hawk, Fire Scout
Military and Defense | DJI | Phantom, Mavic, Matrice
Military and Defense | Anduril Industries | Lattice AI platform
Smart Home Automation | Amazon | Alexa ecosystem, Echo devices
Smart Home Automation | Google Nest | Thermostats, cameras, Hub
Smart Home Automation | Samsung | SmartThings Hub, sensors
Agriculture | John Deere | AutoTrac, ExactEmerge, autonomous tractors
Agriculture | Carbon Robotics | LaserWeeder
Construction | Built Robotics | RPD 35, RPS 25 pile drivers
Construction | Construction Robotics | SAM100, MULE robots
Healthcare | Intuitive Surgical | da Vinci surgical system
Healthcare | Stryker | Mako robotic arm
Security | Knightscope | K5 security robot

The challenges of AuT technologies

Safety and reliability

Ensuring that autonomous things demonstrate acceptable safety in real-world environments is a significant concern. Autonomous systems must make accurate decisions in complex, dynamic, and sometimes unpredictable settings. For instance, self-driving cars may face sudden obstacles or changing traffic conditions. Any misjudgment by the system can lead to motor vehicle crashes or harm to people, particularly when autonomous drones or robots operate in crowded or hazardous areas.

Solution:

Improving the safety of autonomous things requires multiple strategies. First, sensor redundancy and real-time data fusion can help the system detect objects and changes in the physical environment more accurately.

Second, testing in both simulations and real-world conditions must become standard, particularly under edge cases. AI algorithms should be designed to learn from past failures and environmental data to improve future performance.

Also, regulatory bodies need to define clear safety benchmarks and oversee testing protocols before wide-scale deployment.

Human trust and interaction

Many users hesitate to rely on autonomous devices due to unclear decision-making processes and limited transparency. Whether it involves personal robots, autonomous smart home devices, or self-driving vehicles, human trust is undermined when systems behave unpredictably or lack clear interfaces. This challenge becomes more critical in systems that require close human interaction or handovers, such as autonomous driving in mixed traffic environments.

Solution:

Designing systems that communicate their intentions and status clearly can help build user confidence.

Explainable AI methods can allow users to understand why a system made a particular decision. Moreover, introducing autonomy in stages (starting with partial autonomy under human direction and moving toward full autonomy) allows users to build trust gradually. This approach creates a sustainable feedback cycle between user experience and system development.

Cybersecurity and data protection

Autonomous systems that rely on connectivity are vulnerable to cybersecurity threats. A compromised autonomous vehicle or drone robot could be redirected, disabled, or used maliciously. Unauthorized access to sensor data or control signals can lead to physical and reputational damage. These risks grow as autonomous systems become integrated into critical infrastructure or supply chain networks.

Solution:

Developers must implement cybersecurity measures from the design phase. This includes secure communication protocols, advanced authentication, intrusion detection systems, and encrypted storage.

Real-time threat monitoring and anomaly detection systems should be built into autonomous software. In sensitive environments, fail-safe modes and human override mechanisms must be available to prevent autonomous operation in case of a breach.
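A minimal illustration of such anomaly detection is a z-score check against a known-good baseline window. The telemetry values below are invented; production monitors would use streaming statistics and per-signal thresholds:

```python
from statistics import mean, stdev

def flag_anomalies(baseline, new_readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from a known-good baseline window."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in new_readings if abs(x - mu) > threshold * sigma]

# Baseline telemetry (e.g. actuator current in amps) vs. fresh readings.
baseline = [10.0, 10.1, 9.9, 10.2, 9.8]
suspicious = flag_anomalies(baseline, [10.05, 50.0, 9.9])  # flags the 50.0 spike
```

A flagged spike like this would be exactly the kind of event that should trigger a fail-safe mode or human override rather than continued autonomous operation.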

Legal and regulatory gaps

The rapid growth of autonomous technologies has outpaced legal and regulatory frameworks. Key questions remain unanswered: Who is liable when an autonomous car causes an accident? What standards apply to delivery vehicles operating on sidewalks? How do Federal Aviation Administration regulations apply to autonomous drones flying over private property? The absence of clear laws hinders investment and deployment, particularly in regions with complex legal environments.

Solution:

Governments, industry leaders, and researchers must collaborate to develop adaptive and sector-specific regulations. These regulations should define accountability models for autonomous operation, set testing and certification standards, and support international alignment where cross-border systems are involved.

In high-risk sectors, regulatory sandboxes can allow testing of new autonomous entities under controlled conditions. Prompt regulatory developments are critical to ensure safety, clarify responsibilities, and support innovation.

Environmental adaptability

Many autonomous things perform well in structured environments but struggle in variable or unstructured conditions. For example, roadside sensor data may become unreliable during heavy rain or snow, reducing the accuracy of autonomous vehicles. Similarly, autonomous mobile robots used in agriculture may encounter unpredictable terrain, weather, or obstacles that require complex navigation.

Solution:

To improve adaptability, developers are integrating advanced perception systems and machine learning models that adjust behavior based on environmental context.

Swarm intelligence and multi-agent coordination can allow groups of robots or drones to handle tasks cooperatively and more flexibly. Designing systems with physical and contextual awareness enables them to perform specific tasks autonomously, even under non-ideal conditions.

High costs and infrastructure needs

The development, deployment, and maintenance of autonomous machines can be expensive. While falling sensor costs have lowered hardware prices, many systems still require specialized infrastructure such as high-definition maps, reliable 5G connectivity, or remote pilot control centers.

These requirements can be a barrier in regions without advanced infrastructure or in companies with limited budgets.

Solution:

Cost barriers can be addressed through scalable system designs, modular hardware, and open-source software frameworks that reduce development time. Public-private partnerships can support infrastructure investments, especially in logistics and transport.

Integrating autonomous things into existing infrastructure rather than replacing it also helps reduce costs. For example, using existing vehicles or roadside units with upgraded sensors can support autonomous driving trials without building entirely new environments.

