Humanoid robots, smart devices, and autonomous driving are often cited as lucrative business use cases at the edge. But edge AI computing will liberate AI from data centers and centralized cloud servers, bringing it to manufacturing floors, operating rooms, and municipal centers, where data is processed in real time, closer to IoT devices, sensors, and intelligent systems. With low latency and autonomous decision-making, AI becomes available everywhere, enabling fully autonomous industrial facilities that could revolutionize business and life in general.
It’s a subject close to the heart of the CIO of Rockwell Automation, an Nvidia customer and partner at the forefront of edge AI computing. “The shift to decentralization of artificial intelligence from cloud-centric architectures to edge-based deployments represents more than just a technical evolution,” says Chris Nardecchia, CIO at the digital transformation provider, which showcased its Emulate3D advanced factory-scale virtual controls testing technology at Nvidia GTC last month. “It fundamentally redefines how AI capabilities integrate into every facet of our industrial and personal environments.”
This solution, which is integrated with Nvidia’s Omniverse APIs, will allow manufacturers to validate automation systems before physical deployment through virtual factory acceptance testing.
Edge AI allows enterprises to deploy AI applications in smart devices or robots in a warehouse, for instance, running compute-intensive inference and reasoning models close to the data source rather than in a public cloud or data center. This cuts response times considerably.
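The latency argument is straightforward to illustrate. The sketch below compares the stages of a cloud round trip with an on-device inference path; all timing figures are illustrative assumptions, not benchmarks of any particular hardware or network.

```python
# Illustrative per-stage latency budgets (milliseconds); the values are
# assumptions for comparison, not measurements of any specific system.
CLOUD_PATH_MS = {
    "sensor_capture": 5,
    "network_uplink": 40,    # device -> cloud
    "cloud_inference": 15,
    "network_downlink": 40,  # cloud -> device
}
EDGE_PATH_MS = {
    "sensor_capture": 5,
    "edge_inference": 25,    # a smaller accelerator, but no network hop
}

def total_latency(path: dict) -> int:
    """Sum the per-stage latencies for one end-to-end decision."""
    return sum(path.values())

cloud_ms = total_latency(CLOUD_PATH_MS)
edge_ms = total_latency(EDGE_PATH_MS)
print(f"cloud round trip: {cloud_ms} ms, on-device: {edge_ms} ms")
```

Even with a slower local accelerator, removing the two network hops dominates the budget, which is why control loops in robotics and industrial automation favor on-device inference.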
At its recent conference, Nvidia continued its ambitious push to the edge with a bevy of advanced AI hardware, software platforms, and developer frameworks, including the Jetson Orin, Xavier, and Nano platforms; Blackwell Ultra AI chips enhanced for advanced applications at the edge; the GR00T N1 AI robotics model for autonomous machines; the IGX Orin industrial-grade edge AI platform for industrial and medical needs; and the Nvidia AI Data Platform to enable data analytics at the edge.
Nvidia’s core EGX Enterprise Edge AI platform, in particular, facilitates real-time AI workloads for healthcare, manufacturing, and retail industries while its Metropolis platform powers video analytics at the edge for smart cities.
The Jetson Nano Super workstation, introduced at Nvidia GTC, will offer business users powerful AI capabilities at their fingertips in remote offices or within business centers. And its Clara for healthcare, Drive for autonomous vehicles, and Aerial for 5G networks will also offer real-time monitoring, predictive maintenance, and process optimization to reduce downtime and improve system performance throughout the entire lifecycle of industrial assets in the field.
One analyst says Nvidia’s lineup of platforms and products for Physical AI, such as Omniverse for industrial digitization, underscores the vendor’s expanding identity from simply a semiconductor manufacturer to a hardware, platforms, tools, and frameworks provider.
“The market isn’t understanding the implications of what Nvidia is doing here,” says Chirag Dekate, chief AI analyst at Gartner. “It’s truly breathtaking. The combinations of EGX and Jetson, plus Nvidia’s Cosmos platform, where you can develop physical AI-like environments that combine the best of AI and digital twins, create an environment that helps accelerate training of intelligence that can be deployed at the edge. And it’s at the edge where they’re now transforming our robotics, smart robotics, AVs, and humanoid robotics. They’re spinning off a new growth vector, just like they did in the data center with GPUs.”
Sequence of events
The market first embraced gen AI for content creation and then moved into agentic AI to enable models that can reason and perform tasks. But the heart of the industrial AI revolution is physical AI or AI-enabled robotics, which can realize fully autonomous industrial facilities, with much of it deployed at the edge.
Rockwell’s autonomous mobile robots (AMRs), for instance, demonstrate measurable impacts in throughput, labor optimization, and time savings. The robots serve as mobile edge computing platforms, processing sensor data locally while feeding aggregated insights to the company’s FactoryTalk Edge Manager, Nardecchia says.
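The local-processing pattern Nardecchia describes can be sketched in a few lines: raw sensor samples are reduced on the device, and only a compact summary travels upstream. The function name, fields, and alarm threshold below are illustrative assumptions, not FactoryTalk APIs.

```python
from statistics import mean

def summarize(readings: list, alarm_threshold: float) -> dict:
    """Reduce raw sensor samples to a compact aggregate for upstream reporting.

    The full sample stream stays on the edge device; only this small
    summary would be forwarded to a central edge manager.
    """
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alarm": max(readings) > alarm_threshold,
    }

# e.g. a burst of simulated vibration samples from an AMR wheel motor
samples = [0.8, 0.9, 1.1, 0.7, 3.2, 0.9]
report = summarize(samples, alarm_threshold=2.5)
print(report)
```

The payload sent upstream is four fields instead of the whole sample stream, which is the bandwidth-optimization argument made throughout this piece.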
Nvidia’s platforms for edge computing will expand, enrich, and expose industrial data, creating new sources of value through applications and analytics. This capability becomes particularly powerful when combined with physical AI and agentic AI, enabling truly autonomous systems that can sense, decide, and act with minimal human intervention, he says, adding that Rockwell’s strategic acquisitions of Otto Motors and Clearpath Robotics have positioned the company well for production logistics automation.
Edge AI transforming industrial digitization
According to a recent IDC forecast spanning 27 enterprise industries, global spending on edge computing solutions will reach nearly $261 billion this year and is projected to grow at a 13.8% CAGR to $380 billion by 2028.
“Edge computing is poised to redefine how businesses leverage real-time data, and its future hinges on tailored, industry-specific solutions that address unique operational demands,” says Dave McCarthy, research VP, cloud and edge services at IDC. “We’re seeing service providers double down on investments, building out low-latency networks, enhancing AI-driven edge analytics, and forging partnerships to deliver scalable and secure infrastructure. These efforts are critical to realizing the full potential of edge computing, enabling everything from smarter manufacturing floors to responsive healthcare systems, and ultimately driving a new wave of innovation across verticals.”
As a result, CIOs are planning their next-generation AI architectures with powerful platforms, tools, and frameworks to create robots and IoT devices with such autonomous decision-making capabilities in mind.
“CIOs are definitely planning on AI for edge workloads,” says Nate Melby, CIO of Dairyland Power Cooperative, who’s eyeing such advancements to manage power grids during storms and to enable systems with speedy analysis and decision-making in hazardous environments. He also anticipates new business opportunities and profitable outcomes from using edge AI devices in other physical environments that restrict human presence.
“By pushing AI to the edge, we can take advantage of less dependence on centralized architecture to build resilience, and create easier expansion and resource flexibility by balancing cloud resources with local devices to optimize and process more sensitive or mission critical data locally,” Melby says. “But it’s going to take some time for it to evolve.”
Compounding interest in the edge
Many top cloud and AI vendors including OpenAI, Google, Amazon, and innovative AI startups are targeting the edge. Cloud provider Oracle, for instance, recently added a GPU-optimized configuration to its Oracle Roving Edge Device. “We’re seeing customer demand for edge computing AI,” says Dave Rosenberg, SVP, field and industry marketing at Oracle Cloud Infrastructure.
Insight Enterprises’ CTO of product innovation, Amol Ajgaonkar, adds that many industries besides manufacturing will make use of edge AI, but it won’t be easy. If the edge is defined as anything that’s not in the cloud — a laptop, a machine on the manufacturing floor, a point of sale device in a retail store — then industries like healthcare, retail, and finance are prime targets for edge AI, he says.
“With AI at the edge, a big challenge is deciding what data is good or bad for the task at hand, and setting up the process so an AI agent or group of agents can manage without needing a human constantly in the loop,” says Ajgaonkar. “When working to create a predictive model, such as to manage ongoing maintenance of a machine on the factory floor, certain biases or malformed data without any filtering process can affect the model and skew the agents’ resulting actions. Clean data in will always be critical for a clean output, but it’s a delicate balance.”
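A minimal illustration of the filtering step Ajgaonkar describes: dropping physically impossible or missing samples before they reach a predictive-maintenance model. The value ranges and sensor type are assumptions chosen for the example, not any vendor’s schema.

```python
def filter_valid(readings, lo=-40.0, hi=150.0):
    """Drop missing or physically impossible samples before model training.

    Assumes a temperature sensor whose plausible operating range is
    [lo, hi] degrees Celsius; anything outside is treated as malformed.
    """
    return [r for r in readings if r is not None and lo <= r <= hi]

raw = [22.5, None, 23.1, 999.0, -273.0, 22.8]
clean = filter_valid(raw)
print(clean)  # only the plausible temperature samples survive
```

In practice this gate would run on the edge device itself, so malformed readings never skew the model or the agents acting on its output.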
And Tom Richer, CEO of AI consultancy Intelagen and a former CIO, says he advises CIOs to closely monitor Nvidia’s advancements due to its dominance in AI infrastructure, data center transformation, and edge AI capabilities, all of which directly impact an organization’s ability to innovate and compete.
“The increasing adoption of edge computing for AI workloads, driven by the need for low latency, bandwidth optimization, and enhanced security, necessitates a strategic decision between on-premises and service provider deployments, with a hybrid approach often proving most effective to balance control and scalability,” he says. “And that requires CIOs to develop clear strategies, invest in necessary infrastructure, and stay abreast of evolving technologies.”
Getting an edge from the edge
The implementation of physical AI requires powerful edge computing capabilities that process sensor data and execute complex algorithms with minimal latency. Edge computing decentralizes data processing by bringing it closer to data sources, resulting in faster response times, improved data transmission efficiency, and enhanced security — all critical factors for applications in robotics and industrial automation, Rockwell’s Nardecchia explains.
As platforms and technologies continue to mature, we can expect AI to become increasingly embedded in physical systems across industrial environments.
For companies like Rockwell, this evolution represents an opportunity to integrate edge AI capabilities throughout its product portfolios. The business outcomes from properly managed edge computing are substantial, including affordable access to data, faster software deployments, future-ready analytic platforms, improved security posture, better scaling of digital transformation initiatives, and reduced TCO.
The Edge AI Foundation says CIOs and enterprises want automation and smart devices at the edge. “Edge AI is all about running AI workloads where the data is created, and the gravitational pull toward the edge means lower cost, lower power, more impact, typically, and that can also mean enhanced privacy, latency, flexibility, and clearing,” says Pete Bernard, the nonprofit’s CEO, noting that CIOs are in charge of figuring out the information strategy. “You want to move your compute as close as possible to where the data is created, avoid ingress and egress fees to clouds as well as OpEx costs, and have more control over your processing in general.”
“The rise of foundation models is rippling toward the edge through distilled and quantized transformers, as well as small foundation models,” says Paul Golding, VP of edge AI at Analog Devices. “This shift demands dense, compute-intensive infrastructure at the edge. Simultaneously, the need for real-time processing, low latency, and privacy is driving AI closer to the source of data — what we often call the sensor, or physical, edge. Agentic AI’s ability to autonomously learn, adapt, and act in real time will revolutionize task orchestration across heterogeneous nodes, and as we move from machine automation to machine autonomy, new forms of distributed intelligence will emerge, enabling mission-critical operations to run at the edge without relying on centralized cloud systems. The frontier of AI remains wide open.”