Analysts predict the next phase of enterprise AI will usher in agentic systems that require minimal human intervention, with 75% of CIOs increasing their AI budgets this year alone, according to a recent Gartner report. As gen AI becomes embedded in more devices, endowing it with autonomous decision-making will depend on access to real-time data and on avoiding excessive cloud costs. This is where edge computing comes in.
By processing data closer to the source, edge computing can enable quicker decisions and reduce costs by minimizing data transfers, making it an alluring environment for AI. “Edge computing is progressing rapidly, evolving from a promising concept into a critical tool for many industries,” says Theresa Payton, former White House CIO and founder of cybersecurity company Fortalice Solutions. “By 2025, edge computing will become even more widespread, particularly as AI and IoT expand.”
Dairyland Power Cooperative, a Wisconsin-based electricity supply company, for example, has turned to gen AI to improve optimization and performance of infrastructure in the field. “AI makes edge computing more relevant to CIOs because it helps us reduce delays in processing data. And in situations where we’re aiming for real-time processing, this can be a huge advantage,” says Nate Melby, the company’s VP and CIO.
Others agree that increased edge processing will naturally emerge alongside the rollout of AI at large. “As AI applications proliferate in mission-critical enterprise use cases, some of those AI applications will move to the edge,” says Robert Blumofe, EVP and CTO at distributed cloud computing platform company Akamai. “Edge computing can reduce latency, lower cost, and lower data exposure risks.”
But gen AI in the enterprise has seen incredible hype with few value-added use cases to show for it, prompting analysts to pop the bubble and some tech leaders to pull the plug. The recent deceleration in interest around AI has Tim Crawford, CIO Strategic Advisor at AVOA, cautioning leaders to make sensible investments. Still, he sees plenty of efficiency-driven opportunities for gen AI, along with certain edge applications, that make the area worth exploring.
CIOs on the edge
Interest in edge computing has snowballed in recent years, with worldwide spending up an estimated 14% in 2024, according to IDC. A big driver of that interest is the move to put AI into practical use, which requires a combination of low latency and privacy, where edge computing excels.
“We’ve seen edge computing use expand rapidly,” says Blumofe. “We use it for our own applications, and customers are increasingly adopting it for theirs.” And, looking to 2025, more CIOs plan to implement AI on the edge.
“Next year, our organization is looking to leverage edge computing to help enhance operational decision intelligence and improve our trajectory toward intelligent assets,” adds Melby. In his field, AI and edge computing are becoming necessary to realize the next generation of highly intelligent industrial digital operations. It’s foundational for a new, networked, and dynamic energy ecosystem, he says.
And according to Fortalice Solutions’ Payton, moving data processing closer to where it’s created is especially beneficial for applications that require immediate action and real-time insights, whether in retail, manufacturing, or customer experiences. “Organizations that prioritize real-time decision-making and data processing should plan to embrace edge computing in their roadmaps for 2025 and beyond,” she says.
AI and edge, hand in hand
As edge computing is all about real-time data processing at the end-point where data is gathered and needs to be processed, AI becomes a clear ally, says Antonio Vázquez, CIO of software company Bizagi. “AI can contribute to solving those issues that slowed down adoption of the technology in the past by bringing additional efficiency in terms of data transfer, scalability, security, and cost.”
Operational gains make it worth considering as well. “AI makes edge computing more relevant to CIOs by enabling real-time, intelligent data processing at the network edge,” adds Chetan Venkatesh, CEO and co-founder of computing company Macrometa. For him, the combination unlocks performance gains, an enhanced user experience, a new application delivery method, and better resilience. Data privacy, a contentious topic for AI systems, is an added benefit. “Processing sensitive data locally addresses growing concerns about data sovereignty and compliance,” he says.
Leveling up how and where data is processed also equates to positive business outcomes. “AI makes edge computing highly relevant to CIOs because it allows businesses to process and analyze data closer to where it’s generated,” says Payton. “As AI continues to evolve, its dependence on rapid data processing makes edge computing not just beneficial but essential for competitive advantage.”
Use cases for AI on the edge
AI inference could be placed on a device, on-premise, or in the cloud, but the edge shines for numerous scenarios where speed and privacy matter. “Edge AI allows for instant decision-making where it matters most — close to the data source,” says Venkatesh. “This opens up use cases that weren’t possible before.”
Many user-facing situations could benefit from edge-based AI. Payton emphasizes facial recognition technology, real-time traffic updates for semi-autonomous vehicles, and data-driven enhancements on connected devices and smartphones as possible areas. “In retail, AI can deliver personalized experiences in real-time through smart devices,” she says. “In healthcare, edge-based AI in wearables can alert medical professionals immediately when it detects anomalies, potentially saving lives.”
And a clear win for AI and edge computing is within smart cities, says Bizagi’s Vázquez. There are numerous ways AI models at the edge could help beyond simply controlling traffic lights, he says, such as citizen safety, autonomous transportation, smart grids, and self-healing infrastructures. To his point, experiments with AI are already being carried out in cities such as Bahrain, Glasgow, and Las Vegas to enhance urban planning, ease traffic flow, and aid public safety.
Self-administered, intelligent infrastructure is certainly top of mind for Dairyland’s Melby since efforts within the energy industry are underway to use AI to meet emission goals, transition into renewables, and increase the resilience of the grid. “We’re trying to embrace more flexible energy exchanges, distribute the generation of energy, and blend together many resources in a real-time operation,” he says. “By leveraging AI and edge computing, we could effectively de-risk some complex operational decisions by establishing machine decisions with clear and predictable boundaries.” One specific area is selecting and balancing multiple energy sources, like wind, solar, or battery storage, based on cost and forecasts, and automatically optimizing bidirectional power flow.
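To make the idea of machine decisions with clear, predictable boundaries more concrete, here is a minimal sketch of how an edge node might pick among generation sources on forecast cost while staying inside hard limits. The source names, limits, and figures below are hypothetical illustrations, not Dairyland’s actual system.

```python
# Minimal sketch of an edge-side dispatch decision with hard, predictable bounds.
# All names, sources, and limits here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    forecast_kw: float      # forecast available output for the next interval
    cost_per_kwh: float     # forecast marginal cost

# Hard boundaries the machine decision may never cross.
MAX_BATTERY_DISCHARGE_KW = 500.0
MIN_RESERVE_KW = 200.0

def plan_dispatch(demand_kw: float, sources: list[Source]) -> dict[str, float]:
    """Fill demand from the cheapest forecast sources first, within fixed limits."""
    plan: dict[str, float] = {}
    remaining = demand_kw + MIN_RESERVE_KW   # always keep a reserve margin
    for src in sorted(sources, key=lambda s: s.cost_per_kwh):
        available = src.forecast_kw
        if src.name == "battery":
            available = min(available, MAX_BATTERY_DISCHARGE_KW)
        take = min(available, remaining)
        if take > 0:
            plan[src.name] = take
            remaining -= take
        if remaining <= 0:
            break
    plan["unmet_kw"] = max(remaining, 0.0)    # nonzero means escalate to a human
    return plan

print(plan_dispatch(900.0, [
    Source("solar", 400.0, 0.01),
    Source("wind", 350.0, 0.02),
    Source("battery", 600.0, 0.08),
]))
```

The hard-coded ceilings and the unmet-demand escalation capture the spirit of what Melby describes: the automation can act quickly on local data, but only inside boundaries set in advance.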
Another sector is manufacturing. Akamai’s Blumofe points to how manufacturers could use AI algorithms at the edge to monitor production quality and workplace safety, and make real-time adjustments to production processes. This could also include predictive maintenance and machine self-diagnosis.
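As a rough sketch of what that kind of edge-side monitoring could look like, the following checks each new sensor reading against recent history and flags outliers locally, so only alerts, not the raw stream, need to leave the factory floor. The sensor feed, window size, and threshold are assumptions made for illustration.

```python
# Minimal sketch of edge-side anomaly detection for predictive maintenance.
# The sensor feed, thresholds, and alert action are hypothetical placeholders.
from collections import deque
from statistics import mean, pstdev

WINDOW = 60          # recent readings kept in device memory
Z_THRESHOLD = 3.0    # flag readings more than 3 standard deviations out

history: deque[float] = deque(maxlen=WINDOW)

def check_reading(vibration_mm_s: float) -> bool:
    """Return True if the new reading looks anomalous versus recent history."""
    anomalous = False
    if len(history) >= 10:   # need some history before judging
        mu, sigma = mean(history), pstdev(history)
        if sigma > 0 and abs(vibration_mm_s - mu) / sigma > Z_THRESHOLD:
            anomalous = True
    history.append(vibration_mm_s)
    return anomalous

# Example: a steady signal followed by a spike that gets flagged locally.
for value in [2.1] * 30 + [2.2] * 30 + [9.5]:
    if check_reading(value):
        print(f"Anomaly at {value} mm/s: schedule maintenance / slow the line")
```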
Other specialized circumstances are relevant, such as bringing gen AI to a soldier in theater, says AVOA’s Crawford. Yet, on the whole, he has a more pragmatic take on edge AI, viewing it as more of a specialized use case than an all-encompassing technique. “AI and computing at the edge is still very niche,” he says. He credits this in part to the high costs of training models and the low returns. “There has to be significant value to offset the cost.”
How edge impacts the business
CIOs tend to have a positive outlook on the impact edge AI could have on the business, saying it can result in improved reliability, reduced data transfers, enhanced personalization, and lower risk of data exposure.
One key benefit is bringing reliability to the edge. “Self-healing systems is the key to improve reliability in any technology that needs to optimize its resources, being close to where things happen and far from where the systems are managed,” says Bizagi’s Vázquez. This could be accomplished using AI-driven components for load balancing, fault tolerance, or predictive anomaly detection.
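A simple stand-in for such a self-healing component might look like the sketch below, which steers requests away from edge nodes whose recent error rate crosses a threshold and retries them later. The node names and thresholds are hypothetical, and a production system would use a far richer model than this heuristic.

```python
# Minimal sketch of self-healing routing at the edge: traffic is steered away
# from nodes whose recent error rate crosses a threshold. Names are hypothetical.
import random
from collections import defaultdict

ERROR_THRESHOLD = 0.5   # quarantine a node when half its recent calls fail
WINDOW = 20             # how many recent results to consider per node

class EdgeBalancer:
    def __init__(self, nodes: list[str]):
        self.nodes = nodes
        self.results: dict[str, list[bool]] = defaultdict(list)

    def _error_rate(self, node: str) -> float:
        recent = self.results[node][-WINDOW:]
        return 0.0 if not recent else recent.count(False) / len(recent)

    def pick(self) -> str:
        healthy = [n for n in self.nodes if self._error_rate(n) < ERROR_THRESHOLD]
        # If everything looks unhealthy, fall back to the full pool and retry.
        return random.choice(healthy or self.nodes)

    def record(self, node: str, success: bool) -> None:
        self.results[node].append(success)

balancer = EdgeBalancer(["edge-a", "edge-b", "edge-c"])
for _ in range(100):
    node = balancer.pick()
    ok = node != "edge-b"          # simulate edge-b failing
    balancer.record(node, ok)
print({n: round(balancer._error_rate(n), 2) for n in balancer.nodes})
```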
Beyond traditional protections, autonomous AI at the edge is set to unlock unprecedented responsiveness to real-time operations. “Improved decision intelligence, continuous and seamless streamlined automation, and the advancement toward a digital ecosystem that can ensure future interoperability are huge benefits,” says Melby. In the case of the energy industry in particular, it can help improve power grid maintenance by shifting from reactive to predictive approaches, he says.
In addition to operational benefits, others anticipate elevating the customer experience with more rapid, personalized experiences using AI at the edge. “Today’s users expect instant, intelligent, and insight-rich online experiences,” says Macrometa’s Venkatesh. Yet these increasingly dynamic interactions typically require chains of API requests to distant servers, causing latency, he says. “Well-designed, well-deployed edge AI integrates more capabilities with minimal code changes, allowing businesses to deliver the real-time, interactive experiences users crave.”
While potential benefits abound, expectations must be grounded in reality, because if the business outcomes aren’t there, many use cases will fall by the wayside, says Crawford.
The bar is also high due to the sheer power requirements of AI training and inferencing, which present physical limitations. He points to a recent grid failure in Ireland, which restricted energy-intensive computations like AI processing.
Getting edge AI right
Excitement to implement AI on the edge should be tempered with cautious optimism. Payton, for instance, advises aligning AI strategies with business outcomes and taking a ‘walk, don’t run’ approach. “I’d caution CIOs to implement pilot-test-learn approaches to ensure you fully understand the total cost of ownership, security considerations, and business resiliency plans when implementing AI and edge computing,” she says.
CIOs must also justify the investment and optimize their usage of physical assets. Melby suggests that organizations carefully consider the problem they’re attempting to solve and what the result will be. “In my industry, we’re trying to be more efficient and more resilient, and edge computing with AI will help us in ways we haven’t been able to achieve before,” he says. “That kind of potential is transformative.” He adds that CIOs should carefully plan the location of modular or small data centers at the edge to reap the most value.
Success also hinges on choosing the right model for the application at hand since not every AI application needs the heavy weight of an LLM running on power-hungry, high-end GPUs, says Blumofe. “In many enterprise use cases, a much smaller AI model optimized to run on ordinary CPUs is a much better solution,” he says. “Such optimized models running at the edge can dramatically reduce latency and significantly lower cost.” With the soaring number of LLMs on the market, simply sorting through the available options is a task in itself.
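As one illustration of the CPU-only pattern Blumofe describes, a compact model exported to ONNX can be served at an edge node with ONNX Runtime’s CPU provider. The model file and input shape below are hypothetical placeholders, not a specific product’s setup.

```python
# Minimal sketch of serving a small, CPU-only model at an edge node with
# ONNX Runtime. The model file and input shape are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

# A compact classifier exported to ONNX; no GPU required.
session = ort.InferenceSession("small_edge_model.onnx",
                               providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def infer(features: np.ndarray) -> np.ndarray:
    """Run inference locally; raw data never leaves the device."""
    return session.run(None, {input_name: features.astype(np.float32)})[0]

# Example call with a dummy feature vector shaped for the hypothetical model.
print(infer(np.random.rand(1, 16)))
```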
In addition to making sound judgments, looping in platform-agnostic tools and cutting-edge advancements in cellular technologies will be important for the future of the edge. “Prioritize scalable, decentralized architectures that can handle AI workloads and leverage hybrid cloud solutions that integrate edge, cloud, and on-premise systems seamlessly,” says Payton. “Additionally, technologies like 5G will play a crucial role in supporting faster data transmission, making edge computing even more viable and effective.”
Bracing for a cleaner future
Headwinds are coming from several directions in short order, and CIOs aren’t talking about them enough yet, says Crawford. One is the carbon footprint of AI. For instance, reporting under the EU’s Corporate Sustainability Reporting Directive (CSRD) comes into effect in early 2025 and requires an accurate accounting of a company’s environmental impact.
Such regulations could complicate the rollout of processing-heavy AI initiatives, so in this climate CIOs must be smart and intentional. Crawford encourages them to map out the value chain before diving into AI, to assess the landscape for risky technical debt, and to double down on resiliency.
“Across the board, it’s important to implement best practices that emphasize security, safety, resiliency, transparency, fairness, and accountability to address risks like bias, security vulnerabilities, and ethical concerns,” says Payton.
And while the risks AI poses might give pause, the alternative of not evolving makes inaction a nonstarter. “For CIOs looking to enhance their infrastructure, embracing edge computing with AI isn’t just a trend, it’s a necessity to stay competitive,” says Venkatesh.