Concerns about cost unpredictability and AI readiness are prompting IT leaders to plan major changes to their virtualization strategies.
According to a new survey commissioned by HPE, two-thirds of IT decision-makers are thinking about revamping their virtualization deployments in the next two years, but only 5% say their organizations are ready for the change.
Cost concerns are one driver of what HPE calls the “great virtualization reset.” Tech giant Broadcom’s acquisition of market leader VMware in late 2023 led to licensing cost increases, prompting some organizations to “devirtualize,” although over half of organizations in the survey are still running VMware as their primary virtualization platform.
In addition to the VMware price hikes, IT leaders have noted that cloud costs are running higher than expected this year, according to the survey.
“Their virtualization bill just like quintupled, and they have to invest in AI, they have salary modernization, and now they’re saddled with this bill that they can’t afford,” says Hang Tan, COO for hybrid cloud at HPE. “It’s no secret by now that everybody’s looking for an alternative.”
But an even larger reason for the trend is a focus on AI readiness and questions about virtual machines hosted in the cloud, according to HPE, which offers its own virtualization solutions.
Rethinking VMs and the cloud at the same time
Many IT leaders want new AI-friendly virtualization capabilities, and at the same time, they are rethinking their cloud computing setups as they consider how AI and virtualization fit together.
There’s a growing demand for hybrid cloud environments as IT leaders seek to control AI costs, monitor workloads, and avoid vendor lock-in, Tan says. At the same time, IT leaders are looking for capabilities such as unified backup, cross-platform governance, and integrated observability as they modernize their virtualization and cloud setups, according to the survey.
As price concerns collide with AI modernization efforts, IT leaders are seeing an opportunity to reassess their virtualization stack and start from a clean slate, Tan says.
“There’s a silver lining: Never waste a crisis,” he says. “It’s freeing for the CIO to say, ‘Now I have a chance to rethink my strategy, whereas before I had standardized on one stack and operating model, but I knew I had all these issues I had to resolve.’”
AI drives VM changes
AI upends longstanding virtualization assumptions, says Artur Balabanskyy, founder and CTO at IT services and app development firm Tapforce.
“AI workloads are GPU heavy and unpredictable,” he says. “That breaks the old model of sizing VMs once and letting them run. Companies start questioning whether they need full virtualization at all, or if lighter layers or bare metal make more sense for AI-powered projects.”
While traditional VM hypervisors are CPU centric and assume steady demand, AI is memory hungry, GPU bound, and latency sensitive, he adds. “The abstraction layers add cost and drag, so there are big problems when models hit real scale,” Balabanskyy says. “This is a weird new world for folks whose job it is to plan for future usage.”
Beyond cost issues, virtualization and AI technologies intersect at the need for quality data, says Sune Baastrup, CIO at Danfoss, a heating, cooling, and data center component manufacturing firm. The company has recently reevaluated its virtualization technologies with a focus on data portability and flexibility, he adds.
“We see virtualization as much more than the traditional just getting more out of one physical box of hardware,” he says. “It’s actually preparing for a future where you have much more dynamic use cases, where we’ll see a lot of workloads being moved from either central places out into more decentral locations, or all the way out to the edge.”
As Danfoss embraces AI, the company’s virtualization and data strategies need to keep up, Baastrup adds. “If there’s one thing that you know is in focus to get value out of AI, it’s the access to quality data,” he says.
Limiting factors
Meanwhile, IT leaders see several obstacles to making a change, according to the survey. More than a quarter of those surveyed say budget constraints limit their ability to reimagine virtualization strategies, and nearly a quarter also cite technical complexity, migration risks, and skills gaps as significant barriers.
Organizational readiness is also a potential barrier, Danfoss’s Baastrup says. Part of the job of the CIO is to convince the IT team that they should change vendors or technologies, he adds.
“What each CIO needs to come down to is, ‘How do I get my organization ready to consume technology that is already available today and how do I find a partner that can take those steps with me when those breakthrough AI use cases become relevant?’” he says.
Budget, skills, and fear of change can limit efforts to shift away from virtualization, adds Tapforce’s Balabanskyy. “Large-scale changes are expensive and disruptive, even when they make sense on paper,” he says. “Teams know how to run what they have, and retraining or replacing that knowledge carries real risk. Price and stability still win most internal arguments.”
Take it slow
With several obstacles to making virtualization changes, HPE recommends that IT leaders take a measured approach. Organizations don’t have to lift and shift all workloads to a new provider all at once, says HPE’s Tan.
“There’s the knee-jerk reaction of, ‘My costs are going up, and I need to move off ASAP,’” he says. “But we tell our customers, this is an opportunity for you to rethink your strategy, and you should be thoughtful about it. Our advice has actually been, ‘Don’t be in a rush.’”
CIOs should also think holistically about how new virtualization solutions fit into their overall IT environment, adds Scott Steele, COO at IT services provider Thrive. Virtualization is one piece of a larger IT transformation discussion that should be focused on AI and other future needs, he says.
“There are new capabilities and new vendors coming out with either AI capabilities or new infrastructure that supports it,” he says. “It’s a time period where you should be looking at your entire infrastructure to make sure that you’re set up for where everything’s going, not where it’s been.”

