Deconstructing the data center: A massive (and massively liberating) project

A few years back, Bhaskar Ramachandran read the tea leaves, and what he saw was clear: With all the enhancements hyperscalers continuously make, there was no longer any value in having on-premises data centers.

“There is just no way for a private company to match that,” says Ramachandran, global vice president and CIO of paints and coatings manufacturer PPG. “This is their business, and they’re really good at it, and it was clear that the size of the hyperscalers is just going to win over the infrastructure game. So it didn’t make sense for us to keep up with the infrastructure.”

PPG began dismantling its eight global data centers about four years ago, with the final one completed in November 2025. For a 143-year-old company that has gone through 60-some acquisitions, that was no small feat.

Applications and infrastructure had become a lot to manage, especially combined with trying to maintain a strong cybersecurity posture and compliance. “You can’t consistently manage this sort of a footprint, and it becomes really unwieldy very quickly,” Ramachandran says.

Decommissioning a data center is like defusing a complex bomb. Every wire, sequence, and step must be handled with care, because one wrong move can cost your organization in downtime, data breaches, or a hit to its bottom line.

“The decommissioning of data centers is underestimated in terms of complexity, financial risks, reputation loss, and data exposure,” according to Gartner. The firm estimates that by 2030, twice as many enterprise data centers will have been decommissioned compared to those built. Reasons include consolidations, obsolescence, and shifting workloads to cloud and colocation services.

The inadvertent data center

In some instances, data centers have cropped up without much forethought. “Most organizations I work with didn’t build a data center intentionally — they grew into one,” says Aaron Walker, CEO of IT consultancy Overbyte, and a former associate partner at IBM Consulting. “A rack in a closet became a row in a repurposed room, and suddenly, you have a facility that was never designed for the job holding years of infrastructure decisions.”

Deconstructing that environment is work that often gets overlooked, says Walker.

He recently consulted with a large, fully remote online school in the throes of this process. The deconstruction work began with a full audit of what systems existed. From there, every workload was categorized to determine what would be migrated, what would move to cloud-native infrastructure, and what would be retired entirely, he says.
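
The audit-then-categorize step Walker describes can be sketched as a simple rule-based triage. The workload names, fields, and decision rules below are illustrative assumptions, not Walker's actual methodology:

```python
# Hypothetical workload triage sketch; fields and rules are assumptions
# for illustration, not a real consultancy's classification scheme.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_critical: bool   # still actively used by the organization?
    cloud_ready: bool         # runs unmodified on cloud VMs or containers?

def categorize(w: Workload) -> str:
    """Assign each audited workload one of three dispositions."""
    if not w.business_critical:
        return "retire"       # no longer used: decommission outright
    if w.cloud_ready:
        return "migrate"      # lift and shift as-is
    return "refactor"         # rebuild on cloud-native infrastructure

inventory = [
    Workload("legacy-reporting", business_critical=False, cloud_ready=False),
    Workload("student-portal", business_critical=True, cloud_ready=True),
    Workload("grading-batch", business_critical=True, cloud_ready=False),
]

plan = {w.name: categorize(w) for w in inventory}
print(plan)
```

In practice each rule would draw on the audit data (usage metrics, OS versions, license terms), but even a coarse first pass like this forces every system into an explicit bucket rather than an assumption.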

Then came the physical side: decommissioning hardware and deciding what equipment had residual value and what to recycle. 

“The timeline pressures are real,” Walker says. “You can’t just power things down. Dependencies surface that nobody documented.”

The IT organizational side had its own challenges. “People have years of institutional knowledge tied to physical systems, and there’s genuine anxiety about dismantling something they built and maintained,” he says. 

Walker’s team also ran into issues trying to upgrade systems during the migration, which is generally a mistake, he says. “A data center deconstruction is already a significant change event, and layering additional upgrades on top of it introduces unnecessary risk. In most cases, it is better to separate modernization from migration.”

From start to finish, the deconstruction ran about a year, but timing will vary from project to project, he says.

Less hassle, more flexibility

When the time came for digital marketing agency Helium SEO to consider what to do with its data center, CTO Paul DeMott says the math was simple. “We were paying $12,000 a month toward the colocation fees, hardware support, and the maintenance cost for the physical servers sitting in racks. Cloud infrastructure promised better reliability, automatic scaling, and way less hassle once we were done moving everything.”

The most compelling reason to rid itself of a physical footprint, though, was flexibility. Physical servers meant capacity planning six months ahead, DeMott says, and if the company needed more resources, IT had to wait weeks for hardware to arrive and be installed.

“Cloud allows resources to be spun up in minutes and shut down at the same speed,” he says. “We went from buying expensive hardware that depreciated to purchasing what we are actually using.”

IT began by creating a list of all the apps running on physical servers and classifying them according to how difficult it would be to move them. “Simple web apps moved first as they barely needed changes,” DeMott says. “Databases and anything which stores data — that’s a little bit later because we’d have had to plan the migration well.”

Some older apps had to be changed to work on the cloud, he adds. The actual move took place over six months, and IT decommissioned the data center while deploying apps to the cloud in tandem, moving the services step by step with backup plans for each one.

Still, the process wasn’t seamless. “Transferring 15TB of data to the cloud takes 72 hours on our internet connection, and that was the biggest problem,” DeMott notes. IT ended up using AWS Snowball, a physical data transfer appliance, because uploading everything over the network took staff weeks, “and [it] ruined the performance in our network.”
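
The underlying arithmetic explains why bulk transfers over a network link hit a wall, and when a physical appliance wins. The link speeds and efficiency factor below are illustrative assumptions, not Helium SEO's actual connection figures:

```python
# Back-of-envelope transfer-time math; bandwidth figures are assumed
# for illustration, not the company's real link speeds.

def transfer_hours(terabytes: float, link_mbps: float,
                   efficiency: float = 0.8) -> float:
    """Hours to move `terabytes` over a link, assuming only `efficiency`
    of nominal bandwidth is usable (protocol overhead, contention)."""
    bits = terabytes * 1e12 * 8          # decimal TB -> bits
    usable_bps = link_mbps * 1e6 * efficiency
    return bits / usable_bps / 3600

# 15 TB over a dedicated 500 Mbps link at 80% efficiency:
print(round(transfer_hours(15, 500), 1))   # ≈ 83.3 hours
# The same 15 TB when other traffic leaves only 100 Mbps usable:
print(round(transfer_hours(15, 100), 1))   # ≈ 416.7 hours (~17 days)
```

The second case shows the trap DeMott describes: saturating a shared office link either takes weeks or degrades everyone else's traffic, which is the point where shipping data on a physical device becomes the faster option.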

Another issue was figuring out the cloud costs, which DeMott characterizes as “brutal. Different types of servers, storage, and data transfer costs made it almost impossible to budget,” he says. “Our first month’s bill came in at 40% more than we estimated because we forgot about charges for moving data out of the cloud.”

It took IT three months of “fumbling” to get costs below what the company paid for the data center before things stabilized.
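
The budgeting trap DeMott hit can be made concrete with a toy bill model. The unit rates and usage figures below are made-up placeholders, not any provider's actual pricing or Helium SEO's real numbers:

```python
# Illustrative cloud-bill sketch; all rates and volumes are invented
# placeholders, not real provider pricing.
RATES = {
    "vm_hour": 0.10,          # $ per instance-hour
    "storage_gb_month": 0.02, # $ per GB-month stored
    "egress_gb": 0.09,        # $ per GB moving *out* of the cloud
}

def monthly_estimate(vm_hours: float, storage_gb: float,
                     egress_gb: float) -> float:
    return (vm_hours * RATES["vm_hour"]
            + storage_gb * RATES["storage_gb_month"]
            + egress_gb * RATES["egress_gb"])

# A "naive" budget that forgets egress vs. one that includes it:
naive = monthly_estimate(vm_hours=5000, storage_gb=15000, egress_gb=0)
actual = monthly_estimate(vm_hours=5000, storage_gb=15000, egress_gb=4000)
print(round(naive, 2), round(actual, 2),
      f"{(actual - naive) / naive:.0%} over budget")
```

Egress is the line item most often omitted from first estimates because it has no on-premises equivalent; in this toy example it alone pushes the bill 45% past the naive forecast.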

The power of ‘cloud only’

Once PPG made the decision to dismantle its data centers and move everything to the cloud, it was time to spread the word internally. “When you say, ‘cloud only,’ it makes it much easier for you to have conversations,” Ramachandran says. “It just sets the entire organization up on a single mission … just those two words make it very, very clear to everybody in the company what that means. There is no room for interpretation.”

The news was revealed at a global town hall, and initially, Ramachandran says, the sentiment was, “this too shall pass.” Then people decided to get on board.

There were the typical organizational change management issues to deal with. Building momentum takes time, but once the first data center was shut down, people came to the realization that “Okay, we are actually doing this,” Ramachandran says. “Then there was no resistance … everybody got on board, and things started to accelerate.”

Officials ensured that all the training IT needed was made available, and the company paid for everything, certifications included. “We recognized, in town halls, anybody that went through this training and got the certification. We celebrated people. We promoted people that did the things we wanted them to do,” he says. All of this helped reinforce the mission.

“For the most part, business users didn’t care; their apps were available, and they didn’t care where they were,” Ramachandran says, although there were a couple of exceptions among more technically savvy employees who were concerned about workflow and the security implications of cloud. There was a perception among some that a data center was more secure.

That led to looking at publicly available information on all the cybersecurity incidents in the recent past. The research indicated a clear pattern, he says.

“And the pattern is: The more significant cybersecurity events were actually happening to companies” with largely on-prem environments, Ramachandran observes. “So you came to this point where the cloud actually became a lot more secure than on-prem infrastructure.”

There are several reasons why, he maintains, including that, relatively speaking, it is a lot easier to implement security policies consistently in the cloud because “you have a single pane of glass enforcement of policies that you don’t have in an on-prem environment.”

This makes managing your attack surface area more straightforward, Ramachandran says. “So you put all of this together, you package it up on the presentation, and talk to those people one on one, and then say, ‘This is why.’”

The dismantling process

PPG works with a single hyperscaler for its business in China and three others elsewhere. Deciding which apps went where was largely a function of the technology and which hyperscaler “lends itself to that brand of technology versus the other.” In some instances, where the choice wasn’t clear, IT made the call.

Step one was deciding on an approach, and PPG opted not to modernize its apps at the same time as the deconstruction work. “When you pull together the business case to modernize applications, we came to a conclusion that if we do modernization on the application layer and the infrastructure layer at the same time, I would probably be retired by the time we migrated the data center,” Ramachandran says.

That made it easy to decide when to do a lift and shift and when to not bother migrating certain applications, he says. Then IT could focus on other business priorities to modernize the workforce.

“We just adjusted our roadmap to say the new [app] would go straight into the cloud” while not bothering to move older workloads, Ramachandran says.

The human element

The next step was “finding the people that are hungry to do something new and probably have a bit of experience and … they are waiting for someone to say, ‘Hey, let’s do this,’” Ramachandran says of the data center deconstruction. “They are forward thinkers. Every organization in our scale has [them]. It’s identifying those people and then … empowering them. They became the leaders in the new infrastructure.”

Once the migration started, it was important to celebrate the wins. That got more people interested in being part of the new organization PPG was forming, called the Cloud COE (center of excellence).

The biggest mistake companies make is treating deconstruction as a single project instead of a phased operational shift, says Roland Parker, founder and CEO of Impress Computers, a managed IT services and cybersecurity firm in Houston.

“We walked one 200-person manufacturer through moving workloads in priority tiers — production-critical systems last, not first — which kept their floor running while we systematically eliminated physical infrastructure over 14 months,” he says.
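
Parker's tiered approach, with production-critical systems moving last, amounts to grouping workloads into migration waves by criticality. The systems and tier assignments below are hypothetical, not the actual manufacturer's inventory:

```python
# Sketch of wave planning by criticality, following the
# "production-critical systems last" rule; all names and tiers
# are invented examples.
workloads = [
    ("file-shares", 1),        # tier 1 = lowest criticality, moves first
    ("email-archive", 1),
    ("erp", 3),                # tier 3 = production-critical, moves last
    ("test-environments", 1),
    ("mes-shop-floor", 3),
    ("crm", 2),
]

# Group into waves, lowest-criticality tiers first, so early mistakes
# land on systems that cannot stop the production floor.
waves: dict[int, list[str]] = {}
for name, tier in sorted(workloads, key=lambda w: w[1]):
    waves.setdefault(tier, []).append(name)

for tier in sorted(waves):
    print(f"Wave {tier}: {waves[tier]}")
```

Sequencing this way means the team has rehearsed the migration playbook twice on low-stakes systems before anything production-critical moves.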

However, it’s “the human side [that] kills more timelines than the tech does,” Parker observes. “Field supervisors and plant managers have work-arounds built around how legacy systems behave.” So, before touching a single rack, Parker’s team audits those informal processes, “because if you don’t, you migrate the infrastructure and orphan the people who actually use it.”

Overbyte’s Walker agrees, saying that almost all the snafus his team ran into during the online school deconstruction project were not technical, but came down to visibility. “At some point, you have to confront unknown systems; things with incomplete or outdated documentation,” he says. “We had moments where, after beginning to deprovision systems, stakeholders surfaced saying, ‘Wait, that’s still in use.’”

Dismantling systems is not the end

PPG experienced no disruptions during the dismantling process, Ramachandran says, other than some tactical delays and contracts that needed updating.

“There were some learnings on the network side because networking can get complex,” he says. “Sometimes, we extended the outage windows” to up to five hours, for example. “Those were the hiccups.”

From start to finish, the decommissioning process of all eight data centers took about three years. “The end is not migrating all the workloads. The end is actually shutting down the data center,” Ramachandran stresses. This requires deconstructing the power, the cooling, fire systems, and multiple generators used for backup, which had to be removed by helicopter.

“You have to take the diesel fuel out and dispose of it and sell it. We have to get recertification of the building for safety, because this is a building where you had kilowatts of power coming in, which basically [also] went through a deconstruction process,” he says. “So you have to get a safety certification … all of this takes time because we have to give the building back to the building management the way they gave it to us.”

What data center deconstruction buys you

The painstaking data center deconstruction process has given Ramachandran valuable insight. “Make sure your best people spend time creating value for the business, as opposed to babysitting infrastructure,” he says, because infrastructure no longer adds value.

“You also do a lot of inherent risk management by getting rid of data centers and moving to a cloud environment you don’t have to worry about,” he adds. Noting the current state of the economy, Ramachandran says coping with sudden price increases for memory and chips is no longer stressful since they aren’t buying infrastructure.

“You’re basically giving back working capital to the company, because you’re moving the organization from a fixed capital environment to a variable cost model completely,” he says, “and you don’t have to refresh your hardware every four or five years.”

Cost was never the objective for the data center deconstruction, Ramachandran notes. “Nonetheless, when we did the business case, we said it’s not going to cost us any more or any less, but will buy us better security, better flexibility, better agility for the organization,” as well as better focus and technology. “And we achieved all of those.”

The value is in all those other areas. “We are not data center operators. The team is now focused on delivering applications that are meaningful to the business,” Ramachandran says. “The team is much closer than ever to the business because we are not talking infrastructure but how to make the business better.”

Walker says companies should measure twice, cut once. “Most teams want to jump straight into migration,” he says, “but the real work is building a complete inventory and mapping dependencies upfront.”
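
The dependency mapping Walker recommends can be expressed as a graph problem: record which systems depend on which, then derive an order in which nothing is switched off while something still needs it. The system names below are hypothetical; the ordering logic uses Python's standard-library `graphlib`:

```python
# Dependency-mapping sketch: build a graph of which systems depend on
# which, then derive a safe shutdown order. System names are invented
# examples, not a real inventory.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each system maps to the set of systems it depends on.
depends_on = {
    "reporting": {"database"},
    "web-frontend": {"api"},
    "api": {"database", "auth"},
    "auth": set(),
    "database": set(),
}

# static_order() yields dependencies before their dependents (a valid
# startup order); reversing it gives a shutdown order in which no
# system is powered off while another still relies on it.
startup_order = list(TopologicalSorter(depends_on).static_order())
shutdown_order = list(reversed(startup_order))
print(shutdown_order)
```

The hard part in a real decommissioning is not this ordering step but populating the graph: the undocumented dependencies Walker mentions only surface when the inventory work is done up front.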

While it made sense for PPG to modernize some apps at the same time as the data center deconstruction work, Walker advises IT leaders to resist the urge to do everything at once. “Focus on moving what you understand first, and isolate the unknowns early,” he says. “The success of these projects is usually determined by how well you handle the edge cases, not the easy wins.”

Any new technological development IT can make without interrupting operations dramatically reduces time to market, Ramachandran says.

Working on the latest technologies makes IT happy, and that helps with talent retention, he adds, “because we can say we’re cloud only, so this 143-year-old company looks modern. That is meaningful in so many ways.”


April 28, 2026
