Data-driven decision making (DDDM) is just what it sounds like. DDDM uses facts, metrics, and data to guide strategic business decisions that align with goals, objectives, and priorities. If organizations can realize the full value of their data, everyone is empowered to make better decisions.
When you look at why IT analytics and DDDM exist, it’s because executives often make decisions based on a hunch. Sometimes their hunches are correct; how often depends on the executive and the context.
For example, Fred Smith has an insight into the transport business and, despite widespread skepticism, creates Federal Express. Michael Eisner hears a pitch for an offbeat game show and, based on his gut, commits millions to developing Who Wants to Be a Millionaire?
But gut instinct is not how we want to consistently operate an enterprise. Data is a much more reliable foundation for decision making.
Accurate data is fresh, timely data.
In IT, if data is even a week old, let alone three months old, you’d be better off licking your finger and holding it up to the wind than basing a decision on it. Data from 90 days ago doesn’t tell you where your applications are, where your workloads are, or where your customers are. And it tells you nothing about potential risks from cyberattacks.
An awful lot of IT analytics is garbage in, garbage out, because old data, by the time it’s used, has become false data. You can perform intelligent analytics on it, but the conclusion will be no better than the false data you started with.
For example, imagine a hospital that hasn’t updated its configuration management database (CMDB) in 90 days. That’s like flying an airplane on 90-day-old instrument data. And even that understates the problem.
Pilots don’t have to worry about a new mountain or a new skyscraper popping up every couple of weeks. But in IT, the equivalent of a new mountain can emerge in hours or days.
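To make the staleness problem concrete, here is a minimal sketch, assuming the CMDB can export configuration items with a last-refreshed timestamp; the record structure and field names are hypothetical, not from any particular product.

from datetime import datetime, timedelta, timezone

# Hypothetical CMDB export: each configuration item carries the time its
# data was last refreshed. Field names and values are illustrative only.
cmdb_records = [
    {"ci": "app-server-01", "last_updated": "2024-01-05T08:00:00+00:00"},
    {"ci": "db-server-07",  "last_updated": "2024-03-28T14:30:00+00:00"},
]

def stale_items(records, max_age_days=7):
    """Return configuration items whose data is older than max_age_days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return [
        r["ci"] for r in records
        if datetime.fromisoformat(r["last_updated"]) < cutoff
    ]

print(stale_items(cmdb_records))  # items too stale to base a decision on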
What types of decisions are informed by accurate endpoint data?
There’s a hierarchy of data-driven decisions in operations, security, and compliance, but let’s start with one that’s operational. In most organizations, IT refreshes software based on a vendor alert or a spike in help desk complaints.
The vendor usually alerts its customers that it’s time to update because of a recently discovered vulnerability, and it’s often a “panic” alert. But it’s the rare IT team that has enough personnel to update every piece of software that needs it, and even with the staff it wouldn’t be wise: any change carries the risk of downstream instability. So the decision to update or not becomes subjective, not data-driven.
But with the right tool, you can know, second by second, about every application crash that happens across all your enterprise applications. Having that data in real time means IT can say, “The vulnerability hasn’t made the top ten complaints this week, but we know this application is crashing and we’ll fix it centrally.” Knowing second by second what’s crashing, what’s degrading CPU performance, and what’s blue-screening lets IT make a decision that’s also a business decision.
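As a rough illustration, the sketch below assumes a hypothetical real-time crash feed from an endpoint telemetry tool and simply ranks applications by crash count; the event structure and application names are invented for the example.

from collections import Counter

# Hypothetical real-time crash feed: each event names the application that
# crashed and the endpoint it crashed on. In practice this would stream
# from an endpoint management or telemetry platform.
crash_events = [
    {"app": "ExpenseTool", "host": "laptop-114"},
    {"app": "ExpenseTool", "host": "laptop-301"},
    {"app": "CRMClient",   "host": "desktop-022"},
    {"app": "ExpenseTool", "host": "laptop-114"},
]

def top_crashing_apps(events, n=3):
    """Rank applications by crash count so IT can fix the worst offenders
    centrally, even if they never surface in help desk complaints."""
    counts = Counter(e["app"] for e in events)
    return counts.most_common(n)

print(top_crashing_apps(crash_events))
# [('ExpenseTool', 3), ('CRMClient', 1)]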
In some situations, seconds-old data matters, especially with a distributed workforce. You want to be able to see instantly the vulnerabilities of every endpoint. There may be too many to fix, but when you know where they are and how critical they are, you can make informed decisions about which ones to address first. For example, there may be mitigations in place on the corporate network, but users at home are in the “Wild West,” potentially exposed to every attack.
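One way to turn that visibility into a remediation order is sketched below, assuming a hypothetical inventory of findings with a CVSS-style severity score and a flag for whether the endpoint sits behind corporate network mitigations; the data and field names are illustrative.

# Hypothetical endpoint vulnerability inventory. Severity is a CVSS-style
# score; on_corp_network indicates whether corporate mitigations apply.
endpoints = [
    {"host": "laptop-114",  "cve": "CVE-2024-0001", "severity": 9.8, "on_corp_network": False},
    {"host": "desktop-022", "cve": "CVE-2024-0001", "severity": 9.8, "on_corp_network": True},
    {"host": "laptop-301",  "cve": "CVE-2023-4567", "severity": 5.3, "on_corp_network": False},
]

def remediation_order(findings):
    """Sort findings so severe vulnerabilities on unprotected, off-network
    endpoints (the 'Wild West') come first."""
    return sorted(findings, key=lambda f: (f["on_corp_network"], -f["severity"]))

for f in remediation_order(endpoints):
    location = "off-net" if not f["on_corp_network"] else "on-net"
    print(f["host"], f["cve"], f["severity"], location)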
Just what is “fresh data” from an IT perspective?
The importance of data freshness is not uniform across IT operations. For example, if hardware is on a slow refresh cycle, such as two or three years, it doesn’t matter much if CPU or hard drive model data is a month old. But if you’re making decisions about retiring servers or migrating workloads from a physical to a virtual environment, data that is days old will very likely cause problems. You could be retiring a server a business unit depends on or moving workloads that support a critical service.
With up-to-the-second — or at least up-to-the-hour — data, you’re in a much better position to act.
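A simple way to encode that distinction is a freshness policy keyed by the type of decision. The sketch below is illustrative only; the decision categories and thresholds are assumptions, not prescriptions from any particular tool.

from datetime import timedelta

# Illustrative freshness policy: how old data may be before it should no
# longer drive a given class of decision. Thresholds are assumptions.
FRESHNESS_POLICY = {
    "hardware_model":     timedelta(days=30),    # slow refresh cycle, a month is fine
    "server_retirement":  timedelta(hours=24),   # days-old data risks killing a live dependency
    "workload_migration": timedelta(hours=1),    # up-to-the-hour at minimum
    "vulnerability_scan": timedelta(seconds=60), # seconds matter for exposed endpoints
}

def fresh_enough(decision_type, data_age):
    """Return True if data of the given age may still inform this decision."""
    return data_age <= FRESHNESS_POLICY[decision_type]

print(fresh_enough("hardware_model", timedelta(days=20)))     # True
print(fresh_enough("workload_migration", timedelta(days=2)))  # False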
IT analytics and digital transformation.
Digital transformation is like Zero Trust: it means different things to different people. Ask 10 engineers, and you’ll get 12 different answers. One aspect of digital transformation is the mobility and centralization of data. Decoupling the data from the service allows organizations to switch application and service providers.
But if you look at where digital transformation efforts have “gone south,” it’s often a matter of process and of not knowing what servers and endpoints are communicating with to deliver a business service. This is where the timeliness of data and digital transformation intersect.
For example, if you have the ability to crawl through every .txt file, PDF, Word doc, and Excel spreadsheet on a laptop to find something that shouldn’t be there but should be stored centrally, it’s much easier to switch your central storage provider.
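A minimal sketch of that kind of crawl might look like the following, assuming plain-text files and simple keyword matching; parsing PDF, Word, and Excel content would need additional libraries, and the keywords here are placeholders for whatever detection rules an organization actually uses.

from pathlib import Path

# Crawl a laptop's home directory for plain-text files containing material
# that should live in central storage. Keyword matching is a stand-in for
# real content-detection rules; .txt is the only format handled here.
KEYWORDS = {"confidential", "customer list", "payroll"}

def files_to_relocate(root):
    hits = []
    for path in Path(root).rglob("*.txt"):
        try:
            text = path.read_text(errors="ignore").lower()
        except OSError:
            continue  # unreadable file; skip it
        if any(k in text for k in KEYWORDS):
            hits.append(path)
    return hits

for p in files_to_relocate(Path.home()):
    print(p)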
Accurate data removes the risk from that decision. That’s how fresh data increases agility. If it takes months of effort to move from on-premises to a hosted system, or from one host to another, the friction and cost of transfer are so high you won’t do it. With more agility and less friction, digital transformation efforts become a buyer’s market.
Learn how to make better business decisions with accurate, complete and up-to-date data about all endpoints — wherever they are.