By George Trujillo, Principal Data Strategist, DataStax
Think about your favorite recipe. You might have all the ingredients for an apple pie, but there’s no guarantee all the elements will come together to produce a delicious dessert. Similarly, many organizations have built data architectures to remain competitive, but have instead ended up with a complex web of disparate systems that may be slowing them down.
In an earlier article, I discussed three proven ingredients for a holistic data platform approach to managing and harnessing data – cloud-native technologies, real-time data, and open source software (OSS) – to drive business value. Here, I’ll dive into the recipe for bringing these elements together to help enterprises take full advantage of the real-time data that’s critical to being a competitive business.
The challenge of data silos
Think of how frustrated you get when you have to wait 15 seconds for a response from a web browser. Then imagine how business users, analysts, and data scientists feel when they have to wait weeks or even months for the new datasets they’ve requested. This is a reality faced by many organizations that have cobbled together an array of siloed data management technologies.
It isn’t uncommon for an organization to operate as many as five messaging systems and a different database technology for every day of the week. Strategies intended to solve specific problems have in many cases created technology stacks resembling the Tower of Babel.
Too often strategy focuses on success within the confines of a team. Teams that take a myopic view of cloud, analytics, database, and streaming technologies might create some measurable success, but viewed holistically their impact is limited. Even organizations that understand the importance of a cohesive data strategy can find it exceedingly difficult to execute without getting bogged down by cross-functional team barriers and business friction that slow time to delivery.
Aligning data
A real-time data architecture should be designed with a set of aligned data streams that flow easily throughout the data ecosystem. An enterprise data management strategy has to align applications, streams, and databases to create a unified real-time data platform. Data has to keep getting easier to work with to enable creativity and innovation.
As Einstein may or may not have said: “Insanity is doing the same thing over and over and expecting different results.” Likewise, data challenges must be addressed at a strategic level, not just at the project, use case, or line of business (LOB) level. Otherwise enterprises are doomed to keep repeating the same mistakes. By creating flexible and adaptable data architectures and ecosystems, organizations can drive business value.
The real-time data platform is the heart of an organization’s data ecosystem. Like a heart, the real-time platform pumps data streams into the enterprise data ecosystem. And just as a human brain suffers from insufficient blood flow, a poor flow of data streams impacts real-time decision-making, machine learning, and AI. A strong real-time platform makes the entire data ecosystem healthier.
As I detailed in my previous article, the three keys to success for a data-driven business are cloud-native technologies, real-time data, and OSS. These converge to create an optimum data management strategy (see the figure below).
Using OSS helps enterprises avoid vendor lock-in, manage unit cost economics, and boost innovation. When organizations consider the cloud, they see the potential for innovation, transformation, new capabilities, market disruption with new services, data democratization, and self-service. This presents the opportunity for a new look at which technology stack is the right one to drive the business forward.
It’s important to consider the alignment of applications, streaming (messaging and queuing) technologies, and databases. Data streams from applications, external sources, and databases often need to be correlated, aggregated, and refined downstream. LOBs should be empowered with easy access to data streams. Leveraging data in these streams is easier when all three of the core pieces of the data ecosystem work together. Let’s look at how to do this.
A unified real-time data platform
Kubernetes, the open source container orchestration system that automates software deployment, scaling, and management, is a key part of enabling this. It is the glue that allows applications to easily scale and expand across different environments.
Data needs to move easily with applications. Aligning Kubernetes with streaming technologies (such as Apache Kafka or Apache Pulsar) increases the speed of delivering new applications and machine learning models.
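As a rough illustration of what that alignment looks like in code, the sketch below shows an application service of the kind that would typically run as a Kubernetes deployment publishing events to a Pulsar topic with the pulsar-client Python library. The service URL, topic name, and event fields are placeholders for illustration, not part of any reference architecture described here.

```python
import json

import pulsar  # pip install pulsar-client

# Placeholder in-cluster address and topic; a real service would read these
# from configuration injected by Kubernetes (env vars, ConfigMaps, etc.).
SERVICE_URL = "pulsar://pulsar-broker.pulsar.svc.cluster.local:6650"
TOPIC = "persistent://public/default/orders"

client = pulsar.Client(SERVICE_URL)
producer = client.create_producer(TOPIC)

# Publish one application event as JSON-encoded bytes.
event = {"order_id": "1001", "customer_id": "42", "total": 99.95}
producer.send(json.dumps(event).encode("utf-8"))

producer.close()
client.close()
```

Because the same client code works whether the brokers run on-premises or in any cloud, the application can move with its data as the platform scales.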
Real-time business needs are transforming databases into sources of streaming data, to be processed on demand. Having data flow from a database to a data warehouse or cloud storage then back into memory for real-time decision-making takes too long. Databases must ingest and generate streams that work with applications and external streaming data easily, with low unit costs, and at scale.
Pulsar and Apache Cassandra®, the NoSQL, high-throughput, open source database, are excellent examples of the role OSS can play in a unified data architecture. Pulsar and Cassandra are highly scalable and have built-in capabilities to enable data to move easily across private, hybrid, and multi-cloud environments — and the applications that operate in them. Kubernetes, Pulsar, and Cassandra can align as a platform to enable applications and data to work together, as shown in the diagram below.
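To make the alignment between the streaming layer and the database concrete, here is a minimal sketch, assuming JSON-encoded events and a hypothetical orders_by_customer table, of a consumer that reads the Pulsar topic above and writes each event into Cassandra using the pulsar-client and cassandra-driver Python libraries. In practice, Pulsar’s built-in connector framework (Pulsar IO) can provide this kind of Cassandra sink without hand-written code; the sketch simply makes the data flow explicit.

```python
import json

import pulsar  # pip install pulsar-client
from cassandra.cluster import Cluster  # pip install cassandra-driver

# Placeholder endpoints, keyspace, and table; real deployments would use the
# service addresses exposed inside the Kubernetes cluster and their own schema.
pulsar_client = pulsar.Client("pulsar://pulsar-broker.pulsar.svc.cluster.local:6650")
consumer = pulsar_client.subscribe(
    "persistent://public/default/orders",
    subscription_name="orders-to-cassandra",
)

session = Cluster(["cassandra.cassandra.svc.cluster.local"]).connect("retail")
insert = session.prepare(
    "INSERT INTO orders_by_customer (customer_id, order_id, total) VALUES (?, ?, ?)"
)

while True:
    msg = consumer.receive()        # block until the next event arrives
    event = json.loads(msg.data())  # events are assumed to be JSON-encoded
    session.execute(insert, (event["customer_id"], event["order_id"], event["total"]))
    consumer.acknowledge(msg)       # acknowledge only after the write succeeds
```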
This helps organizations accelerate or decelerate their move to a hybrid or multi-cloud strategy. Complexity and cross-team barriers are broken down when data streams from applications, external sources, and databases can easily flow together across on-premises, cloud, and multi-cloud environments. There is complete freedom of choice to run Kubernetes, Pulsar, and Cassandra on-premises or across multiple clouds.
When these components work together, they can enable a focus on digital transformation:
- According to Gartner, cloud-native platforms will serve as the foundation for more than 95% of new digital initiatives by 2025 – up from less than 40% in 2021.
- As McKinsey reports in Building a Great Data Platform, a state-of-the-art data and analytics platform is no longer an option but a necessity for larger enterprises: it acts as a central repository for all data, distills it into a single source of truth, and supports the scaling up of robust digital and advanced-analytics programs that translate data into business value.
Digital transformation is high on every organization’s agenda to accelerate business innovation and increase customer satisfaction. This requires aligning the organization to a common vision that creates business value. A data operating model helps enable business value as the data ecosystem evolves, but it also has to reduce the complexity that’s so common in today’s enterprise data ecosystems. Leveraging the execution patterns of cloud-native technologies, real-time data, and OSS supports consistency across the organization for the data operating model. Simply put, for businesses to move faster, data has to be easier to work with — as easy as apple pie.
Learn more about DataStax here.
About George Trujillo:
George Trujillo is principal data strategist at DataStax. Previously, he built high-performance teams for data-value driven initiatives at organizations including Charles Schwab, Overstock, and VMware. George works with CDOs and data executives on the continual evolution of real-time data strategies for their enterprise data ecosystem.