The cloud computing revolution brought with it many innovations, but also lessons about the pitfalls of rapidly adopting new technologies without a well-planned strategy. Today, IT leaders are at a similar crossroads with the rise of generative AI. As organizations build their AI factories in this new era, IT leaders have an opportunity to learn from the cloud-first missteps of the past and build strategically, prioritizing security, governance, and cost efficiency over the long term and avoiding errors that would need to be corrected down the line.
The cloud retrospective: A learning curve
Cloud computing’s rise a decade ago ushered in Shadow IT, where teams and even individual employees swiped a credit card to gain instant access to vast compute and storage resources. Over time, many organizations found themselves grappling with issues concerning costs, security, and governance that had them rethinking the underlying model. Organizations today risk falling into a similar scenario known as Shadow AI, where teams turn to public clouds or API service providers in their rush to build or adopt AI solutions. If these initiatives are not properly overseen, the expenses associated with running GenAI services in the public cloud could rapidly become excessive and create a host of problems down the line.
Applying Shadow IT’s lessons to generative AI
As organizations build their AI strategies, the lessons of the cloud era can prove invaluable. Here are some to consider:
Build for cost efficiencies now
Well before today’s AI acceleration, a shift was already underway as organizations began to rethink the cloud operating model. A 2021 Andreessen Horowitz article estimated a $100 billion market value gap across the top 50 cloud-invested public software companies, a gap it attributed to the impact of public cloud spend.
GenAI offers great potential—McKinsey estimates generative AI could add the equivalent of $2.6 trillion to $4.4 trillion annually[1]—but it also comes at a cost. This time, organizations can build from the ground up with efficiencies in mind, unlike the early days of cloud adoption, when long-term costs were often overlooked.
A recent study by Enterprise Strategy Group found that running an open-source large language model (LLM) on premises with retrieval-augmented generation (RAG) was 38% to 75% more cost-effective than running it in the public cloud, and as much as 88% more cost-effective than an API-based approach[2]. Armed with this knowledge, leaders can build from the ground up for long-term success rather than chasing short-term wins that require course correction.
Prioritize an “on-prem first” strategy that brings AI to your data
Cost is just one consideration in an increasingly AI-driven world. AI is only as valuable as the data it connects with. But where does this data live? Gartner predicts 75% of enterprise-generated data will be created and processed outside a traditional centralized data center or cloud by 2025[3]. This means organizations must prioritize the safety and sovereignty of their data, safeguard access and governance, and increase transparency.
Given the insights from past experiences with cloud computing, organizations should consider whether an “on-prem first” strategy, or bringing AI to the data, makes the most sense in an AI world, especially for applications where control over data and compliance is crucial. This approach allows businesses to build from a strong foundation of existing infrastructure and leverage AI services in the public cloud strategically where it makes sense. It also allows organizations to sidestep issues such as data gravity, avoiding the need to rethink strategies and the challenges associated with rebalancing years down the road.
Continuously evaluate, learn, and adapt
GenAI, like all areas of technology, will evolve. What works today might not be your ideal strategy tomorrow. The goal should be to build from a place that offers you the most flexibility and reusability while keeping you in complete control of your data, infrastructure, and management. Staying adaptable can also mean working with the right partners, who can advise on technological advancements and help you adjust strategy on the fly.
Sunny skies ahead in the AI era
The shift toward GenAI is an opportunity to apply hard-earned lessons from the cloud computing era. By making cost efficiencies a priority from day one, considering an “on-prem first” approach, and continuously adapting and learning throughout the process, IT organizations can deploy GenAI in a way that is both innovative and sustainable. This balanced approach will help avoid past mistakes and build a robust foundation for future innovation.
Learn more about the Dell AI Factory in this webinar.
[1] The economic potential of generative AI: The next productivity frontier, McKinsey
[2] Maximizing AI ROI: Inferencing On-premises With Dell Technologies Can Be 75% More Cost-effective Than Public Cloud, Enterprise Strategy Group, April 2024
[3] What Edge Computing Means for Infrastructure and Operations Leaders, Gartner