The missing piece in every failed AI/BI rollout is already on your data team

There’s a conversation happening in every data org right now. It goes something like this:

“If AI can answer business questions in seconds, what exactly are we paying our data analysts to do?”

It’s a fair question. And if you’re asking it, you’re probably looking at the problem the wrong way.

I’ve spent the last few years working side by side with data teams at Fortune 500 companies like ConocoPhillips and Cisco. What I’ve watched unfold is not the obsolescence of the data analyst. It’s the beginning of their most important chapter yet, if they’re willing to step into it.

Let me explain. But first, I need to take you back to where we started.

The old world: BI as a relay race

For the better part of the last decade, Business Intelligence worked like a relay race. The baton passed through many hands before a business user ever got an answer.

It started with a request. A VP of Sales would send a Slack message to the data team: “Can you build me a view of pipeline coverage by region, segmented by deal size and expected close date?” Simple enough, in theory.

What happened next was anything but simple.

The data analyst would first go spelunking in the data warehouse. Which tables held the CRM data? Was it in Salesforce, synced to Snowflake or still sitting in a legacy system? Were the field names consistent? Did close_date in one schema mean the same thing as expected_close in another? This data prep phase alone — cleaning, joining, validating — could consume two to three days before a single chart was drawn. Research has long confirmed what every analyst already knows in their bones: The preparation work can swallow the majority of their time, leaving precious little for actual analysis.

Then came the workbook. The analyst would build a Tableau dashboard or a Looker Explore, carefully constructing the logic. In Looker, this meant writing LookML: Defining views, dimensions, measures and the relationships between them. This is the semantic layer — the translation dictionary between raw database columns and business-friendly concepts like “pipeline coverage” or “at-risk deals.” It’s sophisticated work. It requires understanding both the technical data model and the intent of the business question.
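To make the semantic layer concrete, here is a minimal sketch of what a definition like that encodes, using plain Python dicts as a stand-in for LookML. Every table, dimension, and measure name here is illustrative, not taken from any real schema:

```python
# Sketch of a semantic-layer entry: the "translation dictionary" between raw
# warehouse columns and a business concept like "pipeline coverage."
# All names (analytics.sf_opportunities, quota_amount, etc.) are hypothetical.
PIPELINE_COVERAGE_VIEW = {
    "view": "opportunities",
    "sql_table_name": "analytics.sf_opportunities",  # assumed warehouse table
    "dimensions": {
        "region":      {"sql": "region", "type": "string"},
        "deal_size":   {"sql": "CASE WHEN amount < 50000 THEN 'SMB' ELSE 'Enterprise' END"},
        "close_month": {"sql": "DATE_TRUNC('month', expected_close)", "type": "date"},
    },
    "measures": {
        "open_pipeline": {"sql": "SUM(amount)", "filters": {"stage": "-Closed%"}},
        "quota":         {"sql": "SUM(quota_amount)"},
        # The business-friendly concept: open pipeline relative to quota.
        "pipeline_coverage": {"sql": "open_pipeline / NULLIF(quota, 0)", "type": "ratio"},
    },
}
```

The point is not the syntax but the mapping: a business phrase on one side, validated joins and formulas on the other.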

Once the semantic model was right, the analyst would build the dashboard itself — choosing the right visualization, applying filters, establishing drill-down hierarchies. Then a round of review with the stakeholder. Revisions. More revisions. Finally, the dashboard was published.

The business user got their answer, often one to two weeks after they asked the question.

And then a slightly different question would come in, and the whole cycle would begin again.

This wasn’t a failure of the data team. It was a structural problem. The old model of BI was built for a world where data was scarce, questions were infrequent and business moved slowly enough that a two-week turnaround was acceptable. That world no longer exists.

The missing ingredient: context

Here’s what keeps getting overlooked in the “AI replaces analysts” conversation: AI doesn’t know your business.

A large language model is trained on the internet. It knows what “churn rate” means in the abstract. It does not know that at your company, “churn” excludes accounts that downgraded but didn’t cancel, per a decision made in Q3 2021 during a board-driven metric refresh. It does not know that your fiscal year ends in October, not December. It does not know that the anomaly in the Southeast region’s numbers last quarter was caused by a one-time restructuring of territory assignments, not a real decline in performance. It does not know that when your CFO asks about “revenue,” she means recognized revenue, not booked, and that your revenue recognition policy is tied to a specific contract milestone that lives in a field called milestone_event_type = 'GO_LIVE' in your ERP.

Without that context, even the most capable AI will produce answers that are technically correct and completely wrong.

This is not a model problem. It is not a data quality problem, though data quality matters. It is a context problem. And it is the central architectural challenge of the AI analytics era.

Gartner reinforces this directly: Organizations that prioritize semantics in AI-ready data will increase their GenAI model accuracy by up to 80% and reduce costs by up to 60%. As they put it, poor semantics lead to greater hallucinations, more tokens required and higher costs. Context is not a nice-to-have. It is the lever.

Consider what “context” actually means in an enterprise setting. It has multiple layers:

  • Data context is the structural knowledge: What tables exist, how they join, what the columns mean, where data comes from and what edge cases cause anomalies. This is what a veteran data engineer carries in their head after three years in a particular data warehouse. It’s the knowledge that customer_id in the CRM doesn’t always match customer_id in the billing system, and here’s the lookup table that reconciles them.
  • Business context is the semantic knowledge: How the organization defines its metrics, which definitions have changed over time, what initiatives are underway that might affect the numbers and which data sources to trust for which questions. It’s knowing that “active user” means something different to the product team than it does to the finance team.
  • Historical context is institutional memory: What questions have been asked before, what anomalies were investigated and why, what decisions were made and on what basis, and what the AI should not learn from because it reflects a one-time event rather than a durable pattern.
  • Presentational context is judgment about how to communicate: Which audiences need which level of detail, when a number needs a narrative, when a trend needs a benchmark and how to frame an insight so it drives action rather than confusion.
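What would it look like to write these four layers down in a form a machine can act on? Here is one hypothetical shape for a single metric’s context entry, reusing the “churn” example from earlier. The field names and structure are illustrative, not any vendor’s actual format:

```python
# Hypothetical machine-readable context entry for one metric ("churn"),
# folding in the four layers described above. Structure and field names
# are a sketch, not a real platform's schema.
CHURN_CONTEXT = {
    "metric": "churn",
    "data_context": {
        "source_tables": ["billing.subscriptions", "crm.accounts"],  # assumed names
        "join_caveat": "crm customer_id does not always match billing customer_id; "
                       "reconcile through the lookup table",
    },
    "business_context": {
        "definition": "Cancelled accounts only; downgrades are NOT churn",
        "decided": "Q3 2021, board-driven metric refresh",
    },
    "historical_context": {
        "known_anomalies": [
            "Southeast dip last quarter: one-time territory restructuring, "
            "not a real performance decline -- do not learn from it",
        ],
    },
    "presentational_context": {
        "default_framing": "Report alongside expansion; benchmark vs trailing quarters",
    },
}
```

An entry like this is exactly the knowledge a veteran analyst carries in their head, externalized so every AI answer can draw on it.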

AI can process information at superhuman speed. It cannot originate context it was never given. One of the more clarifying observations I’ve encountered is from research at Tellius, which noted that “human analysts carry this context in their heads. They remember what they investigated before, what patterns they’ve seen and what explanations they’ve already validated or ruled out. They build institutional knowledge over time.” Today’s AI systems, by contrast, are largely stateless. Every query starts from zero.

That is the gap. And filling it is now the most important job in any data organization.

The rise of the AI context engineer

I want to propose a new title. Not because titles matter, but because names shape how we think about roles, and the role that is emerging deserves a name that captures its true significance: The AI context engineer (ACE).

The ACE is not a dashboard builder. The ACE is not a SQL writer. The ACE is the person who makes AI analytics actually work — by building, curating, governing and continuously refining the context layer that sits between raw enterprise data and intelligent AI responses.

Think about what this role actually requires.

The ACE must understand the business deeply enough to know what questions will be asked, what answers will matter and what edge cases will cause the AI to go wrong. They are, in some sense, the organizational ethnographer — the person who has absorbed years of institutional knowledge and can translate it into something a machine can act on.

The ACE must understand the data architecture well enough to model it accurately — to define not just what columns mean, but how they relate, what their lineage is, what their known quality issues are and when they should and shouldn’t be used.

The ACE must be a curator of history: Documenting past analyses, flagging one-time anomalies, preserving the reasoning behind metric definitions so that future AI-generated answers reflect the organization’s evolving understanding of itself.

The ACE must be a quality controller: Continuously evaluating the AI’s outputs, identifying where the context layer is incomplete or misleading, and closing the gaps before they propagate into bad decisions.

And the ACE must be a translator: Communicating to business stakeholders not just what the data shows, but why the AI answered a question a particular way, and when human judgment should override an automated insight.

This is not a less sophisticated role than the data analyst of the past. It is a vastly more sophisticated one. The data analyst used to be primarily a technical craftsperson — skilled at SQL, at visualization, at data modeling. The ACE is all of that, plus strategist, plus organizational psychologist, plus AI systems architect.

The companies we work with that are getting the most from AI analytics — the ones where adoption doubles month over month, where business users genuinely trust the outputs, where AI is changing how decisions get made — they all have people functioning in this role, even if they don’t call it that yet. They have someone who owns the context layer. Who champions it. Who treats it as a living system that needs ongoing investment.

What an ACE actually does: A day in the life

Let me make this concrete.

A new quarter begins at a mid-sized technology company. The CRO sends a message to the data team: The board wants a new way of looking at net revenue retention — one that breaks out expansion, contraction and churn separately, and accounts for the company’s recent shift from annual to monthly billing cycles.

In the old world, this was a two-week project: Schema discovery, SQL development, semantic model updates, dashboard build, review, revision, publish.

In the AI analytics world, the CRO can ask this question directly — if the context layer is ready to support it. The ACE’s job is to make sure it is.

First, understanding the business intent. That means sitting down with the CRO to understand not just the mechanics of the new metric, but the decision it will inform. What will the CRO do differently if expansion is trending up but contraction is also rising? What benchmark matters — industry average, internal historical trend, competitor proxy?

Second, translating intent into data logic. Where does billing cycle information live? How is a “contraction” event recorded in the system? Is there a field, or does it need to be inferred from a month-over-month delta in contract value? The ACE knows the data well enough to answer these questions without a weeks-long discovery sprint.
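That month-over-month delta logic can be sketched in a few lines of Python. This is a deliberate simplification under stated assumptions — real billing data has proration, currency, and timing wrinkles, and treating a value of zero as churn ignores pauses and plan migrations:

```python
def classify_movement(prev_value: float, curr_value: float) -> str:
    """Classify a customer's month-over-month contract-value change.

    Sketch only: assumes prev_value > 0 and curr_value == 0 means churn,
    which ignores billing pauses, gaps, and plan migrations.
    """
    if prev_value > 0 and curr_value == 0:
        return "churn"
    if curr_value > prev_value:
        return "expansion"
    if curr_value < prev_value:
        return "contraction"
    return "flat"

def nrr(prev: dict, curr: dict) -> float:
    """Net revenue retention over customers present at period start.

    New logos are excluded by construction: we only sum current value
    for customers who existed in the prior period.
    """
    start = sum(prev.values())
    retained = sum(curr.get(cust, 0.0) for cust in prev)
    return retained / start if start else 0.0
```

For example, two customers at $100 each, where one expands to $130 and the other churns, yields an NRR of 0.65 — and the breakdown the board asked for falls out of `classify_movement` per customer.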

Third, encoding the context. Adding the new metric definition, its calculation logic, its relevant filters, its known edge cases and its relationships to adjacent metrics into the context layer. This is the equivalent of writing LookML in the old world — but richer, because it includes not just the formula but the intent, the history and the caveats.

Fourth, validating the AI’s output. Running a battery of test questions to ensure the AI returns the right answer for the right reasons. Not just “is the number correct?” but “does the AI understand when not to use this metric?”
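One way to run such a battery is a small regression harness of golden questions with expected properties, including negative cases where the right answer is to decline. This is a sketch; `ask_ai` is a hypothetical stand-in for whatever query interface the analytics platform exposes:

```python
# Hypothetical validation harness. Each case pairs a question with the
# properties a correct answer must have. `expect_metric` of None means the
# AI should refuse or redirect rather than compute the metric.
GOLDEN_CASES = [
    {"question": "What was NRR last quarter?",
     "expect_metric": "nrr", "expect_range": (0.5, 1.5)},
    # Negative case: NRR is undefined for customers with no prior period.
    {"question": "What is NRR for customers signed this month?",
     "expect_metric": None},
]

def run_battery(ask_ai, cases):
    """Return a list of (question, reason) failures; empty list means pass.

    `ask_ai` is assumed to return a dict like {"metric": ..., "value": ...}.
    """
    failures = []
    for case in cases:
        answer = ask_ai(case["question"])
        if answer.get("metric") != case["expect_metric"]:
            failures.append((case["question"], "wrong metric"))
        elif case.get("expect_range") is not None:
            lo, hi = case["expect_range"]
            if not (lo <= answer["value"] <= hi):
                failures.append((case["question"], "value out of range"))
    return failures
```

The negative case is the important one: it tests whether the context layer taught the AI when not to use the metric, not just how to compute it.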

Fifth, governing ongoing accuracy. As the company’s business model evolves, the ACE monitors the AI’s outputs for drift, flags questions that reveal gaps in the context layer and continuously updates the system.

The CRO gets answers in minutes instead of weeks. But only because the ACE did the upstream work to make that possible.

Why this is good news for data analysts

If you’re a data analyst reading this and wondering whether you have a future, the answer is yes — an extraordinary one. But it requires a shift in how you think about your value.

Your value was never really in writing SQL. It was in knowing which SQL to write. It was in the institutional knowledge that told you which table to trust, which definition to use and which anomaly to flag. It was in the judgment about how to frame a number for a CEO who thinks in stories, not schemas.

AI can write SQL. AI cannot originate the judgment that makes SQL meaningful.

The data points are encouraging. A 2025 Alteryx survey of 1,400 data analysts worldwide found that 87% say their role has become more strategically important in the past year, and 94% say AI is enhancing that strategic nature. Only 17% are worried about being replaced — a sharp reversal from just a year prior, when 65% of data leaders expected AI to take analyst jobs within two to three years. What changed? Analysts who leaned into AI found it made them more powerful, not less necessary.

What you carry in your head — the business context, the data context, the historical patterns, the organizational definitions — is precisely what AI needs and cannot generate on its own. Your job is not to compete with AI at tasks AI can now do faster. Your job is to feed AI the knowledge that makes it worth trusting.

The ACE role is the formalization of that value. Context is not a side effect of good analytics work. It is the core of it. The people who have spent years accumulating that context are exactly the right people to build and steward the systems that make AI analytics possible.

Data analysts are not being sidelined by AI. They are becoming the people who make AI work for everyone else in the organization. That is a remarkable elevation in status, if they’re willing to claim it.

The organizations getting this right

The companies seeing transformational results from AI analytics share a common pattern: They have invested in the context layer, and they have humans who own it.

The organizations still struggling with AI analytics are the ones that deployed a tool and then waited for it to figure out their business. It doesn’t work that way. The AI is the engine. Context is the fuel. Without the fuel, the engine goes nowhere.

A broader Alteryx survey reinforces this pattern: Only 23% of organizations have successfully scaled AI pilots into production, and just 28% fully trust AI to support decision-making. The diagnosis is consistent with what we see every day — trust breaks down when AI is deployed without the business context and logic needed to produce consistent, explainable results.

The lesson for data leaders is clear. The transition from traditional BI to AI analytics is not primarily a technology transition. It is an organizational one. The technology works. What determines whether it delivers value is whether your organization has people who understand that their job is now to build and maintain the context that makes AI trustworthy.

The inflection point

We are at an inflection point in the history of enterprise data. Gartner estimates that by 2028, over half the GenAI models used in enterprises will be domain-specific, with “context emerging as one of the most critical differentiators for successful agent deployments.” The era of static dashboards and reactive reporting is ending. The era of always-on, conversational, proactive AI analytics is here.

The question every data organization needs to answer is not “Will AI replace my analysts?” It’s “Are my analysts ready to become AI context engineers?”

The ones who are — the ones who lean into this shift, take ownership of the context layer and become the bridge between organizational knowledge and AI capability — will find themselves more valued, more strategic and more impactful than any data analyst has ever been.

The ones who wait for someone else to define their role may find that someone else already has.

The missing ingredient for AI to truly transform BI has never been model capability. It has been context. And the people best positioned to provide that context are the analysts who have been building it, living it and protecting it for years.

That’s not a threat to their career. It’s the foundation of their next one.

The missing piece is already on your payroll. Time to make them an ACE.

This article is published as part of the Foundry Expert Contributor Network.
April 8, 2026