How MCP can revolutionize the way DevOps teams use AI

Traditionally, the main benefit that generative AI technology offered DevOps teams was the ability to produce things, such as code, quickly and automatically.

But not all DevOps work involves generating things. Much of it centers on performing actions, like modifying cloud service configurations, deploying applications or merging log files, to name just a handful of examples. Traditional generative AI workflows aren’t very useful for needs like these because they can’t easily access DevOps tools or data.

Thanks to the Model Context Protocol (MCP), however, DevOps teams now have a host of new ways to take advantage of AI. MCP makes it possible to integrate AI into a wide variety of common DevOps workflows that extend beyond familiar use cases like code generation.

Imagine, for example, asking an LLM which Amazon S3 storage buckets or Azure storage accounts contain publicly accessible data, then having it change their access settings. Or imagine having an LLM identify items in an Amazon DynamoDB table that haven’t been updated in over a year and delete or archive them. With MCP, DevOps practitioners can carry out actions like these automatically, using natural language prompts.

MCP is poised to become a very big deal in DevOps (and, for that matter, beyond), which is why now is the time for DevOps teams to learn how and why to take advantage of this important AI innovation.

What is MCP?

To understand the role of MCP in DevOps, you must first understand what MCP means and how it works.

MCP, which Anthropic introduced as an open standard in late 2024, is a protocol for connecting AI models to external tools and data sources. It provides an efficient, standardized way of building AI-powered agents that can perform actions in response to natural-language requests from users.

The MCP standard works using a server-client architecture. MCP servers provide the functionality necessary to carry out actions, like modifying files or managing databases. MCP clients are typically AI agents that serve as intermediaries between MCP servers and AI models. When users ask an MCP client to help them do something, the client uses an AI model to process the request. It then uses the results to tell the MCP server which actions to perform.
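
To make this architecture concrete, here is a minimal sketch of an MCP server written against the official MCP Python SDK’s FastMCP interface; the server name and the merge_log_files tool are illustrative examples, not part of any standard toolset:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("devops-helper")

@mcp.tool()
def merge_log_files(paths: list[str]) -> str:
    """Concatenate the given log files and return the combined text."""
    combined = []
    for path in paths:
        with open(path, "r", encoding="utf-8") as f:
            combined.append(f.read())
    return "\n".join(combined)

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport, so a local client (AI agent) can connect

A client that connects to this server discovers the merge_log_files tool automatically and can invoke it whenever the model decides a user’s request calls for it.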

MCP is a big deal because until it debuted, there was no easy or efficient way of interacting with AI models beyond writing custom, tool-specific integrations (which is how tools like GitHub Copilot use AI models to help write code) or asking questions via a chatbot interface like ChatGPT. As a result, most AI use cases were limited to asking AI models to do things like summarize information. Plus, if you wanted to include custom data, such as an internal documentation file, as part of the context during an interaction with a model, you had to add it manually.

But with MCP, developers can write applications that integrate AI into a variety of other types of workflows. They can also automate the process of connecting AI models to custom data sources, making it much easier to factor unique context into AI-powered automations. And they can do it all using a standardized protocol that works with virtually all AI models and agent frameworks.
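
For illustration, here is a minimal sketch of the client side, assuming the official MCP Python SDK’s stdio transport; the server script name and tool arguments refer back to the hypothetical server sketched above:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the hypothetical local server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="python", args=["devops_helper_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool(
                "merge_log_files", arguments={"paths": ["app.log", "worker.log"]}
            )
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())

In a real agent, the model, rather than hard-coded calls, decides which tool to invoke and with what arguments, based on the user’s natural-language request.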

MCP for DevOps

For DevOps engineers, MCP opens the door to a multitude of powerful use cases. Here’s a look at some key examples:

Problem analysis

DevOps engineers spend a lot of time finding answers to technical questions, such as “Can host X communicate over port Y?” or “Which of my S3 buckets are publicly accessible, and what types of data do they contain?”

Traditionally, answering questions like these required parsing configuration files manually, or possibly writing some kind of script to try to collect the information. But with an MCP server capable of connecting to the appropriate data sources and tools, DevOps teams can ask the questions they need in natural language, then let AI answer them. This is a prime example of how MCP can help DevOps teams work faster.
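
As a hedged sketch of what such a question-answering tool might look like, the snippet below uses boto3 to flag S3 buckets whose public access block is missing or disabled; it is a simplified heuristic (bucket policies and ACLs also matter), and the server name is illustrative:

import boto3
from botocore.exceptions import ClientError
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("s3-audit")

@mcp.tool()
def list_potentially_public_buckets() -> list[str]:
    """Return names of buckets whose public access block is missing or not fully enabled."""
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            conf = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
            if not all(conf.values()):
                flagged.append(name)
        except ClientError:
            # No public access block configured at all; treat as potentially public.
            flagged.append(name)
    return flagged

if __name__ == "__main__":
    mcp.run()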

Enhance DevOps tools

MCP makes it possible to enhance and extend the functionality of a variety of DevOps tools.

As an example, DevOps teams could use MCP to connect custom data sources to coding tools like VS Code or Copilot. With this approach, the tools benefit from AI-powered capabilities that factor in unique contextual elements, like a custom codebase that would not otherwise be accessible to an AI model. The result is the ability to do things like generate code that more closely aligns with unique organizational requirements.
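
As a hedged sketch of one way to wire this up, the snippet below exposes files from a private repository checkout as MCP resources that a coding assistant could pull in as context; the repo:// URI scheme, the checkout path, and the server name are all illustrative assumptions:

from pathlib import Path
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("internal-context")

# Hypothetical local checkout of a private codebase that no hosted model can see directly.
REPO_ROOT = Path("/srv/internal-repo")

@mcp.resource("repo://{filename}")
def read_repo_file(filename: str) -> str:
    """Return the contents of a named file from the internal repository checkout."""
    return (REPO_ROOT / filename).read_text(encoding="utf-8")

if __name__ == "__main__":
    mcp.run()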

Cloud management

Using MCP servers designed to interact with public cloud services, DevOps teams can automate and scale cloud management processes.

For instance, imagine you want to find S3 buckets that contain a certain type of data resource and, if the buckets are publicly accessible, modify their configurations. With an MCP server that supports S3, you’d be able to ask a client to make these changes, then let an AI model find relevant buckets and update their configurations for you automatically. In this case, you’d be using AI not just to sort through information specific to your organization, but to automate actions as well.
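
Continuing the earlier audit example, here is a hedged sketch of the remediation side: an MCP tool that locks down a named bucket using boto3. It assumes the server runs under credentials that are allowed to change bucket settings, and nothing more:

import boto3
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("s3-remediate")

@mcp.tool()
def block_public_access(bucket_name: str) -> str:
    """Enable all four public access block settings on the given bucket."""
    s3 = boto3.client("s3")
    s3.put_public_access_block(
        Bucket=bucket_name,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )
    return f"Public access blocked for {bucket_name}"

if __name__ == "__main__":
    mcp.run()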

Documentation ingestion

Although AI services like ChatGPT provide features for connecting to custom documentation databases, the capabilities they offer and the integration process vary from one AI tool to another. This can make it challenging to support DevOps use cases that require models to ingest documentation databases, which is often a prerequisite for carrying out tasks that depend on information available only from a given database.

MCP solves this challenge by making it possible to create model-agnostic connectors to platforms like Confluence or SharePoint sites. Once they’ve established a connection, DevOps teams can use AI to search through the databases or automate actions based on information within them.
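
As a hedged sketch of such a connector, the snippet below wraps a Confluence Cloud search in an MCP tool; the environment variable names, the CQL query, and the assumption that the base URL points at a /wiki site are all illustrative and would need adjusting for a real instance:

import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("confluence-docs")

@mcp.tool()
def search_confluence(query: str) -> list[dict]:
    """Return titles and links of Confluence pages matching a text query."""
    base_url = os.environ["CONFLUENCE_BASE_URL"]  # e.g. https://example.atlassian.net/wiki
    auth = (os.environ["CONFLUENCE_USER"], os.environ["CONFLUENCE_API_TOKEN"])
    resp = requests.get(
        f"{base_url}/rest/api/content/search",
        params={"cql": f'text ~ "{query}"', "limit": 10},
        auth=auth,
        timeout=30,
    )
    resp.raise_for_status()
    return [
        {"title": page["title"], "url": base_url + page["_links"]["webui"]}
        for page in resp.json().get("results", [])
    ]

if __name__ == "__main__":
    mcp.run()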

The limitations of MCP for DevOps

As a flexible and open protocol, MCP is subject to few limitations in terms of which DevOps use cases it can support. Indeed, it’s tough to think of a DevOps task that is technically impossible to achieve using MCP.

But in a practical sense, certain usability and security challenges can make MCP difficult to deploy for certain types of tasks.

From a usability perspective, one major challenge at present is that most MCP servers run locally and require various types of resources, like a Python interpreter, in order to work. This means that DevOps engineers have to invest some time in configuring their environments to support MCP.

Additionally, there is a risk that a configuration that works for one MCP server might not work for another due to issues like having a different version of Python installed than the one that the server expects. Containerizing servers is a way to work around this challenge, but only if you’re willing to install all of the tooling necessary to run containers, which brings you back to the problem of having to invest a lot of effort in setting up your environment for MCP.

These challenges are solvable, and they’re by no means a reason not to take advantage of MCP. But DevOps engineers need to understand that, although MCP solutions can feel almost like magic once they’re up and running, the setup process is not magical.

As for security, MCP agents are subject to all of the risks that come with any type of LLM-based technology. They have the potential to leak sensitive data because any resources that are available to an MCP server could become exposed to a third-party AI model. A potential solution is to avoid third-party models by hosting models locally (or on a server located behind a firewall) instead, but not all models support this approach, and it adds to MCP setup challenges. 

MCP servers could also potentially carry out actions that you don’t want them to perform, like deleting critical resources. To control for this risk, it’s important to apply a least-privilege approach to MCP server design and management by ensuring that they can only access the minimum resources necessary to support a target use case. The capabilities of MCP servers are limited to the level of security access available to users, so by restricting user privileges, admins can restrict MCP security risks.

MCP and the future of AI in DevOps

To be sure, MCP is not perfect. But it constitutes a huge leap forward in terms of how DevOps teams can leverage AI. It’s also a technology that’s here and now, and that DevOps engineers can start using today. Going forward, it’s likely that MCP will become as integral to DevOps as technologies like CI/CD.

Derek Ashmore is AI enablement principal at Asperitas, where his focus is on DevSecOps, infrastructure code, cloud computing, containerization, making applications cloud-native and migrating applications to the cloud. His books include “The Java EE Architect’s Handbook” and “Microservices for Java EE Architects.”

