Traditionally, the main benefit that generative AI technology offered DevOps teams was the ability to produce things, such as code, quickly and automatically.
But not all DevOps work involves generating things. Much of it centers on performing actions, like modifying cloud service configurations, deploying applications or merging log files, to name just a handful of examples. Traditional generative AI workflows aren’t very useful for needs like these because they can’t easily access DevOps tools or data.
Thanks to the Model Context Protocol (MCP), however, DevOps teams now enjoy a litany of new ways to take advantage of AI. MCP makes it possible to integrate AI into a wide variety of common DevOps workflows that extend beyond familiar use cases like code generation.
Imagine, for example, asking an LLM which Amazon S3 storage buckets or Azure storage accounts contain publicly accessible data, then having it change their access settings. Or having an LLM identify items in an Amazon DynamoDB table that haven’t been updated in over a year and delete or archive them. With MCP, DevOps practitioners can carry out actions like these automatically, using natural-language prompts.
MCP is poised to become a very big deal in DevOps (and, for that matter, beyond), which is why now is the time for DevOps teams to learn how and why to take advantage of this important AI innovation.
What is MCP?
To understand the role of MCP in DevOps, you must first understand what MCP means and how it works.
MCP, which Anthropic introduced as an open standard in late 2024, is a protocol for connecting AI models to external tools and data sources. It provides an efficient, standardized way of building AI-powered agents that can perform actions in response to natural-language requests from users.
The MCP standard works using a server-client architecture. MCP servers provide the functionality necessary to carry out actions, like modifying files or managing databases. MCP clients are typically AI agents that serve as intermediaries between MCP servers and AI models. When users ask an MCP client to help them do something, the client uses an AI model to process the request. It then uses the results to tell the MCP server which actions to perform.
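Under the hood, clients and servers exchange JSON-RPC 2.0 messages; a tool invocation arrives as a `tools/call` request. The following stdlib-only Python sketch simulates that dispatch loop. The tool name `make_bucket_private` and its handler are hypothetical illustrations, not part of the spec, and a real server would use an MCP SDK rather than hand-rolled dispatch:

```python
import json

# Hypothetical tool registry an MCP server might expose.
TOOLS = {
    "make_bucket_private": lambda args: f"updated ACL on {args['bucket']}",
}

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 'tools/call' request the way an MCP server would."""
    req = json.loads(raw)
    tool = TOOLS[req["params"]["name"]]
    result = tool(req["params"]["arguments"])
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req["id"],
        "result": {"content": [{"type": "text", "text": result}]},
    })

# A client (the AI agent) would send something like this after the model
# decides which tool to invoke:
request = json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "make_bucket_private",
               "arguments": {"bucket": "logs-archive"}},
})
print(handle_request(request))
```

The key point is the division of labor: the model only chooses the tool and its arguments; the server owns the actual execution.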
MCP is a big deal because, before it debuted, there was no easy or efficient way to connect AI models to external tools and data beyond writing custom, tool-specific integrations (which is how tools like GitHub Copilot use AI models to help write code) or asking questions through a chatbot interface like ChatGPT. As a result, most AI use cases were limited to asking models to do things like summarize information. And if you wanted to include custom data, such as an internal documentation file, as part of the context during an interaction with a model, you had to add it manually.
But with MCP, developers can write applications that integrate AI into a variety of other types of workflows. They can also automate the process of connecting AI models to custom data sources, making it much easier to factor unique context into AI-powered automations. And they can do it all using a standardized protocol that works with virtually all AI models and agent frameworks.
MCP for DevOps
For DevOps engineers, MCP opens the door to a multitude of powerful use cases. Here’s a look at some key examples:
Problem analysis
DevOps engineers spend a lot of time answering technical questions, such as “Can host X communicate over port Y?” or “Which of my S3 buckets are publicly accessible, and which types of data do they contain?”
Traditionally, answering questions like these meant parsing configuration files manually, or writing a script to collect the information. But with an MCP server that can connect to the appropriate data sources and tools, DevOps teams can ask their questions in natural language and let AI answer them. This is a prime example of how MCP can help DevOps teams work faster.
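The “can host X communicate over port Y?” question, for instance, maps naturally to a small tool function a server could expose. The function name and signature below are illustrative, not drawn from any particular MCP server:

```python
import socket

def check_port(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Registered as an MCP tool, this lets the model answer connectivity questions from live checks rather than from guesses or stale documentation.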
Enhance DevOps tools
MCP makes it possible to enhance and extend the functionality of a variety of DevOps tools.
As an example, DevOps teams could use MCP to connect custom data sources to coding tools like VS Code or Copilot. With this approach, the tools gain AI-powered capabilities that factor in unique contextual elements, like a custom codebase that would not otherwise be accessible to an AI model. The result is the ability to generate code that aligns more closely with an organization’s unique requirements.
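In practice, wiring a server into an editor is mostly configuration. As a rough sketch of what this looks like in VS Code, a workspace-level `.vscode/mcp.json` file can register a locally launched server; the server name and script here are placeholders, and the exact schema may differ by editor version, so check your tool’s documentation:

```json
{
  "servers": {
    "internal-docs": {
      "command": "python",
      "args": ["docs_server.py"]
    }
  }
}
```

Once registered, the editor’s AI features can call the server’s tools without any model-specific glue code.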
Cloud management
Using MCP servers designed to interact with public cloud services, DevOps teams can automate and scale cloud management processes.
For instance, imagine you want to find S3 buckets that contain a certain type of data resource and, if the buckets are publicly accessible, modify their configurations. With an MCP server that supports S3, you’d be able to ask a client to make these changes, then let an AI model find relevant buckets and update their configurations for you automatically. In this case, you’d be using AI not just to sort through information specific to your organization, but to automate actions as well.
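The decision logic behind such a request might look like the sketch below. The data shape is simplified and hypothetical; a real MCP server would pull the bucket inventory from the AWS APIs (for example, via boto3) rather than from an in-memory list:

```python
def buckets_to_remediate(buckets):
    """Return names of buckets that are public AND hold the flagged data type."""
    return [b["name"] for b in buckets
            if b["public"] and "pii" in b["tags"]]

# Stand-in for an inventory a server would fetch from the cloud provider.
inventory = [
    {"name": "web-assets",  "public": True,  "tags": ["static"]},
    {"name": "customer-db", "public": True,  "tags": ["pii"]},
    {"name": "backups",     "public": False, "tags": ["pii"]},
]
print(buckets_to_remediate(inventory))  # only 'customer-db' matches both criteria
```

The model supplies the intent (“find public buckets with this data and lock them down”); the server supplies deterministic logic like this, plus the API calls to apply the fix.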
Documentation ingestion
Although AI services like ChatGPT offer features for connecting to custom documentation databases, the capabilities and integration processes vary from one AI tool to the next. This makes it challenging to support DevOps use cases in which models must ingest documentation databases — often a prerequisite for tasks that depend on information available only in a given database.
MCP solves this challenge by making it possible to create model-agnostic connectors to platforms like Confluence or SharePoint sites. Once they’ve established a connection, DevOps teams can use AI to search through the databases or automate actions based on information within them.
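A minimal, model-agnostic search tool over ingested documentation could look like the following sketch. The in-memory page store stands in for a real Confluence or SharePoint connector, and the matching is deliberately naive; a production connector would use the platform’s own search API or an index:

```python
def search_docs(pages: dict, query: str) -> list:
    """Return titles of pages whose body mentions the query (case-insensitive)."""
    q = query.lower()
    return [title for title, body in pages.items() if q in body.lower()]

# Stand-in for pages pulled from a documentation platform.
pages = {
    "Deploy runbook": "Steps for deploying to the staging cluster.",
    "On-call guide":  "Escalation paths and paging policy.",
}
print(search_docs(pages, "staging"))  # -> ['Deploy runbook']
```

Because the tool, not the model, owns the connection, the same connector works regardless of which AI model the client pairs it with.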
The limitations of MCP for DevOps
As a flexible and open protocol, MCP is subject to few limitations in terms of which DevOps use cases it can support. Indeed, it’s tough to think of something that it’s technically impossible to achieve using MCP.
But in a practical sense, certain usability and security challenges can make MCP difficult to deploy for certain types of tasks.
From a usability perspective, one major challenge at present is that most MCP servers run locally and require various types of resources, like a Python interpreter, in order to work. This means that DevOps engineers have to invest some time in configuring their environments to support MCP.
Additionally, there is a risk that a configuration that works for one MCP server might not work for another due to issues like having a different version of Python installed than the one that the server expects. Containerizing servers is a way to work around this challenge, but only if you’re willing to install all of the tooling necessary to run containers, which brings you back to the problem of having to invest a lot of effort in setting up your environment for MCP.
These challenges are solvable, and they’re by no means a reason not to take advantage of MCP. But DevOps engineers need to understand that, although MCP solutions can feel almost like magic once they’re up and running, the setup process is not magical.
As for security, MCP agents are subject to all of the risks that come with any type of LLM-based technology. They have the potential to leak sensitive data because any resources that are available to an MCP server could become exposed to a third-party AI model. A potential solution is to avoid third-party models by hosting models locally (or on a server located behind a firewall) instead, but not all models support this approach, and it adds to MCP setup challenges.
MCP servers could also potentially carry out actions that you don’t want them to perform, like deleting critical resources. To control for this risk, it’s important to apply a least-privilege approach to MCP server design and management by ensuring that servers can access only the minimum resources necessary to support a target use case. An MCP server’s capabilities are bounded by the access level of the user or credentials it runs under, so by restricting those privileges, admins also restrict MCP security risks.
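In AWS terms, a least-privilege posture for a read-only S3 inspection server might look like the following policy sketch. The bucket ARN is a placeholder, and the action list should be scoped to your own use case; notably, there are no write or delete permissions, so even a misbehaving model cannot modify anything through this server:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketAcl",
        "s3:GetBucketPolicyStatus"
      ],
      "Resource": "arn:aws:s3:::example-bucket"
    }
  ]
}
```

A separate, more privileged server (or an approval step) can then gate any remediation actions.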
MCP and the future of AI in DevOps
To be sure, MCP is not perfect. But it constitutes a huge leap forward in terms of how DevOps teams can leverage AI. It’s also a technology that’s here and now, and that DevOps engineers can start using today. Going forward, it’s likely that MCP will become as integral to DevOps as technologies like CI/CD.
Derek Ashmore is AI enablement principal at Asperitas, where his focus is on DevSecOps, infrastructure code, cloud computing, containerization, making applications cloud-native and migrating applications to the cloud. His books include “The Java EE Architect’s Handbook” and “Microservices for Java EE Architects.”