This article explains how to integrate Spring AI with external MCP servers that provide APIs for popular tools such as GitHub and SonarQube. Spring AI provides built-in support for MCP clients and servers. In this article, we will use only the Spring MCP client. If you are interested in more details on building MCP servers, please refer to the following post on my blog. MCP has recently become very popular, and you can easily find an MCP server implementation for almost any existing technology.

You can actually run MCP servers in many different ways. Ultimately, they are just ordinary applications whose task is to make a given tool available via an API compatible with the MCP protocol. The most popular AI coding tools, such as Claude Code, Codex, and Cursor, make it easy to run any MCP server. I will take a slightly unusual approach and use the support provided with Docker Desktop, namely the MCP Toolkit.

My idea for today is to build a simple Spring AI application that communicates with MCP servers for GitHub, SonarQube, and CircleCI to retrieve information about my repositories and projects hosted on those platforms. The Docker MCP Toolkit provides a single gateway that distributes incoming requests among running MCP servers. Let’s see how it works in practice!

Source Code

Feel free to use my source code if you’d like to try it out yourself. To do that, just clone my sample GitHub repository and follow my instructions. The repository contains several sample applications; the one for this article is in the spring-ai-mcp/external-mcp-sample-client directory.

Getting Started with Docker MCP Toolkit

First, run Docker Desktop. You can find more than 300 popular MCP servers to run in the “Catalog” tab. Next, search for the SonarQube, CircleCI, and GitHub Official servers (note that there are several other GitHub-related servers as well). To be honest, I encountered unexpected issues running the CircleCI server, so for now, I based the application on MCP communication with GitHub and SonarCloud.

Each MCP server usually requires some configuration, such as an authorization token or a service address. Therefore, before adding a server to the Docker MCP Toolkit, you must first configure it as shown below. Only then should you click the “Add MCP server” button.

(Screenshot: SonarQube MCP server configuration in the Docker MCP Toolkit)

For the GitHub MCP server, in addition to entering the token itself, you must also authorize it via OAuth. Here, too, the MCP Toolkit provides graphical support. After entering the token, go to the “OAuth” tab to complete the process.

(Screenshot: GitHub MCP server OAuth authorization in the Docker MCP Toolkit)

This is what your final result should look like before moving on to implementing the Spring Boot application. You have added two MCP servers, which together offer 65 tools.

(Screenshot: the GitHub and SonarQube MCP servers enabled in the Docker MCP Toolkit)

To make both MCP servers available outside of Docker, you need to run the Docker MCP gateway. In the default stdio mode, the API is not exposed outside Docker. Therefore, you need to change the mode to streaming using the transport parameter, as shown below. The gateway is exposed on port 8811.
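The exact syntax may differ between Docker MCP Toolkit versions, but based on the transport and port described above, the command should look roughly like this (check docker mcp gateway run --help for the flags available in your installation):

```shell
# Start the Docker MCP gateway with the streamable HTTP transport on port 8811
docker mcp gateway run --transport streaming --port 8811
```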

This is what it looks like after launch. Additionally, the Docker MCP gateway is secured by an API token. This will require appropriate settings on the MCP client side in the Spring AI application.

(Screenshot: Docker MCP gateway startup logs)

Integrate Spring AI with External MCP Servers

Prepare the MCP Client with Spring AI

Let’s move on to implementing our sample application. We need to include the Spring AI MCP client and the library that communicates with the LLM model. For me, it’s OpenAI, but you can use many other options available through Spring AI’s integration with popular chat models.
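Assuming a Maven build and the Spring AI BOM, the two dependencies could look like the snippet below. The starter names follow the Spring AI 1.x naming convention, so verify them against the Spring AI version you use:

```xml
<!-- MCP client with the WebFlux-based streamable HTTP transport -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-mcp-client-webflux</artifactId>
</dependency>
<!-- Integration with the OpenAI chat model -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-openai</artifactId>
</dependency>
```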

Our MCP client must authenticate itself to the Docker MCP gateway using an API token. Therefore, we need to modify the Spring WebClient used by Spring AI to communicate with MCP servers. It is best to use the ExchangeFilterFunction interface to create an HTTP filter that adds the appropriate Authorization header with the bearer token to the outgoing request. The token will be injected from the application properties.
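Here is a minimal sketch of such a filter. The configuration class name (McpClientConfig) and the custom mcp.token property are my assumptions:

```java
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpHeaders;
import org.springframework.web.reactive.function.client.ClientRequest;
import org.springframework.web.reactive.function.client.ExchangeFilterFunction;
import reactor.core.publisher.Mono;

@Configuration
public class McpClientConfig {

    // Token injected from the application properties (hypothetical mcp.token property)
    @Value("${mcp.token}")
    private String token;

    // HTTP filter that adds the Authorization header with the bearer token
    // to every outgoing request sent to the Docker MCP gateway
    ExchangeFilterFunction authorizationFilter() {
        return ExchangeFilterFunction.ofRequestProcessor(request ->
                Mono.just(ClientRequest.from(request)
                        .header(HttpHeaders.AUTHORIZATION, "Bearer " + token)
                        .build()));
    }
}
```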

Then, let’s set the previously implemented filter for the default WebClient builder.
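Continuing the hypothetical McpClientConfig class from above, the filter can be registered on a WebClient.Builder bean, assuming the Spring AI MCP WebFlux autoconfiguration reuses a builder provided by the application:

```java
// Inside the same McpClientConfig class
// (imports: org.springframework.context.annotation.Bean,
//           org.springframework.web.reactive.function.client.WebClient)
@Bean
WebClient.Builder webClientBuilder() {
    // Every WebClient created from this builder sends the Authorization header
    return WebClient.builder().filter(authorizationFilter());
}
```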

After that, we must configure the MCP gateway address and token in the application properties. To achieve that, we must use the spring.ai.mcp.client.streamable-http.connections property. The MCP gateway listens on port 8811. The token value will be read from the MCP_TOKEN environment variable.
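An example application.yml fragment is shown below. The connection name (docker-gateway) and the custom mcp.token property are my choices, while the spring.ai.mcp.client.streamable-http.connections prefix comes from Spring AI:

```yaml
spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY}
    mcp:
      client:
        streamable-http:
          connections:
            docker-gateway:
              url: http://localhost:8811
mcp:
  token: ${MCP_TOKEN} # read by the WebClient authorization filter
```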

Implement Application Logic with Spring AI and OpenAI Support

The concept behind the sample application is quite simple. It involves creating a dedicated @RestController for each platform exposed through an MCP server. In each of them, I will create a simple prompt to request the number of repositories or projects in my account on the given platform. Let’s start with SonarCloud. Each implementation uses the Spring AI ToolCallbackProvider bean to expose the tools offered by the MCP servers to the LLM model.
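A sketch of such a controller is shown below. The /sonarqube path, the /projects endpoint, and the prompt wording are illustrative assumptions rather than the exact code from the repository:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/sonarqube")
public class SonarQubeController {

    private final ChatClient chatClient;

    SonarQubeController(ChatClient.Builder chatClientBuilder,
                        ToolCallbackProvider toolCallbackProvider) {
        // Register all tools discovered through the MCP gateway with the chat client
        this.chatClient = chatClientBuilder
                .defaultToolCallbacks(toolCallbackProvider.getToolCallbacks())
                .build();
    }

    @GetMapping("/projects")
    String countProjects() {
        return chatClient.prompt()
                .user("How many projects do I have in my SonarCloud account?")
                .call()
                .content();
    }
}
```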

Below is a very similar implementation for GitHub MCP. This controller is exposed under the /github context path.
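A corresponding sketch is shown here; the /repositories endpoint and the prompt text are again assumptions:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/github")
public class GithubController {

    private final ChatClient chatClient;

    GithubController(ChatClient.Builder chatClientBuilder,
                     ToolCallbackProvider toolCallbackProvider) {
        this.chatClient = chatClientBuilder
                .defaultToolCallbacks(toolCallbackProvider.getToolCallbacks())
                .build();
    }

    @GetMapping("/repositories")
    String countRepositories() {
        return chatClient.prompt()
                .user("How many repositories do I have in my GitHub account?")
                .call()
                .content();
    }
}
```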

Finally, there is the controller implementation for CircleCI MCP. It is available externally under the /circleci context path.

The last controller implementation is a bit more complex. First, I need to instruct the LLM model to generate project names in SonarQube and specify my GitHub username. This will not be part of the main prompt. Rather, it will be the system role, which guides the AI’s behavior and response style. Therefore, I’ll create the SystemPromptTemplate first. The user role prompt accepts an input parameter specifying the name of my GitHub repository. The response should combine data on the last commit in a given repository with the status of the most recent SonarQube analysis. In this case, the LLM will need to communicate with two MCP servers running with Docker MCP simultaneously.
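A possible shape of this controller is sketched below. The /repo path, the github.username property, and the project-key naming rule in the system prompt are illustrative assumptions:

```java
import java.util.Map;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.prompt.SystemPromptTemplate;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/repo")
public class RepositoryStatusController {

    // System prompt guiding the model: who I am and how SonarQube project keys are built
    private static final String SYSTEM_TEMPLATE = """
            My GitHub username is {username}.
            SonarQube project keys follow the pattern <github-username>_<repository-name>.
            Use the GitHub tools for commit data and the SonarQube tools for analysis results.
            """;

    private final ChatClient chatClient;

    // Hypothetical property holding the GitHub username
    @Value("${github.username}")
    private String username;

    RepositoryStatusController(ChatClient.Builder chatClientBuilder,
                               ToolCallbackProvider toolCallbackProvider) {
        this.chatClient = chatClientBuilder
                .defaultToolCallbacks(toolCallbackProvider.getToolCallbacks())
                .build();
    }

    @GetMapping("/{repository}")
    String repositoryStatus(@PathVariable String repository) {
        String systemMessage = new SystemPromptTemplate(SYSTEM_TEMPLATE)
                .render(Map.of("username", username));
        return chatClient.prompt()
                .system(systemMessage)
                .user("Find the last commit in the %s repository and the status of its most recent SonarQube analysis."
                        .formatted(repository))
                .call()
                .content();
    }
}
```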

Before running the app, we must set two required environment variables that contain the OpenAI and Docker MCP gateway tokens.
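Assuming the property mapping from the application.yml fragment above, the variables can be set like this:

```shell
export OPENAI_API_KEY=<your-openai-api-key>
export MCP_TOKEN=<your-docker-mcp-gateway-token>
```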

Finally, we can run our Spring Boot app with the following command.
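For a Maven-based project, started from the spring-ai-mcp/external-mcp-sample-client directory, this is typically:

```shell
mvn spring-boot:run
```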

Firstly, I’m going to ask about the number of my GitHub repositories.

Then, I can check the number of projects in my SonarCloud account.

Finally, I can choose a specific repository and verify the last commit and the current analysis status in SonarCloud.
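For illustration, the three calls could look like the examples below. Of the paths used here, only /github is stated explicitly above; the other paths and the default 8080 port are my assumptions:

```shell
# Number of GitHub repositories
curl http://localhost:8080/github/repositories

# Number of SonarCloud projects (path assumed)
curl http://localhost:8080/sonarqube/projects

# Last commit and latest SonarQube analysis for a chosen repository (path assumed)
curl http://localhost:8080/repo/sample-spring-boot-kafka
```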

Here’s the LLM answer for my sample-spring-boot-kafka repository. You can perform the same exercise for your repositories and projects.

(Screenshot: LLM response for the sample-spring-boot-kafka repository)

Conclusion

Spring AI, combined with the MCP client, opens a powerful path toward building truly tool-aware AI applications. By using the Docker MCP Gateway, we can easily host and manage MCP servers such as GitHub or SonarQube consistently and reproducibly, without tightly coupling them to our application runtime. Docker provides a user-friendly interface for managing MCP servers, giving users access to everything through a single MCP gateway. This approach is particularly convenient during application development, since the MCP servers run and are managed entirely outside the application.
