C++ RESTful web service with autogenerated MCP Server to connect with LLMs
- 26 Dec, 2024
This is a 5-minute tutorial on how to add an MCP (Model Context Protocol) Server to your existing Oat++ application so that LLMs can query your API.
Prerequisites
Before starting, you need a working web service built with Oat++.
If you don’t have a working Oat++ web service yet, you can create one by following this tutorial: C++ RESTful web service with Swagger-UI and auto-documented endpoints. Alternatively, you can use one of the Oat++ example projects.
In this tutorial, we’ll use the example-crud project for demonstration purposes.
Add oatpp-mcp to your project
The oatpp-mcp module implements Anthropic’s Model Context Protocol (MCP). It allows you to create standalone MCP servers by manually defining Tools, Prompts, and Resources that will be served to the LLM. Additionally, it can automatically generate Tools from your ApiController.
Step 1: Clone oatpp-mcp
git clone https://github.com/oatpp/oatpp-mcp
Step 2: Build and Install oatpp-mcp
cd oatpp-mcp/
mkdir build && cd build/
cmake ..
sudo make install
Step 3: Link oatpp-mcp to Your Application
Once installed, you can link the oatpp-mcp module to your application. Update your example-crud/CMakeLists.txt as follows:
find_package(oatpp 1.4.0 REQUIRED)
find_package(oatpp-swagger 1.4.0 REQUIRED)
find_package(oatpp-mcp 1.4.0 REQUIRED) # <-- Add this
find_package(oatpp-sqlite 1.4.0 REQUIRED)
target_link_libraries(crud-lib
# Oat++
PUBLIC oatpp::oatpp
PUBLIC oatpp::oatpp-swagger
PUBLIC oatpp::oatpp-mcp # <-- And this
PUBLIC oatpp::oatpp-sqlite
)
Expose your REST API via MCP Tools
Oat++ MCP Server supports both HTTP-SSE (Server-Sent Events) and STDIO transports. Depending on which transport you use, there are small differences in how it works and how you run the MCP Server.
HTTP-SSE transport
Let’s start with the SSE transport.
In the case of the HTTP-SSE transport, MCP is served on the same port as your REST API. The MCP Server will add two endpoints to the router:
- GET {prefix}/sse, used for server-sent events
- POST {prefix}/sessions/{sessionId}, used for events sent by the LLM
Modify your App.cpp (in my case it’s example-crud/src/App.cpp)…
Step 1: Create MCP Server
#include "oatpp-mcp/Server.hpp"
...
oatpp::mcp::Server mcpServer; //<-- create mcpServer instance
Step 2: Add MCP Server’s Endpoints to the Router
router->addController(mcpServer.getSseController());
Step 3: Specify Which Endpoints MCP Server Should Serve
mcpServer.addEndpoints(userController->getEndpoints());
That’s it! Now LLMs can connect to your server and its API via MCP.
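To see how the three steps fit together, here is a minimal sketch of run() as it might look in example-crud. The surrounding names (the UserController include path, the router component) are assumptions based on that project's layout, not a verbatim listing:
#include "oatpp-mcp/Server.hpp"
#include "controller/UserController.hpp" // assumed path in example-crud

void run() {

  /* Router component registered in AppComponent (standard oatpp component pattern) */
  OATPP_COMPONENT(std::shared_ptr<oatpp::web::server::HttpRouter>, router);

  /* REST controller whose endpoints will be exposed as MCP Tools */
  auto userController = std::make_shared<UserController>();
  router->addController(userController);

  oatpp::mcp::Server mcpServer;                            // Step 1: create MCP Server
  router->addController(mcpServer.getSseController());     // Step 2: add SSE endpoints to the router
  mcpServer.addEndpoints(userController->getEndpoints());  // Step 3: auto-generate Tools from the controller

  /* ...connection provider, connection handler, and server.run() remain as in example-crud... */
}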
For a quick test we can use MCP Inspector:
- Select “SSE” in the transport dropdown.
- Enter the “SSE” endpoint under “URL” (should be http://localhost:<port>/sse).
- Connect and go to the “Tools” tab.
For details on how to run MCP Inspector, please refer to the MCP Inspector GitHub repository.
STDIO transport
In the case of STDIO transport, the MCP Server doesn’t add any endpoints to the Router. Instead, it listens to the STDIO stream and forwards API calls to the ApiController. Additionally, we need to silence all logging by redirecting logs to a file to avoid disturbing the STDIO stream.
Modify your App.cpp (in my case it’s example-crud/src/App.cpp)…
Step 1: Create MCP Server
#include "oatpp-mcp/Server.hpp"
...
oatpp::mcp::Server mcpServer; //<-- create mcpServer instance
Step 2: Specify Which Endpoints MCP Server Should Serve
mcpServer.addEndpoints(userController->getEndpoints());
Step 3: Listen to the STDIO stream in a thread running parallel to your HTTP server thread.
// std::thread requires #include <thread> at the top of App.cpp
std::thread http([&server, connectionProvider]{
  OATPP_LOGd("Server", "Running on port {}...", connectionProvider->getProperty("port").toString());
  server.run();
});
std::thread mcp([&mcpServer]{
  OATPP_LOGd("MCP Server", "Serving via STDIO...");
  mcpServer.stdioListen();
});
http.join();
mcp.join();
Step 4: Redirect Logging to file
// FileLogger needs #include <fstream> and #include <mutex>
class FileLogger : public oatpp::Logger {
private:
  std::mutex m_mutex;
public:
  void log(v_uint32 priority, const std::string& tag, const std::string& message) override {
    std::lock_guard<std::mutex> lock(m_mutex);
    std::ofstream fout(LOG_FILE, std::ios_base::app); // append, so earlier messages aren't overwritten
    fout << tag << ": " << message << std::endl;
  }
};
...
oatpp::Environment::init(
std::make_shared<FileLogger>()
);
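For orientation, here is a minimal sketch of where the logger is installed. LOG_FILE is a hypothetical placeholder (define it to whatever path suits your setup), and the main() shape follows example-crud:
#include <fstream>
#include <mutex>

#define LOG_FILE "/tmp/crud-mcp.log" // hypothetical path, adjust to your setup

/* FileLogger class from Step 4 goes here */

int main(int argc, const char* argv[]) {

  /* Install the file logger before anything else logs,
     so stdout/stderr stay free for the STDIO transport */
  oatpp::Environment::init(std::make_shared<FileLogger>());

  run();

  oatpp::Environment::destroy();
  return 0;
}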
That’s it! We can test it as we did previously with MCP Inspector, or we can use Claude Desktop to see how the LLM interacts with our service.
Test with Claude for Desktop
Step 1: Add your MCP Server to Claude config
On Mac (for other platforms please refer to Claude’s docs):
vi ~/Library/Application\ Support/Claude/claude_desktop_config.json
Add your server executable (in my case it’s crud-exe, the example-crud service):
{
  "mcpServers": {
    "test-tool": {
      "command": "/path/to/server/executable"
    }
  }
}
Step 2: Relaunch the Claude application and ask the LLM to query your service.
As you can see, Claude has successfully executed all basic CRUD operations via the MCP Server. All we had to do was add a few lines of code, and the rest was generated by the Oat++ Framework.
For the complete example, please clone the example-crud project (add_mcp_server branch).
git clone -b add_mcp_server https://github.com/oatpp/example-crud
Useful links
- oatpp-mcp: https://github.com/oatpp/oatpp-mcp
- example-crud (add_mcp_server branch): https://github.com/oatpp/example-crud