Create your own MCP client/server: as easy as 1-2-3 with Otoroshi

Otoroshi with LLM already simplifies managing your various AI providers, access to models and integration with your teams. We have now added simplified management of MCP clients and servers.

AI service providers are on the rise. But increasingly, users aren’t looking for generic answers of varying relevance. They want assistants that can retrieve context from different services and make those services work together.

MCP: Function Calling and beyond

For some time now, models have offered a capability known as Function Calling: when the model identifies a particular pattern in a conversation, it can call a function, passing it parameters, to retrieve a response from a third-party service. Building on this idea, MCP (Model Context Protocol) was born.
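To make this concrete, here is a sketch of what a function declaration handed to a model typically looks like. The format below is an illustrative, OpenAI-style JSON schema; the `get_weather` function and its parameters are invented for the example and not tied to any specific provider:

```python
import json

# Illustrative function declaration, in the JSON-schema style many
# providers expect. The name and parameters are made-up examples.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Retrieve the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# When the model spots a matching pattern, it emits a structured call
# like this instead of a plain-text answer; your code runs the function
# and feeds the result back to the model.
model_call = json.loads('{"name": "get_weather", "arguments": {"city": "Nantes"}}')
print(model_call["name"], model_call["arguments"]["city"])
```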

Presented by Anthropic last November, MCP aims to standardize the way context is provided to large language models (LLMs). It consists of a server that exposes functions and prompts, as well as resources such as text, logs and images, and a client that can list all these elements for use. The protocol, which is open and currently evolving, is based on JSON-RPC 2.0 and two transport modes: stdio and SSE (Server-Sent Events), but others can be exploited. If you want to know more, the technical specifications are detailed here.
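Concretely, an MCP exchange is just JSON-RPC 2.0 messages travelling over one of those transports. As a sketch, here is what a client asking a server for its tools can look like (the `tools/list` method comes from the MCP specification; the `get_weather` tool in the response is an invented example):

```python
import json

# A client asking an MCP server which tools it exposes.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

# A possible server answer. The "get_weather" tool is a made-up example;
# MCP describes each tool's arguments with a JSON schema ("inputSchema").
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",
                "description": "Retrieve the current weather for a city.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

wire = json.dumps(request)  # what actually travels over stdio or SSE
print([t["name"] for t in response["result"]["tools"]])
```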

There are official implementations for Kotlin, Python and TypeScript, and a myriad of unofficial ones. In fact, there are already numerous client/server lists, ranging from the ability to read local files to reading databases or retrieving information from a GitHub account, for example. As well as being integrated into Claude’s desktop client (Anthropic’s chat service), MCP is being adopted by several IDEs and extensions: Cline, Continue, Zed, etc.

In practice, creating an MCP server is not very complex, especially as SDKs can help you and there are a growing number of guides. But how do you expose them to a multitude of models and then integrate them into your tools? That’s where things can get complicated.
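To give an idea of how little is involved, here is a protocol-level sketch of a server loop: read one JSON-RPC message per line, dispatch on the method, write the answer back. This only illustrates the shape of the exchange; a real server would rely on one of the official SDKs, which also handle initialization, capabilities and error reporting. The `echo` tool is invented for the example:

```python
import json
import sys

# Made-up tool registry for the sketch: tool name -> callable.
TOOLS = {"echo": lambda args: args.get("text", "")}

def handle(raw: str) -> str:
    """Handle one JSON-RPC 2.0 message and return the serialized response."""
    msg = json.loads(raw)
    if msg["method"] == "tools/call":
        tool = TOOLS[msg["params"]["name"]]
        result = tool(msg["params"].get("arguments", {}))
        reply = {"jsonrpc": "2.0", "id": msg["id"],
                 "result": {"content": [{"type": "text", "text": result}]}}
    else:
        # JSON-RPC 2.0 reserves -32601 for "Method not found".
        reply = {"jsonrpc": "2.0", "id": msg["id"],
                 "error": {"code": -32601, "message": "Method not found"}}
    return json.dumps(reply)

def serve() -> None:
    # stdio transport: one JSON-RPC message per line on stdin/stdout.
    for line in sys.stdin:
        if line.strip():
            print(handle(line), flush=True)
```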

Otoroshi simplifies access to Function calling and MCP

So in December we started working with the Cloud APIM team to simplify the use of this protocol within Otoroshi with LLM, which we offer in the form of an add-on that can be deployed in just a few clicks. It already natively supported Function Calling, enabling any model to be enriched by functions written in Web Assembly (WASM, via Extism) or JavaScript. This was an opportunity to take it a step further.

The new version deployed on Clever Cloud now manages:

  • MCP Endpoint: for exposing functions to MCP clients;
  • MCP Connectors: for adding functions from an MCP server.

The MCP servers created by Otoroshi can communicate with clients directly via SSE, but also via HTTP requests or WebSocket. It only takes a few minutes to create them. If your client only supports stdio, you can use the proxy developed by the Cloud APIM team to communicate with the server.
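The idea behind such a proxy is simple: read JSON-RPC messages on stdin, forward them to the remote server over HTTP, and write the responses back on stdout. The sketch below illustrates that principle only; it is not the Cloud APIM proxy itself, and the endpoint URL is invented:

```python
import json
import sys
import urllib.request

MCP_ENDPOINT = "https://example.com/mcp"  # hypothetical server URL

def forward(line: str, post=None) -> str:
    """Forward one JSON-RPC message to the HTTP endpoint, return the reply.

    `post` can be injected for testing; by default it performs a real
    HTTP POST with the message as the request body.
    """
    if post is None:
        def post(body: bytes) -> bytes:
            req = urllib.request.Request(
                MCP_ENDPOINT, data=body,
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return resp.read()
    return post(line.encode()).decode()

def serve() -> None:
    # Bridge: stdio on the client side, HTTP on the server side.
    for line in sys.stdin:
        if line.strip():
            print(forward(line), flush=True)
```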

We’ve taken the opportunity to add support for DeepSeek as a provider and support for functions in the form of HTTP calls. In this way, Otoroshi can exploit our FaaS, currently in private alpha, with different AI models.

How does it work in practice?

Want to find out more? Join us at 1:00 PM on our Twitch channel! We’re organising a live demonstration. You’ll be able to find out how to configure and use Otoroshi with LLM to create your own MCP clients or MCP servers in just a few minutes!

And don’t forget to come and meet us at the Open Source AI Summit on Wednesday 22 January, where several members of our team will be on hand.
