Create your own MCP client/server: as easy as 1-2-3 with Otoroshi
While Otoroshi with LLM already simplifies the management of your various AI providers, access to models and integration with your teams, we have now added simplified management of MCP clients and servers.

AI service providers are on the rise. But more and more, users aren’t just looking for generic answers of varying relevance: they want assistants that can retrieve context from different services and make those services work together.

MCP: Function Calling and beyond

For some time now, models have featured a capability known as Function Calling: when a particular pattern is identified in a request, the model can call a function, passing it parameters, to retrieve a response from a third-party service. It is from this idea that MCP (Model Context Protocol) was born.
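To make this concrete, here is what such a declaration can look like with an OpenAI-compatible chat API, written as a TypeScript object. The get_weather function and its parameters are purely illustrative.

```ts
// Purely illustrative: a hypothetical "get_weather" function declared to the
// model. The model never executes it; it replies with the function name and
// arguments to use, and the application performs the call and feeds the
// result back into the conversation.
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a given city",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string", description: "City name, e.g. Paris" },
        },
        required: ["city"],
      },
    },
  },
];
```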

Presented by Anthropic last November, MCP aims to standardize the way context is provided to large language models (LLMs). It consists of a server, which exposes functions and prompts as well as resources such as text, logs or images, and a client, which can list all these elements and use them. The protocol, which is open and still evolving, is based on JSON-RPC 2.0 and two transport modes: stdio and SSE (Server-Sent Events), but others can be used. If you want to know more, the technical specifications are detailed here.
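To give an idea of what travels over the wire, here is an illustrative tools/call exchange written as TypeScript objects. The tool name and arguments are made up; a real session starts with an initialize handshake and usually a tools/list call to discover what the server offers.

```ts
// Illustrative MCP exchange (JSON-RPC 2.0): the client asks the server to run
// a tool and gets the result back as content blocks.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "get_weather", arguments: { city: "Paris" } },
};

const response = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "Sunny, 12°C in Paris" }] },
};
```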

There are official implementations for Kotlin, Python and TypeScript, as well as a myriad of unofficial ones. In fact, there are already numerous lists of clients and servers, ranging from reading local files to querying databases or retrieving information from a GitHub account, for example. Besides integration with the desktop client for Claude (Anthropic’s chat service), several IDEs and extensions are also getting to grips with MCP: Cline, Continue, Zed, etc.
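For illustration, here is a minimal server sketch assuming a recent version of the official TypeScript SDK (@modelcontextprotocol/sdk) and zod for parameter schemas; the get_weather tool is, again, just an example.

```ts
// Minimal MCP server sketch over stdio, assuming a recent
// @modelcontextprotocol/sdk and zod (ESM project, Node 18+).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "demo-server", version: "1.0.0" });

// A hypothetical tool: returns a (fake) weather report for a given city.
server.tool("get_weather", { city: z.string() }, async ({ city }) => ({
  content: [{ type: "text", text: `Weather in ${city}: sunny, 12°C` }],
}));

// Expose the server over stdio so an MCP client can spawn it and talk to it.
await server.connect(new StdioServerTransport());
```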

In practice, creating an MCP server is not very complex, especially as the SDKs can help you and there are a growing number of guides. But how do you expose such servers to a multitude of models and then integrate them into your tools? That’s where things can get complicated.

Otoroshi simplifies access to Function Calling and MCP

So in December we started working with the Cloud APIM team to simplify the use of this protocol within Otoroshi with LLM, which we offer as an add-on that can be deployed in just a few clicks. It already natively supported Function Calling, enabling any model to be enriched with functions written in WebAssembly (WASM, via Extism) or JavaScript. This was an opportunity to take things a step further.

The new version deployed on Clever Cloud now manages:

  • MCP Endpoint: for exposing functions to MCP clients;
  • MCP Connectors: for adding functions from an MCP server.

The MCP servers created by Otoroshi can communicate with clients directly via SSE, but also via HTTP requests or WebSocket. It only takes a few minutes to create them. If your client only supports stdio, you can use the proxy developed by the Cloud APIM team to communicate with the server.
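On the client side, connecting to such an SSE endpoint can be sketched with the official TypeScript SDK, assuming the endpoint speaks the standard SSE transport; the URL below is a placeholder, not an actual Otoroshi address.

```ts
// Minimal MCP client sketch over SSE; the URL is a placeholder for your own
// MCP endpoint exposed by Otoroshi with LLM.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client(
  { name: "demo-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(
  new SSEClientTransport(new URL("https://my-endpoint.example.com/sse"))
);

// Discover the functions the server exposes; client.callTool() would then
// invoke one of them with its arguments.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```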

We’ve taken the opportunity to add support for DeepSeek as a provider, as well as support for functions in the form of HTTP calls. This allows Otoroshi to use our FaaS, currently in private alpha, with different AI models.

How does it work in practice?

Want to know more? Join us at 1:00 PM on our Twitch channel for a live demonstration. You’ll see how to configure and use Otoroshi with LLM to create your own MCP clients or servers in just a few minutes!

And don’t forget to come and meet us at the Open Source AI Summit on Wednesday 22 January, where several members of our team will be on hand.
