Otoroshi with LLM: simplify your API and AI service management on Clever Cloud

Your applications and services are evolving in an increasingly complex environment, requiring effective management of APIs and interactions with artificial intelligence models such as the very popular LLMs (Large Language Models).

To address these needs, we teamed up with Mathieu Ancelin, creator and core developer of Otoroshi, API Management expert and founder of Cloud APIM, to integrate Otoroshi with LLM on Clever Cloud. This add-on is now available in beta to all our users. Shipped with a ready-to-use configuration and an intuitive interface, it simplifies the implementation of advanced AI workflows and offers centralized management of your APIs, WAF rules, and more.

Providing AI services beyond the hype

At Clever Cloud, our mission is not limited to deploying applications and add-ons. We were founded on a desire to simplify developers’ lives and to provide them with services that extend beyond our turnkey deployment solution. Our main focus is to understand our customers’ daily needs, with a single obsession: finding the best way to meet them. In the emerging field of AI as elsewhere.

And while we can see how much artificial intelligence has already changed our businesses, and how quickly things are evolving, we have a duty not to simply jump on the latest fashionable feature, hopping from trend to trend to make headlines.

Over the last few months, we’ve been looking at how we can contribute to a challenge that seems essential to us: adapting AI to the specific contexts and needs of businesses, not for one service, but for an entire ecosystem.

That’s why we’ve worked to identify robust tools for your teams: centralized rights and billing management, integrations, observability, auditability, resilience, and much more. With Otoroshi with LLM, we offer much more than an API gateway: a turnkey service to manage your APIs and maximize the potential of your AI interactions.

And as usual, we don’t just provide a “1-click” open source deployment: our approach is to integrate professional tools that are truly adapted to everyday use, with real lifecycle management. Otoroshi with LLM, as an add-on in our Marketplace, takes full advantage of our platform: scalability, simplified updates, secure integrations and unified billing.

LLM extension: your single point of entry for artificial intelligence

With its LLM extension, Otoroshi becomes a central hub for all your interactions with large language models. By unifying your access to providers such as OpenAI, Anthropic, Hugging Face and Mistral, as well as OVHcloud and Scaleway solutions, it simplifies management and optimizes operations.

For example, you can register the tokens of these providers and expose them through a single URL for all your teams, or for one team in particular, selecting the models it has access to, with a dedicated token, its rights, usage limits and so on. You can also adapt the prompts sent to the models on the fly, to guarantee the quality of responses and give them context. You no longer need to manage this application by application; it’s handled directly in your API.
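
In practice, once a provider and a route are configured, your teams send their requests to a single endpoint exposed by the gateway. Here is a minimal sketch of such a call, assuming an OpenAI-compatible chat completion route; the path, model name and API key variable are placeholders to adapt to your own configuration:

# Hypothetical route and model name, to adapt to your own Otoroshi configuration
$ curl -X POST "https://xxxxxxxxxxxxxxxxxxxx-otoroshi.services.clever-cloud.com/v1/chat/completions" \
    -H "Authorization: Bearer $OTOROSHI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{
          "model": "openai/gpt-4o-mini",
          "messages": [
            {"role": "user", "content": "Summarize our incident report in two sentences."}
          ]
        }'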

In the event of heavy loads, the system intelligently distributes requests between several providers and can cache relevant results to accelerate response times. Finally, every interaction is logged and auditable, offering full traceability to optimize your workflows or meet compliance requirements.
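
If caching is enabled on a route, repeating an identical call is a simple way to observe it. The sketch below times the same request twice; the endpoint and key are the same placeholders as above, and the actual speed-up depends on your cache configuration:

# Hypothetical check: the second identical request should return noticeably faster
# if caching is enabled for this route.
$ time curl -s -X POST "https://xxxxxxxxxxxxxxxxxxxx-otoroshi.services.clever-cloud.com/v1/chat/completions" \
    -H "Authorization: Bearer $OTOROSHI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model": "openai/gpt-4o-mini", "messages": [{"role": "user", "content": "What is an API gateway?"}]}' > /dev/null
# Run the exact same command a second time and compare the elapsed time.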

Enhanced security: protect your data and models

Security concerns are at the core of the LLM extension. Thanks to its advanced access control mechanisms, you can prevent the leakage of sensitive information and protect your data through moderation mechanisms, for example. You can also define your own rules, whether simple or programmatic, to suit your needs.

Access authorizations are defined granularly, based on consumer metadata or query parameters. In addition, tools such as validation rules ensure that sensitive information is not included in queries.
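
As a rough illustration of what this means for consumers, a request whose prompt matches a deny rule is refused by the gateway before it ever reaches an LLM provider. The endpoint, key variable and exact error code below are assumptions that depend on your own rules:

# Hypothetical check: with a rule denying card-like numbers, the gateway should
# answer with an error instead of forwarding the prompt to a provider.
$ curl -s -o /dev/null -w "%{http_code}\n" \
    -X POST "https://xxxxxxxxxxxxxxxxxxxx-otoroshi.services.clever-cloud.com/v1/chat/completions" \
    -H "Authorization: Bearer $OTOROSHI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model": "openai/gpt-4o-mini", "messages": [{"role": "user", "content": "My card number is 4242 4242 4242 4242"}]}'
# Expected: a 4xx status code (the exact code depends on the rule you configured).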

Simplified deployment on Clever Cloud

With Clever Cloud, deploying Otoroshi with LLM is straightforward: you can do it from the Console, the API or Clever Tools. Once deployed, you have immediate access to a management URL and temporary credentials to customize your instance:

$ clever addon create otoroshi myOtoroshi
Add-on created successfully!
ID: addon_xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Real ID: otoroshi_xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Name: myOtoroshi

Your Otoroshi with LLM is starting:
 - Access it: https://xxxxxxxxxxxxxxxxxxxx-otoroshi.services.clever-cloud.com
 - Manage it: https://console.clever-cloud.com/addon_xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx

An initial account has been created, change the password at first login (Security -> Administrators -> Edit user):
 - Admin user name: xxxxxxxxxxxxxxxxxxxxxxxx
 - Temporary password: xxxxxxxxxxxxxxxxxxxxxxxx

/!\ The otoroshi provider is in beta testing phase
Learn more about Otoroshi with LLM on Clever Cloud: https://developers.clever-cloud.com/doc/addons/otoroshi/

The add-on relies on a pre-configured Java application and a Redis® database to manage your flows. In just a few minutes, you’ll be ready to harness the full power of Otoroshi and its LLM add-on.

Use case: unleash the power of your APIs

With Otoroshi with LLM, you can turn the management of your APIs and AI services into a real asset for your team. Imagine a centralized platform that lets you configure an AI agent to distribute access tokens in a secure and controlled way to your teams or customers. These tokens can be set up for specific uses, limiting access to certain features or resources according to your needs. This guarantees secure distribution tailored to the different people involved in your projects.

Then take advantage of moderation and prompt control tools. By ensuring that no sensitive or inappropriate data is transmitted in your queries, you not only protect your internal information, but also improve the quality and relevance of responses generated by AI models.

Leverage Otoroshi’s auditing capabilities: every interaction with your APIs is recorded in detail and tied to your account management/SSO, giving you a complete overview of usage. This transparency not only enables you to optimize your workflows, but also to meet any compliance requirements.

All this is available via a web interface, but also via the Otoroshi API, which is accessible and preconfigured with your add-on. It is also what will enable us to integrate this service directly into Clever Cloud.
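
For scripting or CI purposes, this means you can drive your instance with plain HTTP calls. The sketch below lists the configured routes; the /api/routes path and the client id/secret headers follow the upstream Otoroshi admin API conventions and should be checked against the documentation of your instance:

# Hedged sketch of an Otoroshi admin API call (paths and headers to verify in the docs)
$ curl "https://xxxxxxxxxxxxxxxxxxxx-otoroshi.services.clever-cloud.com/api/routes" \
    -H "Otoroshi-Client-Id: $OTOROSHI_CLIENT_ID" \
    -H "Otoroshi-Client-Secret: $OTOROSHI_CLIENT_SECRET" \
    -H "Accept: application/json"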

Otoroshi with LLM in action: join us on Twitch on December 18

Want to see Otoroshi with LLM in action? We’re hosting a live demonstration on our Twitch channel on December 18 at 1pm. Find out how to configure and use the add-on for your AI and API projects.

We’ll also be giving a sneak peek at what we’re up to with Cloud APIM around Otoroshi. Come and find out more about what we have cooked up for you in 2025.

A constantly evolving tool: your opinion counts!

Otoroshi with LLM perfectly illustrates our commitment to offering high-performance technological solutions, while integrating them into the heart of our ecosystem. Test it today, and get your teams ready to exploit the full potential of your APIs and AI services.

Currently in beta, Otoroshi with LLM is fully functional, but constantly being improved. Your feedback is essential to fine-tune the user experience and proposed functionalities. Join our GitHub community to share your suggestions and ask your questions.
