One API for all – Mozilla ends LLM chaos
Mozilla relieves developers: any-llm provides a central API for many LLMs in the background. A multi-tenant gateway manages budgets and keys.
(Image: Lightspring/Shutterstock.com)
With the Python package any-llm, Mozilla is releasing a unified API for many LLMs in version 1, which is already intended to be stable for production use. This relieves developers, who no longer have to maintain a separate adapter for each individual model.
The connected models can run in the cloud or locally, and switching between them via the asynchronous API is straightforward. To improve performance, client connections are reusable. The tool also provides standardized output for reasoning. Furthermore, any-llm notifies users when API modes or endpoints change.
An optional LLM gateway serves for budget and key management and is multi-tenant. Thus, it can serve as an LLM interface for companies.
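What per-tenant budget and key management in such a gateway amounts to can be sketched in a few lines. This is an illustrative model only, with invented class and method names, not the gateway's real interface.

```python
# Hypothetical sketch of multi-tenant budget/key bookkeeping; names are
# illustrative and do not reflect any-llm's gateway API.
class BudgetExceeded(Exception):
    pass

class Gateway:
    def __init__(self) -> None:
        self._tenants: dict[str, dict] = {}

    def register(self, tenant: str, api_key: str, budget_tokens: int) -> None:
        # Each tenant gets its own provider key and token budget.
        self._tenants[tenant] = {"key": api_key, "budget": budget_tokens, "used": 0}

    def charge(self, tenant: str, tokens: int) -> str:
        """Debit the tenant's budget and return the provider key to use."""
        t = self._tenants[tenant]
        if t["used"] + tokens > t["budget"]:
            raise BudgetExceeded(tenant)
        t["used"] += tokens
        return t["key"]

gw = Gateway()
gw.register("team-a", api_key="sk-team-a", budget_tokens=1000)
gw.charge("team-a", 400)  # within budget
gw.charge("team-a", 500)  # 900 of 1000 tokens used
try:
    gw.charge("team-a", 200)
except BudgetExceeded:
    print("team-a budget exhausted")
```

Centralizing keys and budgets this way is what makes the gateway usable as a company-wide LLM interface: individual teams never handle provider credentials directly.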
The list of connected providers on the GitHub page is already long and includes Anthropic, Azure, Databricks, Deepseek, Gemini, Groq, Hugging Face, Llama, Mistral, Ollama, Perplexity, Watsonx, and others. It also contains a table showing which functions any-llm supports for each provider: Response, Reasoning, Image, and so on.
Using the tool requires Python 3.11 and the respective API keys, which are stored in environment variables. For the next versions, Mozilla has planned a batch function as well as the integration of further LLMs and additional libraries such as the MCP daemon.
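The environment-variable convention for keys can be shown in a short sketch. The variable naming scheme (`<PROVIDER>_API_KEY`) is a common convention assumed here for illustration; the exact name each provider expects should be checked in the any-llm documentation.

```python
# Minimal sketch of reading a provider key from the environment.
# The naming scheme <PROVIDER>_API_KEY is an assumed convention.
import os

os.environ["OPENAI_API_KEY"] = "sk-example"  # normally exported in the shell

def load_key(provider: str) -> str:
    var = f"{provider.upper()}_API_KEY"
    key = os.environ.get(var)
    if key is None:
        raise RuntimeError(f"Set {var} before calling this provider")
    return key

print(load_key("openai"))
```

Keeping keys in the environment rather than in code means the same script runs unchanged across machines and CI setups.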
(who)