Berri AI
The fastest way to take your LLM app to production
Repositories
- litellm Public
  Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
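LiteLLM's core idea is a single OpenAI-style chat-completions call shape across all supported providers. A minimal sketch, assuming `pip install litellm` and a provider API key in the environment; the model names are illustrative, and the request-building helper is mine, not part of the SDK:

```python
def build_chat_request(model: str, prompt: str) -> dict:
    # OpenAI chat-completions format: a model name plus role/content messages.
    # LiteLLM accepts this same shape regardless of the backing provider.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(model: str, prompt: str) -> str:
    # Deferred third-party import: requires `pip install litellm` and an API key.
    from litellm import completion
    response = completion(**build_chat_request(model, prompt))
    return response.choices[0].message.content

if __name__ == "__main__":
    # Swapping providers is just a model-string change, e.g. an Anthropic model id.
    print(build_chat_request("gpt-4o-mini", "Say hello"))
```

Because the payload shape is provider-independent, switching from OpenAI to Bedrock or VertexAI is a one-line model change rather than a rewrite against a new SDK.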
- terraform-provider-litellm Public (forked from ncecere/terraform-provider-litellm)
  LiteLLM Terraform provider
- mock-oauth2-mcp-server Public
  A mock OAuth2 + MCP (Model Context Protocol) server for testing client_credentials flows. Useful for end-to-end testing of LiteLLM proxy MCP OAuth2 machine-to-machine (M2M) authentication.
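For context on the flow this mock server exercises: in the OAuth2 client_credentials grant, a machine client POSTs its id and secret to the token endpoint and gets back an access token. A stdlib-only sketch of building that token request; the endpoint URL and credentials are hypothetical placeholders, not values from this repo:

```python
import urllib.parse
import urllib.request

def build_token_request(token_url: str, client_id: str, client_secret: str) -> urllib.request.Request:
    # RFC 6749 section 4.4: grant_type=client_credentials, form-encoded POST body.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        token_url,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

if __name__ == "__main__":
    # Hypothetical local endpoint; in an E2E test this would point at the mock server.
    req = build_token_request("http://localhost:8080/token", "my-client", "my-secret")
    print(req.full_url, req.data)
```

In an end-to-end test, sending this request to the mock server and feeding the returned token to the LiteLLM proxy would exercise the M2M authentication path without touching a real identity provider.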
- litellm-observatory Public
  End-to-end testing suite for LiteLLM deployments: provider tests, performance metrics, and API validation
- litellm-pgvector Public
- fireworks-ai-cost-agent Public
- Automated_Perf_Tests Public