📄️ Getting Started Tutorial
End-to-end tutorial for getting started with LiteLLM Proxy
🗃️ Config.yaml
3 items
🗃️ Setup & Deployment
9 items
🔗 Demo LiteLLM Cloud
🗃️ Admin UI
11 items
🗃️ Architecture
9 items
🔗 All Endpoints (Swagger)
📄️ ✨ Enterprise Features
To get a license, get in touch with us here
🗃️ Authentication
8 items
🗃️ Budgets + Rate Limits
7 items
📄️ Caching
For OpenAI/Anthropic Prompt Caching, go here
🗃️ Create Custom Plugins
Modify requests, responses, and more
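Custom plugins hook into the proxy's request lifecycle. A minimal sketch, assuming LiteLLM's CustomLogger base class and its async_pre_call_hook (verify hook names and signatures against the Create Custom Plugins docs for your version):

```python
# Minimal sketch of a custom plugin; assumes litellm's CustomLogger base class
# and the async_pre_call_hook hook -- check your LiteLLM version's docs.
from litellm.integrations.custom_logger import CustomLogger


class MyCustomHandler(CustomLogger):
    async def async_pre_call_hook(self, user_api_key_dict, cache, data, call_type):
        # Example: modify the incoming request before it is sent to the model
        data.setdefault("messages", []).insert(
            0, {"role": "system", "content": "Keep answers concise."}
        )
        return data


# Instance referenced from config.yaml (e.g. under litellm_settings -> callbacks)
proxy_handler_instance = MyCustomHandler()
```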
📄️ LiteLLM Proxy CLI
The litellm-proxy CLI is a command-line tool for managing your LiteLLM proxy
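A quick illustration of using the CLI against a running proxy; the environment variable names and subcommand shown here are assumptions, so check the LiteLLM Proxy CLI docs for the full, current command list:

```bash
# Assumes a proxy is already running and reachable at the URL below
export LITELLM_PROXY_URL=http://localhost:4000
export LITELLM_PROXY_API_KEY=sk-...

# List the models configured on the proxy
litellm-proxy models list
```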
🔗 Load Balancing, Routing, Fallbacks
🗃️ Logging, Alerting, Metrics
5 items
🗃️ Making LLM Requests
6 items
🗃️ Model Access
4 items
🗃️ Secret Managers
10 items
🗃️ Spend Tracking
4 items