openwebui-llm-proxy
A premium open-source alternative to LiteLLM
🎯 Best for: Local LLM orchestration
What is openwebui-llm-proxy?
Replaces direct API calls to Anthropic with a local proxy, and integrates the Qdrant vector database to provide RAG capabilities for CLI agents.
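A minimal sketch of what "replacing direct API calls with a local proxy" looks like from the client side: a standard OpenAI-style chat request pointed at a local endpoint instead of the Anthropic API. The base URL, port, model name, and placeholder key below are assumptions for illustration, not values taken from the project.

```javascript
// Hypothetical default address for the local proxy (adjust to your config).
const PROXY_BASE_URL = "http://localhost:4000/v1";

// Build the request options for an OpenAI-compatible /chat/completions call.
function buildChatRequest(model, userMessage) {
  return {
    url: `${PROXY_BASE_URL}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Self-hosted proxies often accept any placeholder key.
        "Authorization": "Bearer sk-local",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// Example: route a prompt to a Claude model through the local proxy.
const req = buildChatRequest("claude-3-5-sonnet", "Summarize this repo.");
// fetch(req.url, req.options).then((r) => r.json()).then(console.log);
```

Because the endpoint is OpenAI-compatible, any existing OpenAI client should also work by overriding its base URL.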
Tech Stack
JavaScript · AI, ML & Data
Why openwebui-llm-proxy?
- OpenAI compatibility
- Qdrant RAG
- Multi-model support
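Multi-model support in a proxy like this typically means mapping an incoming model name to a backend provider. The sketch below is illustrative only; the backend names, URLs, and lack of fallback logic are assumptions, not the project's actual routing table.

```javascript
// Hypothetical routing table: model name -> backend provider.
const BACKENDS = {
  "claude-3-5-sonnet": { provider: "anthropic", url: "https://api.anthropic.com" },
  "gpt-4o": { provider: "openai", url: "https://api.openai.com" },
  "llama3": { provider: "ollama", url: "http://localhost:11434" },
};

// Resolve which backend should serve a request for a given model,
// failing loudly when no backend is configured.
function resolveBackend(model) {
  const backend = BACKENDS[model];
  if (!backend) {
    throw new Error(`No backend configured for model "${model}"`);
  }
  return backend;
}

// Example: a request for "llama3" would be routed to the local Ollama backend.
```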
Limitations
- Proxy latency
- Complex configuration
- Requires Docker
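Since the project requires Docker and pairs the proxy with Qdrant, a deployment might look like the compose sketch below. This is a hypothetical configuration: the service names, proxy port, and `QDRANT_URL` variable are assumptions; only `qdrant/qdrant` is the official Qdrant image.

```yaml
services:
  proxy:
    build: .            # build the proxy from this repository (assumed)
    ports:
      - "4000:4000"     # hypothetical proxy port
    environment:
      - QDRANT_URL=http://qdrant:6333
    depends_on:
      - qdrant
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"
```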
Last Update: 3/15/2026
Forks: 0
Issues: 0
License: MIT
Stop the "SaaS Tax"
Your team could be burning cash. Switching to openwebui-llm-proxy boosts your runway.
Competitor cost: $1,440 / year (est., based on LiteLLM, team of 10 users)
Self-hosted: $0 / year
Savings: 100%