What's New
New updates and improvements to Deanout
🎉 New (Experimental) Feature Alert: Self Hosted LLMs! 🚀
Today we added support for self-hosted LLMs served through Open WebUI. Because everything runs on your own hardware, this allows offline, telemetry-free LLM usage.
When paired with an editor extension (for example, in VS Code), this even enables a Copilot-style experience with a direct connection to your own server.
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI interface designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
- https://docs.openwebui.com/
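
Since Open WebUI speaks the OpenAI wire format, you can point an existing OpenAI client at your own server. Below is a minimal sketch using the openai Python package; the base URL, the /api path, the API key, and the model name are all assumptions that depend on how your instance is deployed, so adjust them for your setup.

```python
# Minimal sketch: chat with a self-hosted Open WebUI instance through its
# OpenAI-compatible API. Assumptions: the server runs on localhost:3000, the
# OpenAI-compatible routes live under /api, you have generated an API key in
# Open WebUI, and a model named "llama3" is available via your LLM runner.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api",  # your Open WebUI server, not api.openai.com
    api_key="sk-your-open-webui-key",      # key generated in Open WebUI settings
)

response = client.chat.completions.create(
    model="llama3",  # whatever model your runner (e.g. Ollama) serves
    messages=[{"role": "user", "content": "Say hello from my own hardware."}],
)

print(response.choices[0].message.content)
```

The same base URL and key should work in editor extensions that let you configure a custom OpenAI-compatible provider, which is what makes the Copilot-style VS Code setup described above possible.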