Use Your Self-Hosted LLM Anywhere with Ollama Web UI
Decoder
55,552 views

Published on Jan 29, 2024

Take your self-hosted Ollama models to the next level with Ollama Web UI, which wraps them in a polished interface with chat history, voice input, and user management. We'll also see how to reach this interface, and the models that power it, from your phone using the Ngrok tunneling tool.
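
If you want to follow along, here is a rough sketch of the container setup covered in the video. The image name, port mapping, and volume path follow the Ollama Web UI README as of early 2024 and may have changed since, so treat them as assumptions and check the repo linked below:

# Confirm Ollama is running on the host (it listens on port 11434 by default)
curl http://localhost:11434

# Start the Ollama Web UI container and point it at the host's Ollama instance
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v ollama-webui:/app/backend/data \
  --name ollama-webui \
  --restart always \
  ghcr.io/ollama-webui/ollama-webui:main

Once the container is up, the interface should be available at http://localhost:3000.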

Watch my other Ollama videos: • Get Started with Ollama

Links:
Code from the video - https://decoder.sh/videos/use-your-se...
Ollama - https://ollama.ai
Docker - https://docs.docker.com/engine/install/
Ollama Web UI - https://github.com/ollama-webui/ollam...
Ngrok - https://ngrok.com/docs/getting-started/

Timestamps:
00:00 - Is this free ChatGPT?
00:16 - Tools Needed
00:19 - Tools: Ollama
00:25 - Tools: Docker
00:38 - Tools: Ollama Web UI
00:55 - Tools: Ngrok
01:12 - Ollama status check
01:37 - Docker command walkthrough
04:20 - Starting the docker container
04:33 - Container status check
04:53 - Web UI Sign In
05:17 - Web UI Walkthrough
07:11 - Getting started with Ngrok
07:55 - Running Ngrok
08:29 - Ollama Web UI on our Phone!!
09:37 - Outro - What's Next?
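
For the Ngrok chapters above (07:11 - 08:29), the steps boil down to two commands. This is a sketch rather than the exact commands from the video: the authtoken is a placeholder you get from your Ngrok dashboard, and port 3000 assumes the Web UI mapping shown earlier.

# One-time setup: register your Ngrok authtoken (placeholder value)
ngrok config add-authtoken <YOUR_AUTHTOKEN>

# Expose the Web UI (running locally on port 3000) over a public HTTPS URL
ngrok http 3000

Ngrok prints a public forwarding URL that you can open on your phone to reach the Web UI from anywhere.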

Credits:
Wikimedia.org for the photo of Earth
