Private AI Models at Home

5 min read


AI is a powerful and valuable technology, and as it continues to evolve, privacy will undoubtedly become a significant tradeoff for many. Personally, I'm hesitant to let third parties ingest my most private thoughts, ideas, and conversations. However, I still find value in using AI to brainstorm, assist with writing and coding, learn, and explore concepts. Here's a look at my personal AI stack:

My Personal AI Stack

  • Interface: OpenWebUI - [https://docs.openwebui.com/] - This offers a feature-rich UI, robust open-source support, and a familiar interface similar to commercial products.
  • Models: Ollama - [https://ollama.com/search] - It provides access to numerous excellent open-source models. I currently utilize gemma3:4b, qwen3:8b, and deepseek-r1:8b.
  • Hosting: Docker - [https://www.docker.com/] - Used for hosting the application.
  • Networking: Tailscale - [https://tailscale.com/] - This provides a secure way to connect to your devices and use the application from anywhere, including your phone.
  • Hardware: I built a custom tower with a GPU optimized for machine learning workloads and for hosting my personal AI stack. While my setup is substantial, a Mac Mini - [https://www.apple.com/mac-mini/] - can effectively run one model with low latency.
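To make the pieces above concrete, here is a minimal sketch of how the stack can be wired together with Docker Compose. This is illustrative, not my exact configuration: the port mapping, volume names, and GPU section are assumptions you should adapt to your own hardware.

```yaml
# docker-compose.yml - illustrative sketch, not the author's exact setup.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # model weights stay on your own disk
    # Uncomment for NVIDIA GPU passthrough (requires nvidia-container-toolkit):
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: all
    #           capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # UI reachable at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data  # chats and settings, stored locally
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

After `docker compose up -d`, models can be pulled inside the Ollama container (for example, `docker exec -it <ollama-container> ollama pull gemma3:4b` for one of the models mentioned above), and the UI becomes available on the mapped port across your Tailscale network.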

Benefits

This setup allows you to host all your data directly on your own hardware, giving you complete control. My wife and I primarily use this stack for AI while we're at home. Despite the convenience of commercial products, I've established a clear boundary against sharing very personal information with them. If you're interested in learning more about this setup, please send me a note.

Open Source Community

This setup is only possible because of open source. Shout out to the whole community!

OpenWebUI, Ollama, Docker, and Tailscale


Written by Jhordan

AI/ML Engineer. Building tools and writing about the intersection of AI, military, and society.