nanochat Chat UI
Talk to your nanochat model through a familiar ChatGPT-like web interface. Vanilla HTML/CSS/JS—no React, no build step. The UI is minimal and hackable, so you can customize it to match your workflow.
Run the chat server
source .venv/bin/activate
python -m scripts.chat_web
Open the URL it prints (e.g. http://localhost:8000). On a cloud GPU node, replace localhost with the node's public IP, keeping the same port.
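If you'd rather not expose the node's public port at all, you can forward the server over SSH and browse it as if it were local. A minimal sketch, assuming your provider gives you SSH access; the username and IP are placeholders for your own:

ssh -L 8000:localhost:8000 user@NODE_PUBLIC_IP

With the tunnel open, http://localhost:8000 on your laptop talks to the remote server.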
Try it online
You can try nanochat without training anything yourself: nanochat-ai.com
What to try
- Ask for stories or poems—the model can generate creative text
- Ask "who am I?" to see a classic LLM hallucination
- "Why is the sky blue?" or "Why is it green?"—simple science Q&A
- Give it math problems or coding prompts—capability varies by model size
The speedrun model is trained with only ~4e19 FLOPs of compute, so talking to it is a bit like talking to a kindergartener: expect charm and occasional oddness.
Technical details
The frontend lives in nanochat/ui.html and the server in scripts/chat_web.py. The UI is pleasantly succinct vanilla JavaScript, with no bundler and no framework, so you can modify it directly to add features like system prompts, temperature controls, or custom styling.
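Before editing ui.html, it can help to poke the server directly and see what the page is talking to. A rough sketch with curl; the route and JSON fields here are assumptions for illustration only, so check scripts/chat_web.py for the endpoint and payload your version actually exposes:

curl -s -X POST http://localhost:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Why is the sky blue?"}], "temperature": 0.8}'

Whatever fields the server accepts there (temperature, sampling settings, a system message) are the same knobs you can surface as controls in the UI.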
Security note
When serving the chat UI, think about which interface you bind to. For local development, localhost is fine. On a cloud instance you may need to reach the server via the node's public IP, but don't leave it exposed to the open internet without authentication or a firewall, especially if you're running a shared or sensitive model.
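One simple way to limit exposure on a Linux cloud node is a firewall rule that only admits your own address to the chat port. A sketch using ufw, assuming an Ubuntu-style image with ufw installed; YOUR_IP is a placeholder for the address you connect from, and keep SSH allowed before enabling the firewall:

sudo ufw allow OpenSSH
sudo ufw default deny incoming
sudo ufw allow from YOUR_IP to any port 8000 proto tcp
sudo ufw enable

An SSH tunnel, as shown in the run section above, avoids opening the port at all and is often the simpler option.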