Ratatui TUI that scans your Claude Code project root, generates AI summaries in parallel via Ollama, and caches everything in SQLite. Zero startup cost on repeat launches.
$10 · one-time · up to 3 machines · MIT source
Fires two Ollama models in parallel via tokio::join! — a fast model for one-liner taglines, a larger model for 2-3 sentence technical summaries. Distributes across dual GPUs automatically.
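The fan-out can be sketched with plain threads (the app itself uses `tokio::join!`); the two functions below are hypothetical stand-ins for the two Ollama calls, not creatui's actual API:

```rust
use std::thread;

// Hypothetical stand-in for the fast tagline model.
fn fast_tagline(name: &str) -> String {
    format!("{name}: a terminal project browser")
}

// Hypothetical stand-in for the larger summary model.
fn detailed_summary(name: &str) -> String {
    format!("{name} scans a directory and caches AI summaries.")
}

fn main() {
    let name = "creatui";
    // Fire both "models" concurrently, then wait for both results.
    let tag = thread::spawn(move || fast_tagline(name));
    let sum = thread::spawn(move || detailed_summary(name));
    println!("{}", tag.join().unwrap());
    println!("{}", sum.join().unwrap());
}
```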
Cache keyed on (path, mtime). Second launch of 183 projects is instant. Touching a project dir invalidates its entry automatically.
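A minimal sketch of how a `(path, mtime)` key behaves: if the directory's modification time changes, the key changes and the stale cache entry no longer matches. The function name and return shape are illustrative, not creatui's actual internals:

```rust
use std::fs;
use std::path::Path;
use std::time::UNIX_EPOCH;

// Build a cache key from a path and its last-modified time (seconds since
// the epoch). Touching the directory bumps mtime, so the old key misses.
fn cache_key(path: &Path) -> std::io::Result<(String, u64)> {
    let mtime = fs::metadata(path)?
        .modified()?
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .as_secs();
    Ok((path.display().to_string(), mtime))
}

fn main() -> std::io::Result<()> {
    let key = cache_key(Path::new("."))?;
    println!("{key:?}");
    Ok(())
}
```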
Press / and type — filters by name, kind, tagline, and summary as you type. Escape to cancel, Enter to lock in.
Dive into any subdirectory with →, go back with ←. File-browser mode summarizes individual source files.
Classic, Dracula, Nord, Solarized, Matrix. Cycle with Ctrl+T. README and source files get syntax highlighting via the syntect crate.

Pipe-friendly: creatui -l prints a table, creatui --json emits JSON. Scriptable with -s rust for filtered output.
CRTUI-AB3CD-EF7GH-JK2LM-NP9QR
curl -sSL https://grug.ai/install.sh | sh -s -- CRTUI-YOUR-KEY-HERE
Works on Linux x86_64, Linux arm64, macOS x86_64, and macOS arm64.
creatui ~/Documents/claude_creations
or edit ~/.config/creatui/config.toml to set scan_dir
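A minimal `~/.config/creatui/config.toml` sketch; `scan_dir` and `max_parallel` are the keys mentioned on this page, and the path value is just an example:

```toml
# ~/.config/creatui/config.toml
scan_dir = "~/Documents/claude_creations"
max_parallel = 1   # recommended for CPU or single-GPU machines
```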
Requires Ollama running locally with at least one model.
Pull the defaults: ollama pull qwen3.5:4b && ollama pull ministral-3:8b
Prefer to build from source? cargo install --git https://github.com/danindiana/creatui — no license required (MIT).
Stripe checkout · card, Apple Pay, Google Pay · no account needed
cargo install --git https://github.com/danindiana/creatui is always free. The $10 is for the convenience of pre-built binaries and support.
Set max_parallel = 1 in the config for CPU or single-GPU machines to avoid overloading.
Default models: qwen3.5:4b (taglines) and ministral-3:8b (summaries). Both are configurable, and even a single small model works fine.
All inference runs against your local Ollama at localhost:11434. Nothing leaves your machine except the license validation ping to grug.ai on install.