creatui --version 0.1.0

navigate 183 AI projects
in under 50ms

Ratatui TUI that scans your Claude Code project root, generates AI summaries in parallel via Ollama, and caches everything in SQLite. Zero startup cost on repeat launches.

$10 · one-time · up to 3 machines · MIT source

creatui ~/Documents/claude_creations          [RTX 5080:6% 548/16303M  RTX 3080:4% 106/10240M]

04-28  creatui          [Rust]  AI-powered TUI navigator for Claude Code dirs
04-27  utilyze          [Rust]  GPU true-utilization TUI with CAP_SYS_ADMIN
04-12  kalshi-tui       [Rust]  Ratatui TUI for Kalshi BTC prediction markets
04-02  backup-wizard    [Sh  ]  Rsync + restic backup wizard with MOTD
03-27  ollama-delegate  [Sh  ]  Shell toolkit for delegating to local models
03-26  tapeworm         [Rust]  Shell command recorder with SQLite history
03-18  rpi4-hardening   [Doc ]  ⏳ summarizing…
03-11  kalshi-watcher   [Rust]  Real-time Kalshi event monitor with alerts

47/183   j/k:nav  Enter:detail  /:search  r:rescan  Ctrl+T:theme  ?:help  q:quit   AI:3/8

everything in one binary

🤖

dual-model AI summaries

Fires two Ollama models in parallel via tokio::join! — a fast model for one-liner taglines, a larger model for 2-3 sentence technical summaries. Distributes across dual GPUs automatically.
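The fan-out can be sketched as follows. This is a minimal synchronous analogue using std threads rather than the real async tokio::join! call, and tagline_model / summary_model are hypothetical stand-ins for the two Ollama HTTP requests:

```rust
use std::thread;

// Hypothetical stand-ins for the two Ollama calls; in creatui these are
// async HTTP requests to a local Ollama server, awaited together with
// tokio::join! so the fast and the large model run concurrently.
fn tagline_model(name: &str) -> String {
    format!("{name}: one-line tagline")
}

fn summary_model(name: &str) -> String {
    format!("{name}: 2-3 sentence technical summary")
}

// Synchronous analogue of the tokio::join! fan-out: start both "model
// calls", then wait for both results before returning.
fn summarize(name: &str) -> (String, String) {
    let (n1, n2) = (name.to_string(), name.to_string());
    let tagline = thread::spawn(move || tagline_model(&n1));
    let summary = thread::spawn(move || summary_model(&n2));
    (tagline.join().unwrap(), summary.join().unwrap())
}
```

The key property either way: total latency is roughly max(tagline, summary) rather than their sum.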

SQLite cache — zero re-work

Cache keyed on (path, mtime). Second launch of 183 projects is instant. Touching a project dir invalidates its entry automatically.
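The keying scheme can be sketched with an in-memory map standing in for creatui's SQLite table; SummaryCache and its methods are illustrative names, not the actual API:

```rust
use std::collections::HashMap;
use std::path::PathBuf;
use std::time::SystemTime;

// Illustrative in-memory stand-in for the SQLite cache. The key is
// (path, mtime): editing anything under a project bumps the directory's
// mtime, so the old entry simply stops matching — no explicit eviction.
#[derive(Default)]
struct SummaryCache {
    entries: HashMap<(PathBuf, SystemTime), String>,
}

impl SummaryCache {
    fn get(&self, path: &PathBuf, mtime: SystemTime) -> Option<&String> {
        self.entries.get(&(path.clone(), mtime))
    }

    fn insert(&mut self, path: PathBuf, mtime: SystemTime, summary: String) {
        self.entries.insert((path, mtime), summary);
    }
}
```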

🔍

incremental search

Press / and type — filters by name, kind, tagline, and summary as you type. Escape to cancel, Enter to lock in.
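The filter itself can be sketched like this; the Project field names are assumptions, and the real implementation may rank or highlight matches differently:

```rust
// Field names are assumptions; the real struct may differ.
struct Project {
    name: String,
    kind: String,
    tagline: String,
    summary: String,
}

// Case-insensitive substring match across every searchable field,
// re-run on each keystroke so the list narrows as you type.
fn matches(p: &Project, query: &str) -> bool {
    let q = query.to_lowercase();
    [&p.name, &p.kind, &p.tagline, &p.summary]
        .iter()
        .any(|field| field.to_lowercase().contains(q.as_str()))
}

fn filter<'a>(projects: &'a [Project], query: &str) -> Vec<&'a Project> {
    projects.iter().filter(|p| matches(p, query)).collect()
}
```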

📁

directory navigation

Dive into any subdirectory and back out again with single keystrokes. File-browser mode summarizes individual source files.

🎨

5 themes + syntax highlight

Classic, Dracula, Nord, Solarized, Matrix. Cycle with Ctrl+T. README and source files get syntax highlighting via Syntect.

📋

non-interactive mode

Pipe-friendly: creatui -l prints a table, creatui --json emits JSON. Scriptable with -s rust for filtered output.

one command after checkout

01

buy a license — you'll get a key like CRTUI-AB3CD-EF7GH-JK2LM-NP9QR

02

install with one curl command

curl -sSL https://grug.ai/install.sh | sh -s -- CRTUI-YOUR-KEY-HERE

works on Linux x86_64, Linux arm64, macOS x86_64, macOS arm64

03

set your project directory and run

creatui ~/Documents/claude_creations

or edit ~/.config/creatui/config.toml to set scan_dir
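A minimal config file would look like this; scan_dir is the documented option, the example path is yours to change:

```toml
# ~/.config/creatui/config.toml
scan_dir = "/home/you/Documents/claude_creations"
```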

Requires Ollama running locally with at least one model. Pull the defaults: ollama pull qwen3.5:4b && ollama pull ministral-3:8b

Prefer to build from source? cargo install --git https://github.com/danindiana/creatui — no license required (MIT).

works like vim, feels like home

j / ↓          next item / scroll
k / ↑          previous item / scroll
PgDn / PgUp    jump 10 items
g / G          top / bottom
Enter          open detail view
Esc            back / close
               dive into directory
               go up / back
/              search + filter
o              open in $EDITOR
s              cycle sort mode
y              yank path to clipboard
r              rescan directory
R              flush selected cache entry
Ctrl+D         cache DB panel
Ctrl+T         cycle theme (5 themes)
?              toggle help overlay
q / Ctrl+C     quit

dead simple

lifetime license $10 one-time
Pre-built binaries for Linux + macOS (both arches)
Install on up to 3 machines
All future releases included
MIT source — build yourself for free anytime
Email support at support@grug.ai
Instant delivery — key appears on checkout confirmation
buy now — $10

Stripe checkout · card, Apple Pay, Google Pay · no account needed

questions

do I need a license to build from source?
No. creatui is MIT licensed. cargo install --git https://github.com/danindiana/creatui is always free. The $10 is for the convenience of pre-built binaries and support.
does it work without a GPU?
Yes. It works with any Ollama setup — single GPU, CPU-only, or remote Ollama instance. Set max_parallel = 1 in config for CPU/single-GPU machines to avoid overloading.
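For example, in the config file from the install step (max_parallel is the option named above):

```toml
# ~/.config/creatui/config.toml
max_parallel = 1   # serialize AI calls on CPU-only or single-GPU machines
```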
what if I lose my license key?
Email support@grug.ai with your Stripe receipt email. We'll resend it.
which Ollama models does it need?
Any two models — defaults are qwen3.5:4b (taglines) and ministral-3:8b (summaries). Both are configurable. Even a single small model works fine.
does it send my code anywhere?
No. All AI calls go to your local Ollama instance at localhost:11434. Nothing leaves your machine except the license validation ping to grug.ai on install.