How It Works
A high-fidelity pipeline for technical intelligence. Termiflow bridges the gap between raw web data and terminal-native execution.
[ ARCHITECTURE_OVERVIEW ]
REF: DIAGRAM_V2.0_STABLE

+-----------------+       +-----------------------+
| YOUR_TERMINAL   | ----> | TERMIFLOW_CLOUD       |
| (termiflow      | <---- |                       |
|  CLI)           |       | AI curation, search,  |
+-----------------+       | scoring, summaries    |
    tf_xxx key            +-----------------------+
30-Second Setup
Run termiflow config init, paste your tf_xxx key, done. No provider keys to manage.
Managed AI
We handle the LLM scoring, web crawling, and summarization. You get curated results without managing API keys or costs.
Synced State
Subscriptions and feed state sync across devices. Local SQLite cache for offline reading.
+-----------------+       +-----------------------+       +-------------------+
| YOUR_TERMINAL   | ----> | TERMIFLOW_ENGINE      | ----> | YOUR_LLM_KEYS     |
| (termiflow      | <---- | (local Go library)    | <---- | (Anthropic,       |
|  CLI)           |       |                       |       |  OpenAI, Ollama)  |
+-----------------+       +-----------+-----------+       +-------------------+
                                      |
                                      v
                          +-----------------------+
                          | YOUR_SEARCH_KEYS      |
                          | (Tavily, Google,      |
                          |  Bing, Brave)         |
                          +-----------------------+
Full Sovereignty
All data stays on your machine. Local SQLite storage, no server calls. Fully offline-capable after initial setup.
Bring Your Own
Use your own Anthropic, OpenAI, or local Ollama keys. Pair with Tavily, Google, Bing, or Brave for search.
Open Engine
termiflow-engine is a reusable Go library for search, scoring, filtering, and summarization. MIT licensed.
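As a library, the engine's surface might look like the sketch below. Every identifier here (the `Source` struct, the `Searcher` and `Scorer` interfaces, the stub types) is an illustrative assumption, not the published termiflow-engine API; the stubs stand in for the real search and LLM stages so the sketch runs offline.

```go
package main

import "fmt"

// Source is a hypothetical record for one crawled result.
type Source struct {
	URL   string
	Score float64
}

// Searcher and Scorer mirror the first two pipeline stages.
// These interface names are assumptions, not the real API.
type Searcher interface {
	Search(query string) []Source
}

type Scorer interface {
	Score(sources []Source) []Source
}

// stubSearcher returns canned results in place of a live crawl.
type stubSearcher struct{}

func (stubSearcher) Search(query string) []Source {
	return []Source{
		{URL: "https://example.com/docs"},
		{URL: "https://example.com/thread"},
	}
}

// stubScorer assigns placeholder scores in place of an LLM call.
type stubScorer struct{}

func (stubScorer) Score(in []Source) []Source {
	for i := range in {
		in[i].Score = 0.5 + 0.4*float64(i)
	}
	return in
}

func main() {
	var search Searcher = stubSearcher{}
	var score Scorer = stubScorer{}
	results := score.Score(search.Search("go generics"))
	for _, r := range results {
		fmt.Printf("%s -> %.2f\n", r.URL, r.Score)
	}
}
```

Small stage interfaces like these are what make the engine reusable: a caller can swap any stage (a different search backend, a local Ollama scorer) without touching the rest of the pipeline.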
[ THE_PIPELINE ]
Search (Tavily)
Deep-crawls developer documentation, GitHub repos, and Stack Overflow threads.
> [SCANNING] 452 sources identified...
Score (Claude)
An LLM evaluates the relevance of each source against your technical profile.
> Result: Source_A(0.98), Source_B(0.42)...
Filter
Removes duplicates, outdated documentation, and low-confidence snippets.
> Filtering complete. 12 valid entries retained.
Summarize (Claude)
Distills multi-source data into actionable terminal commands or technical summaries.
> Generating markdown buffer... [OK]
Deliver
Final output streamed to your stdout or stored in your local SQLite search cache.
> [DONE] 8 items ready — streaming to stdout
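The filter stage's pruning can be sketched as a plain function: drop duplicate URLs and anything scoring below a confidence threshold. The field names and the 0.60 threshold below are illustrative assumptions, not values from the real engine.

```go
package main

import "fmt"

// scored pairs a crawled source with its LLM relevance score.
type scored struct {
	URL   string
	Score float64
}

// filterSources removes duplicate URLs and entries scoring below
// minScore, preserving the original order of the survivors.
func filterSources(in []scored, minScore float64) []scored {
	seen := make(map[string]bool)
	var out []scored
	for _, s := range in {
		if s.Score < minScore || seen[s.URL] {
			continue
		}
		seen[s.URL] = true
		out = append(out, s)
	}
	return out
}

func main() {
	sources := []scored{
		{"https://pkg.go.dev/a", 0.98},
		{"https://pkg.go.dev/a", 0.98}, // duplicate URL
		{"https://example.com/old", 0.42},
		{"https://example.com/fresh", 0.81},
	}
	kept := filterSources(sources, 0.60)
	fmt.Printf("%d valid entries retained\n", len(kept)) // 2
}
```

Deduplicating before summarization keeps the LLM context small, which is what makes the final stage cheap and fast.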
[ EDITIONS_COMPARISON ]
| PARAMETER | MANAGED_CLI | SELF_HOSTED_CLI |
|---|---|---|
| SETUP | Zero-Config (Auth Hook) | Config.yaml Definition |
| API_KEYS | Handled by Cloud Proxy | User-Supplied Keys |
| LLM_OPTIONS | Managed (latest Claude) | Anthropic, OpenAI, Ollama (local) |
| SEARCH | Optimized Tavily Engine | Google, Bing, Brave, Tavily |
| DATA | Encrypted Cache (7 Days) | Persistent Local SQLite |
| COST | $12 / Month (All APIs Included) | Free (Pay Your Providers) |
| SOURCE | Same CLI (managed config) | MIT Licensed Go Repo |
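For the self-hosted edition, the Config.yaml the table refers to might look roughly like this. Every key name below is an illustrative guess at a schema, not documented configuration; only the provider names and the SQLite path come from this page.

```yaml
# Hypothetical self-hosted Config.yaml -- key names are illustrative,
# not the documented termiflow schema.
llm:
  provider: anthropic          # anthropic | openai | ollama
  api_key: ${ANTHROPIC_API_KEY}
search:
  provider: tavily             # tavily | google | bing | brave
  api_key: ${TAVILY_API_KEY}
storage:
  path: ~/.termiflow/vault.db  # persistent local SQLite
```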
[ PRIVACY_MANIFESTO ]
Local-First Storage
Your search history and context never touch our servers. Everything is stored in an encrypted local SQLite database: ~/.termiflow/vault.db
Transparent Proxy
We act strictly as a pass-through proxy to LLM providers. We do not train models on your queries or retain metadata for longer than 24 hours.
Opt-In Telemetry
Zero tracking by default. Metrics are collected only if you explicitly enable them in your termiflow.toml.
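Opting in might look like the fragment below, under the assumption that the setting lives in a telemetry table; the table and key names are a guess, not the documented termiflow.toml schema.

```toml
# termiflow.toml -- telemetry stays off unless you opt in.
# Table and key names here are illustrative, not the documented schema.
[telemetry]
metrics = true
```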
Build The Future Interface
Contribute to our core engine and community plugins.