---
name: openflow
description: Executes AI tasks with strict resource limits and progress tracking. Use when the user needs to search the web, explore large codebases, or check dependencies. Prevents runaway execution by enforcing timeouts and limiting searches, file reads, and recursion depth.
allowed-tools: Bash(openflow:*)
---

# Controlled AI Task Execution with OpenFlow

The `openflow` command executes AI tasks with built-in resource limits and real-time progress tracking. A structured workflow prevents runaway execution by enforcing timeouts and limiting operations.

## Prerequisites

```bash
openflow --version  # Verify installation
```

For setup details, see https://github.com/Done-0/openflow#readme

## Core Workflow

1. **List flows**: `openflow list` — see available workflows
2. **Inspect**: `openflow info <flow>` — view flow details and limits
3. **Execute**: `openflow run <flow> --input '{"key": "value"}'` — run with parameters
4. **Monitor**: progress is displayed in real time with resource tracking
5. **Complete**: automatic limit enforcement prevents runaway execution

## Available Flows

```bash
openflow list                      # List all flows
openflow list --category research  # Filter by category
```

### search_web

Search the web with strict limits to prevent runaway searches. Uses the DuckDuckGo Instant Answer API (free) or the Brave Search API (set the `BRAVE_API_KEY` environment variable for better results).

```bash
openflow run search_web --input '{"topic": "AI trends"}'
```

**Use for:** Research, competitive analysis, gathering information from the web
**Limits:** 60s max, 10 results hard limit, 12k tokens
**Prevents:** Runaway searches (real case: 234 searches, 45 minutes wasted)

### explore_codebase

Explore codebases with file limits to prevent reading entire projects.
```bash
openflow run explore_codebase --input '{"query": "authentication logic", "path": "."}'
```

**Use for:** Understanding large codebases, finding relevant code, navigating projects
**Limits:** 45s max, 20 files maximum, 15k tokens
**Prevents:** Reading the entire project and exhausting tokens

### check_dependencies

Check dependencies with depth limits to prevent recursive analysis. Supports npm, pip, poetry, cargo, go, and bundler.

```bash
openflow run check_dependencies --input '{"path": "."}'
```

**Use for:** Dependency audits, version conflict detection, understanding project dependencies
**Limits:** 30s max, 50 dependencies maximum, 8k tokens
**Prevents:** Recursively analyzing the entire dependency tree

## Commands

```bash
# Flow Management
openflow list                        # List available flows
openflow list --category <category>  # Filter by category (research, codebase, dependencies, testing)
openflow info <flow>                 # Show flow details, steps, and limits

# Execution
openflow run <flow> --input '{"key": "value"}'        # Execute flow with input
openflow run <flow> --input '...' --max-time 60       # Custom time limit
openflow run <flow> --input '...' --max-calls 30      # Custom tool call limit
openflow run <flow> --input '...' --max-tokens 15000  # Custom token limit

# Information
openflow --version  # Show version
openflow --help     # Show help
```

## Resource Limits

Each flow has default limits that can be customized:

| Option | Description | Default |
|--------|-------------|---------|
| `--max-time` | Maximum execution time (seconds) | Flow-specific |
| `--max-calls` | Maximum tool calls allowed | Flow-specific |
| `--max-tokens` | Maximum token usage | Flow-specific |

```bash
# Example with custom limits
openflow run search_web \
  --input '{"topic": "AI trends"}' \
  --max-time 60 \
  --max-calls 10 \
  --max-tokens 15000
```

## Output Format

OpenFlow displays progress in the terminal:

```
[OpenFlow] Execution Plan
━━━━━━━━━━━━━━━━━━━━━━━━
Steps:
  1. Search information (est. 10s)
  2. Analyze sources (est. 15s)
  3. Compare findings (est. 12s)
  4. Generate report (est. 8s)

Estimated time: 45s
Resource limits: max 10 searches, 60s timeout
━━━━━━━━━━━━━━━━━━━━━━━━

[OpenFlow] Progress
━━━━━━━━━━━━━━━━━━━━━━━━
✓ Search information   10.2s (used 10/10 searches)
✓ Analyze sources      12.1s
⟳ Compare findings      9.8s [running...]
○ Generate report

Progress: ████████████░░░░░░░░ 60%
Elapsed: 32.1s / Est. 45s | Searches: 10/10
━━━━━━━━━━━━━━━━━━━━━━━━

[OpenFlow] Completed
━━━━━━━━━━━━━━━━━━━━━━━━
✓ All steps completed

Execution stats:
  Time: 39.4s (5.6s faster than estimated)
  Searches: 10/10 (100%)
  Token usage: 8.2k/12k (68%)

✓ Task completed within resource limits
✓ Prevented runaway execution
━━━━━━━━━━━━━━━━━━━━━━━━
```

## Common Workflows

### Search the Web

```bash
openflow run search_web --input '{"topic": "competitor analysis"}'
```

### Explore a Large Codebase

```bash
openflow run explore_codebase --input '{"query": "payment processing"}'
```

### Check Dependencies

```bash
openflow run check_dependencies --input '{"file": "package.json"}'
```

### Generate Tests

```bash
openflow run generate_tests --input '{"file": "src/api.py"}'
```

## Command Chaining

Commands can be chained with `&&`:

```bash
openflow list && openflow info search_web
openflow run search_web --input '{"topic": "AI trends"}' && openflow list
```

## Tips

1. **Use for high-risk tasks** — OpenFlow prevents runaway execution
2. **Check flow details first** — `openflow info <flow>` shows steps and limits
3. **Set custom limits** — Adjust `--max-time`, `--max-calls`, and `--max-tokens` as needed
4. **Monitor progress** — Real-time updates show what's happening
5. **Automatic cleanup** — OpenFlow executes synchronously and cleans up automatically

## Troubleshooting

- **Flow not found?** Run `openflow list` to see available flows
- **Input format error?** Use single quotes around the JSON: `--input '{"key": "value"}'`
- **Run diagnostics:** `openflow --version` and check the installation

## Cleanup

OpenFlow executes synchronously and cleans up automatically. No manual cleanup is needed.
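## Building `--input` JSON Safely

The Troubleshooting section recommends wrapping the `--input` JSON in single quotes. When a value itself contains quotes or other shell metacharacters, hand-quoting becomes error-prone; generating the JSON programmatically sidesteps the problem. A minimal sketch, assuming `python3` is on the PATH (the topic string is purely illustrative):

```shell
# Encode the input as JSON instead of hand-quoting it.
topic='AI "agents" & tooling'
input=$(python3 -c 'import json, sys; print(json.dumps({"topic": sys.argv[1]}))' "$topic")
echo "$input"   # {"topic": "AI \"agents\" & tooling"}

# Then pass it through unchanged:
# openflow run search_web --input "$input"
```

Because `json.dumps` handles all escaping, the same pattern works for any value, including paths and queries with embedded quotes.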