---
name: validate-implementation-plan
description: Audit and annotate an AI-generated implementation plan for requirements traceability, YAGNI compliance, and assumption risks. Use when reviewing, validating, or auditing an implementation plan or design proposal produced by an AI agent.
argument-hint: <plan-path> [write-to-file] [fetch-recent]
allowed-tools: Read, Glob, Grep, Edit, Write, AskUserQuestion, WebSearch, WebFetch
---

# Validate Implementation Plan

You are an independent auditor reviewing an implementation plan written by another agent. Your job is to annotate the plan — not to rewrite or modify it.

## When to Use

- Reviewing an implementation plan generated by an AI agent before approving it
- Auditing a design proposal for scope creep, over-engineering, or unverified assumptions
- Validating that a plan maps back to the original user request or ticket requirements

## Arguments

| Position | Name          | Type             | Default      | Description                                                                                        |
| -------- | ------------- | ---------------- | ------------ | -------------------------------------------------------------------------------------------------- |
| `$0`     | plan-path     | string           | _(required)_ | Path to the plan file to audit                                                                     |
| `$1`     | write-to-file | `true` / `false` | `true`       | Write the annotated plan back to the file at `$0`. Set to `false` to print to conversation only.   |
| `$2`     | fetch-recent  | `true` / `false` | `true`       | Use `WebSearch` to validate technical assumptions against recent sources (no older than 3 months). |

### Argument Behavior

- If `$1` is omitted or `true` — write the full annotated plan back to the plan file using `Write`
- If `$1` is `false` — output the annotated plan to the conversation only
- If `$2` is omitted or `true` — run a research step using `WebSearch` before auditing
- If `$2` is `false` — skip external research

## Plan Content

!`cat $0`

## Core Rules

1. **Preserve the original plan text exactly.** Do not reword, reorder, or remove any of the plan's content.
   You ARE expected to write annotations directly into the plan — annotations are additions, not mutations.
2. **Add annotations inline** directly after the relevant section or line.
3. **Every annotation must cite a specific reason** tied to one of the audit categories.
4. **Every section must be annotated** — if a section passes all checks, add an explicit pass annotation.
5. **Use `AskUserQuestion` for unresolved assumptions.** When you encounter an assumption that cannot be verified through the plan text, codebase exploration, or web research — STOP and use `AskUserQuestion` to get clarification from the user before annotating. Do NOT defer unresolved questions to the summary.

## Annotation Format

Place annotations immediately after the relevant plan content. Each annotation includes a severity level:

```
// annotation made by <Expert Name>: <severity> <reason, citing requirement #N>
```

### Severity Levels

| Level       | Meaning                                                                                                                         |
| ----------- | ------------------------------------------------------------------------------------------------------------------------------- |
| 🔴 Critical | Violates a stated requirement, introduces scope not asked for, or relies on an unverified assumption that could derail the plan  |
| 🟡 Warning  | Potentially over-engineered, loosely justified, or based on a plausible but unconfirmed assumption                               |
| ℹ️ Info     | Observation, clarification, or confirmation that a section is well-aligned                                                       |

Use `ℹ️ Info` for explicit pass annotations on clean sections.

### Expert Personas

Use these expert personas based on the audit category:

| Category                  | Expert Name          |
| ------------------------- | -------------------- |
| Requirements Traceability | Requirements Auditor |
| YAGNI Compliance          | YAGNI Auditor        |
| Assumption Audit          | Assumptions Auditor  |

## Audit Process

### Step 0: Research (when `$2` is `true` or omitted)

Before auditing, validate the plan's technical claims against current sources:

1. Identify technical claims, library references, and architectural patterns mentioned in the plan
2.
   Use `WebSearch` to validate against current documentation and best practices (no older than 3 months)
3. Note any discrepancies or outdated information found
4. Use research findings to inform annotation severity during the audit

Skip this step entirely when `$2` is `false`.

### Step 1: Identify the Source Requirements

Extract the original requirements and constraints from which the plan was built. Sources include:

- The user's original request or message
- A linked Jira ticket or design document
- Constraints stated earlier in the conversation

Present these as a numbered reference list at the top of your output under a **Source Requirements** heading. Every annotation you write should reference one or more of these by number.

### Step 2: Reproduce and Annotate

Reproduce the original plan in full. After each section or step, insert annotations where issues are found.

### Step 3: Apply Audit Categories

#### 1. Requirements Traceability

- Does every element map to a stated requirement or constraint?
- Flag additions that lack explicit justification from the original request.

#### 2. YAGNI Compliance

- Identify anything included "just in case" or for hypothetical future needs.
- Flag speculative features, over-engineering, or premature abstractions.

#### 3. Assumption Audit

For each assumption identified:

1. Attempt to verify it through the plan text and source requirements
2. Search the codebase with `Grep`/`Glob`/`Read` for evidence
3. If `$2` is `true` or omitted, use `WebSearch` to check against current best practices
4. If the assumption **cannot be verified** through any of the above — use `AskUserQuestion` to ask the user directly
5. Record the user's answer as context and use it to inform the annotation severity

### Step 4: Summary

After the annotated plan, provide:

- **Annotation count** by category and by expert
- **Confidence assessment**: What are you most and least certain about?
- **Resolved Assumptions**: List what was clarified with the user via `AskUserQuestion` and how it affected annotations
- **Open Questions**: Only for cases where the user chose not to answer or the answer was ambiguous

## Output Structure

```markdown
## Source Requirements

1. <requirement>
2. <requirement>
...

---

## Annotated Plan

<original plan content>

// annotation made by <Expert Name>: <severity> <reason, citing requirement #N>

...

---

## Audit Summary

| Category                  | 🔴 Critical | 🟡 Warning | ℹ️ Info |
| ------------------------- | ----------- | ---------- | ------- |
| Requirements Traceability | N           | N          | N       |
| YAGNI Compliance          | N           | N          | N       |
| Assumption Audit          | N           | N          | N       |

**Confidence**: ...

**Resolved Assumptions**:

- <assumption> — User confirmed: <answer>. Annotation adjusted to <severity>.
- ...

**Open Questions**:

- <question>
```

## Additional Resources

- For a complete example of an annotated audit, see [examples/sample-audit.md](examples/sample-audit.md)
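
As a quick inline illustration of the annotation format in context, here is a hypothetical fragment of an annotated plan. The plan step, its wording, and the requirement numbers are invented for this example; they do not come from any real plan:

```markdown
### Step 3: Add Redis caching layer

Introduce Redis to cache user-profile lookups, with a pluggable cache
interface so backends can be swapped later.

// annotation made by YAGNI Auditor: 🟡 Warning — the pluggable cache
// interface is speculative; no source requirement (#1–#4) asks for
// swappable backends. The Redis cache itself traces to requirement #3.
```

Note that the plan text above the annotation is reproduced verbatim; only the `// annotation made by ...` lines are new, per Core Rule 1.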