diff --git a/prompts/generate-spec.md b/prompts/generate-spec.md index 3418e76..43ab91d 100644 --- a/prompts/generate-spec.md +++ b/prompts/generate-spec.md @@ -29,14 +29,14 @@ This spec serves as the **planning blueprint** for the entire SDD workflow: **Critical Dependencies:** -- **User Stories** become the basis for task demo criteria +- **User Stories** become the basis for proof artifacts in task generation - **Functional Requirements** drive implementation task breakdown - **Technical Considerations** inform architecture and dependency decisions - **Demoable Units** become parent task boundaries in task generation **What Breaks the Chain:** -- Vague user stories → unclear demo criteria and task boundaries +- Vague user stories → unclear proof artifacts and task boundaries - Missing functional requirements → gaps in implementation coverage - Inadequate technical considerations → architectural conflicts during implementation - Oversized specs → unmanageable task breakdown and loss of incremental progress @@ -51,24 +51,54 @@ To create a comprehensive Specification (Spec) based on an initial user input. T If the user did not include an initial input or reference for the spec, ask the user to provide this input before proceeding. -## Process Overview +## Spec Generation Overview -Follow this exact sequence: +1. **Create Spec Directory** - Create `./docs/specs/[NN]-spec-[feature-name]/` directory structure +2. **Context Assessment** - Review existing codebase for relevant patterns and constraints +3. **Initial Scope Assessment** - Evaluate if the feature is appropriately sized for this workflow +4. **Clarifying Questions** - Gather detailed requirements through structured inquiry +5. **Spec Generation** - Create the detailed specification document +6. **Review and Refine** - Validate completeness and clarity with the user -1. **Initial Scope Assessment** - Evaluate if the feature is appropriately sized for this workflow -2. **Clarifying Questions** - Gather detailed requirements through structured inquiry -3. **Context Assessment** - Review existing codebase for relevant patterns and constraints (optional) -4. **Spec Generation** - Create the detailed specification document -5. **Review and Refine** - Validate completeness and clarity with the user +## Step 1: Create Spec Directory -## Step 1: Initial Scope Assessment +Create the spec directory structure before proceeding with any other steps. This ensures all files (questions, spec, tasks, proofs) have a consistent location. -Before asking questions, evaluate whether this feature request is appropriately sized for this spec-driven workflow. +**Directory Structure:** + +- **Path**: `./docs/specs/[NN]-spec-[feature-name]/` where `[NN]` is a zero-padded 2-digit sequence number (e.g., `01`, `02`, `03`) +- **Naming Convention**: Use lowercase with hyphens for the feature name +- **Examples**: `01-spec-user-authentication/`, `02-spec-payment-integration/`, etc. + +**Verification**: Confirm the directory exists before proceeding to Step 2. 
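+
+For illustration only (a sketch, not a required command sequence): assuming the feature is "user authentication" and no earlier specs exist, so the next sequence number is `01`, the directory could be created and verified from the repository root like this:
+
+```bash
+# Hypothetical example: first spec in the repository, feature "user authentication".
+mkdir -p ./docs/specs/01-spec-user-authentication
+
+# Confirm the directory exists before moving on to Step 2.
+ls -d ./docs/specs/01-spec-user-authentication
+```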
+ +## Step 2: Context Assessment + +If working in a pre-existing project, begin by briefly reviewing the codebase and existing docs to understand: + +- Current architecture patterns and conventions +- Relevant existing components or features +- Integration constraints or dependencies +- Files that might need modification or extension +- **Repository Standards and Patterns**: Identify existing coding standards, architectural patterns, and development practices from: + - Project documentation (README.md, CONTRIBUTING.md, docs/) + - AI specific documentation (AGENTS.md, CLAUDE.md) + - Configuration files (package.json, Cargo.toml, pyproject.toml, etc.) + - Existing code structure and naming conventions + - Testing patterns and quality assurance practices + - Commit message conventions and development workflows + +**Use this context to inform scope validation and requirements, not to drive technical decisions.** Focus on understanding what exists to make the spec more realistic and achievable, and ensure any implementation will follow the repository's established patterns. + +## Step 3: Initial Scope Assessment + +Evaluate whether this feature request is appropriately sized for this spec-driven workflow. **Chain-of-thought reasoning:** - Consider the complexity and scope of the requested feature - Compare against the following examples +- Use context from Step 2 to inform the assessment - If scope is too large, suggest breaking into smaller specs - If scope is too small, suggest direct implementation without formal spec @@ -102,13 +132,16 @@ Before asking questions, evaluate whether this feature request is appropriately - Creating a single database migration with rollback capability - Implementing one user story with complete end-to-end flow -If the scope appears inappropriate, inform the user and suggest alternatives before proceeding. +### Report Scope Assessment To User -## Step 2: Clarifying Questions +- **ALWAYS** inform the user of the result of the scope assessment. +- If the scope appears inappropriate, **ALWAYS** pause the conversation to suggest alternatives and get input from the user. -Ask clarifying questions to gather sufficient detail. **Always provide numbered or lettered options** to allow the user to make selections easily by responding with *"1A, 2B, 3C"*, etc. Focus on understanding the "what" and "why" rather than the "how." +## Step 4: Clarifying Questions -Adapt your questions based on the user's input. Use the following common areas to guide your questions: +Ask clarifying questions to gather sufficient detail. Focus on understanding the "what" and "why" rather than the "how." + +Use the following common areas to guide your questions: **Core Understanding:** @@ -126,31 +159,61 @@ Adapt your questions based on the user's input. Use the following common areas t - Any existing design mockups or UI guidelines to follow? - Are there any technical constraints or integration requirements? -**Demo & Proof:** +**Proof Artifacts:** -- How will we demonstrate this feature works? -- What proof artifacts will we need (URLs, CLI output, screenshots)? +- What proof artifacts will demonstrate this feature works (URLs, CLI output, screenshots)? +- What will each artifact demonstrate about the feature? **Progressive Disclosure:** Start with Core Understanding, then expand based on feature complexity and user responses. 
-## Step 3: Context Assessment (Optional) +### Questions File Format -If the feature involves existing systems, briefly review the codebase and existing docs to understand: +Follow this format exactly when you create the questions file. -- Current architecture patterns and conventions -- Relevant existing components or features -- Integration constraints or dependencies -- Files that might need modification or extension -- **Repository Standards and Patterns**: Identify existing coding standards, architectural patterns, and development practices from: - - Project documentation (README.md, CONTRIBUTING.md, docs/) - - Configuration files (package.json, Cargo.toml, pyproject.toml, etc.) - - Existing code structure and naming conventions - - Testing patterns and quality assurance practices - - Commit message conventions and development workflows +```markdown +# [NN] Questions Round 1 - [Feature Name] -**Use this context to inform scope validation and requirements, not to drive technical decisions.** Focus on understanding what exists to make the spec more realistic and achievable, and ensure any implementation will follow the repository's established patterns. +Please answer each question below (select one or more options, or add your own notes). Feel free to add additional context under any question. -## Step 4: Spec Generation +## 1. [Question Category/Topic] + +[What specific aspect of the feature needs clarification?] + +- [ ] (A) [Option description explaining what this choice means] +- [ ] (B) [Option description explaining what this choice means] +- [ ] (C) [Option description explaining what this choice means] +- [ ] (D) [Option description explaining what this choice means] +- [ ] (E) Other (describe) + +## 2. [Another Question Category/Topic] + +[What specific aspect of the feature needs clarification?] + +- [ ] (A) [Option description explaining what this choice means] +- [ ] (B) [Option description explaining what this choice means] +- [ ] (C) [Option description explaining what this choice means] +- [ ] (D) [Option description explaining what this choice means] +- [ ] (E) Other (describe) +``` + +### Questions File Process + +1. **Create Questions File**: Save questions to a file named `[NN]-questions-[N]-[feature-name].md` where `[N]` is the round number (starting at 1, incrementing for each new round). +2. **Point User to File**: Direct the user to the questions file and instruct them to answer the questions directly in the file. +3. **STOP AND WAIT**: Do not proceed to Step 5. Wait for the user to indicate they have saved their answers. +4. **Read Answers**: After the user indicates they have saved their answers, read the file and continue the conversation. +5. **Follow-Up Rounds**: If answers reveal new questions, create a new questions file with incremented round number (`[NN]-questions-[N+1]-[feature-name].md`) and repeat the process (return to step 3). + +**Iterative Process:** + +- If a user's answer reveals new questions or areas needing clarification, ask follow-up questions in a new questions file. +- Build on previous answers - use context from earlier responses to inform subsequent questions. +- **CRITICAL**: After creating any questions file, you MUST STOP and wait for the user to provide answers before proceeding. +- Only proceed to Step 5 after: + - You have received and reviewed all user answers to clarifying questions + - You have enough detail to populate all spec sections (User Stories, Demoable Units with functional requirements, etc.). 
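+
+For example (hypothetical file names, assuming the same "user authentication" feature and two question rounds), the spec directory at the end of this step might contain:
+
+```text
+docs/specs/01-spec-user-authentication/
+  01-questions-1-user-authentication.md   # round 1 questions and answers
+  01-questions-2-user-authentication.md   # round 2 follow-up questions and answers
+```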
+
+## Step 5: Spec Generation

 Generate a comprehensive specification using this exact structure:

@@ -174,22 +237,31 @@ Generate a comprehensive specification using this exact structure:
 [Focus on tangible progress and WHAT will be demonstrated. Define 2-4 small, end-to-end vertical slices using the format below.]

 ### [Unit 1]: [Title]
+
 **Purpose:** [What this slice accomplishes and who it serves]
-**Demo Criteria:** [What will be shown to verify working value]
-**Proof Artifacts:** [Tangible evidence - URLs, CLI commands, test names, screenshots]
+
+**Functional Requirements:**
+- The system shall [requirement 1: clear, testable, unambiguous]
+- The system shall [requirement 2: clear, testable, unambiguous]
+- The user shall [requirement 3: clear, testable, unambiguous]
+
+**Proof Artifacts:**
+- [Artifact type]: [description] demonstrates [what it proves]
+- Example: Screenshot: `--help` output demonstrates new command exists
+- Example: CLI: `command --flag` returns expected output demonstrates feature works

 ### [Unit 2]: [Title]
-**Purpose:** [What this slice accomplishes and who it serves]
-**Demo Criteria:** [What will be shown to verify working value]
-**Proof Artifacts:** [Tangible evidence - URLs, CLI commands, test names, screenshots]

-## Functional Requirements
+**Purpose:** [What this slice accomplishes and who it serves]

-[Focus on system behavior and WHAT the system must do. Each should start with "The system shall..." or "The user shall..."]
+**Functional Requirements:**
+- The system shall [requirement 1: clear, testable, unambiguous]
+- The system shall [requirement 2: clear, testable, unambiguous]

-1. [**Requirement 1**: clear, testable, unambiguous]
-2. [**Requirement 2**: clear, testable, unambiguous]
-3. [**Requirement 3**: clear, testable, unambiguous]
+**Proof Artifacts:**
+- [Artifact type]: [description] demonstrates [what it proves]
+- Example: Test: `MyFeature.test.ts` passes demonstrates requirement implementation
+- Example: Order PDF: PDF downloaded from https://example.com/order-submitted showing the completed flow demonstrates end-to-end functionality

 ## Non-Goals (Out of Scope)

@@ -206,12 +278,13 @@ Generate a comprehensive specification using this exact structure:
 ## Repository Standards

 [Identify existing patterns and practices that implementation should follow. Examples include:
+
 - Coding standards and style guides from the repository
 - Architectural patterns and file organization
 - Testing conventions and quality assurance practices
 - Documentation patterns and commit conventions
 - Build and deployment workflows

-If no specific standards are identified, state "Follow established repository patterns and conventions."]
+ If no specific standards are identified, state "Follow established repository patterns and conventions."]

 ## Technical Considerations

@@ -233,7 +306,7 @@ If no specific standards are identified, state "Follow established repository pa
 2. [Question 2]
 ```

-## Step 5: Review and Refinement
+## Step 6: Review and Refinement

 After generating the spec, present it to the user and ask:

@@ -247,11 +320,10 @@ Iterate based on feedback until the user is satisfied.
 ## Output Requirements

 **Format:** Markdown (`.md`)
-**Directory:** Create `./docs/specs/[NN]-spec-[feature-name]/` where `[NN]` is a zero-padded 2-digit sequence number starting from 01 (e.g., `01`, `02`, `03`). For example, `01-spec-user-authentication/`, `02-spec-payment-integration/`, etc.
 **Full Path:** `./docs/specs/[NN]-spec-[feature-name]/[NN]-spec-[feature-name].md`
-**Example:** For feature "user authentication", create directory `01-spec-user-authentication/` and save file as `01-spec-user-authentication.md` inside it
+**Example:** For feature "user authentication", the spec directory would be `01-spec-user-authentication/` with the spec file `01-spec-user-authentication.md` saved inside it

-## Critical Constraints (Negative Instructions)
+## Critical Constraints

 **NEVER:**

@@ -265,31 +337,12 @@ Iterate based on feedback until the user is satisfied.
 **ALWAYS:**

 - Ask clarifying questions before generating the spec
-- Provide numbered/lettered options for easy selection
 - Validate scope appropriateness before proceeding
 - Use the exact spec structure provided above
 - Ensure the spec is understandable by a junior developer
-- Include proof artifacts and demo criteria for each work unit
+- Include proof artifacts for each work unit that demonstrate what will be shown
 - Follow identified repository standards and patterns in all requirements

 ## What Comes Next

 Once this spec is complete and approved, instruct the user to run `/generate-task-list-from-spec`. This will start the next step in the workflow, which is to break down the specification into actionable tasks.
-
-## Final Instructions
-
-Follow this exact sequence:
-
-1. **Initial Scope Assessment**: Use the provided examples to evaluate if the feature is appropriately sized
-2. **Clarifying Questions**: Ask structured questions with numbered/lettered options for easy selection
-3. **Context Assessment**: Review existing codebase for relevant patterns and constraints (optional)
-4. **Spec Generation**: Create the spec using the exact structure provided
-   - **Ensure each section has a distinct purpose** - avoid restating content from previous sections
-   - **User Stories** focus on motivation and WHY
-   - **Demoable Units** focus on tangible progress and WHAT will be shown
-   - **Functional Requirements** focus on system behavior and WHAT the system must do
-   - **Technical Considerations** focus on implementation constraints and HOW it will be built
-5. **Save**: Create directory `./docs/specs/[NN]-spec-[feature-name]/` and save file as `[NN]-spec-[feature-name].md` inside it
-6. **Review and Refine**: Validate completeness and clarity with the user
-7. **Guide User**: Direct user to the next workflow step (`/generate-task-list-from-spec`)
-8. 
**Stop**: Stop working once user confirms spec is complete diff --git a/prompts/generate-task-list-from-spec.md b/prompts/generate-task-list-from-spec.md index 0d3f89d..022cb39 100644 --- a/prompts/generate-task-list-from-spec.md +++ b/prompts/generate-task-list-from-spec.md @@ -29,13 +29,12 @@ This task list serves as the **execution blueprint** for the entire SDD workflow **Critical Dependencies:** - **Parent tasks** become implementation checkpoints in `/manage-tasks` -- **Demo Criteria** guide implementation verification and user acceptance -- **Proof Artifacts** become the evidence source for `/validate-spec-implementation` +- **Proof Artifacts** guide implementation verification and become the evidence source for `/validate-spec-implementation` - **Task boundaries** determine git commit points and progress markers **What Breaks the Chain:** -- Poorly defined demo criteria → implementation verification fails +- Poorly defined proof artifacts → implementation verification fails - Missing proof artifacts → validation cannot be completed - Overly large tasks → loss of incremental progress and demo capability - Unclear task dependencies → implementation sequence becomes confusing @@ -72,7 +71,7 @@ Ensure complete spec coverage by: 2. **Verify functional requirements** are addressed in specific tasks 3. **Map technical considerations** to implementation details 4. **Identify gaps** where spec requirements aren't covered -5. **Validate acceptance criteria** are testable through demo criteria +5. **Validate acceptance criteria** are testable through proof artifacts ## Proof Artifacts @@ -136,20 +135,46 @@ Wait for explicit user confirmation before generating sub-tasks. Then: ## Phase 2 Output Format (Parent Tasks Only) -When generating parent tasks in Phase 2, use this structure WITHOUT sub-tasks: +When generating parent tasks in Phase 2, use this hierarchical structure with Tasks section marked "TBD": ```markdown ## Tasks -- [ ] 1.0 Parent Task Title - - Demo Criteria: "Open /path and complete X end-to-end; acceptance: Y visible/returned" - - Proof Artifact(s): "URL: https://..., CLI: command & expected output, Test: MyFeature.test.ts" -- [ ] 2.0 Parent Task Title - - Demo Criteria: "User can perform Z with persisted state" - - Proof Artifact(s): "Screenshot of flow; link to test suite section" -- [ ] 3.0 Parent Task Title - - Demo Criteria: "Configuration is verifiable via command/output" - - Proof Artifact(s): "CLI: config get … -> expected value; log line; diff link" +### [ ] 1.0 Parent Task Title + +#### 1.0 Proof Artifact(s) + +- Screenshot: `/path` page showing completed X flow demonstrates end-to-end functionality +- URL: https://... 
demonstrates feature is accessible +- CLI: `command --flag` returns expected output demonstrates feature works +- Test: `MyFeature.test.ts` passes demonstrates requirement implementation + +#### 1.0 Tasks + +TBD + +### [ ] 2.0 Parent Task Title + +#### 2.0 Proof Artifact(s) + +- Screenshot: User flow showing Z with persisted state demonstrates feature persistence +- Test: `UserFlow.test.ts` passes demonstrates state management works + +#### 2.0 Tasks + +TBD + +### [ ] 3.0 Parent Task Title + +#### 3.0 Proof Artifact(s) + +- CLI: `config get ...` returns expected value demonstrates configuration is verifiable +- Log: Configuration loaded message demonstrates system initialization +- Diff: Configuration file changes demonstrates setup completion + +#### 3.0 Tasks + +TBD ``` ## Phase 3 Output Format (Complete with Sub-Tasks) @@ -175,21 +200,44 @@ After user confirmation in Phase 3, update the file with this complete structure ## Tasks -- [ ] 1.0 Parent Task Title - - Demo Criteria: "Open /path and complete X end-to-end; acceptance: Y visible/returned" - - Proof Artifact(s): "URL: https://..., CLI: command & expected output, Test: MyFeature.test.ts" - - [ ] 1.1 [Sub-task description 1.1] - - [ ] 1.2 [Sub-task description 1.2] -- [ ] 2.0 Parent Task Title - - Demo Criteria: "User can perform Z with persisted state" - - Proof Artifact(s): "Screenshot of flow; link to test suite section" - - [ ] 2.1 [Sub-task description 2.1] - - [ ] 2.2 [Sub-task description 2.2] -- [ ] 3.0 Parent Task Title (may not require sub-tasks if purely structural or configuration) - - Demo Criteria: "Configuration is verifiable via command/output" - - Proof Artifact(s): "CLI: config get … -> expected value; log line; diff link" - - [ ] 3.1 [Sub-task description 3.1] - - [ ] 3.2 [Sub-task description 3.2] +### [ ] 1.0 Parent Task Title + +#### 1.0 Proof Artifact(s) + +- Screenshot: `/path` page showing completed X flow demonstrates end-to-end functionality +- URL: https://... demonstrates feature is accessible +- CLI: `command --flag` returns expected output demonstrates feature works +- Test: `MyFeature.test.ts` passes demonstrates requirement implementation + +#### 1.0 Tasks + +- [ ] 1.1 [Sub-task description 1.1] +- [ ] 1.2 [Sub-task description 1.2] + +### [ ] 2.0 Parent Task Title + +#### 2.0 Proof Artifact(s) + +- Screenshot: User flow showing Z with persisted state demonstrates feature persistence +- Test: `UserFlow.test.ts` passes demonstrates state management works + +#### 2.0 Tasks + +- [ ] 2.1 [Sub-task description 2.1] +- [ ] 2.2 [Sub-task description 2.2] + +### [ ] 3.0 Parent Task Title + +#### 3.0 Proof Artifact(s) + +- CLI: `config get ...` returns expected value demonstrates configuration is verifiable +- Log: Configuration loaded message demonstrates system initialization +- Diff: Configuration file changes demonstrates setup completion + +#### 3.0 Tasks + +- [ ] 3.1 [Sub-task description 3.1] +- [ ] 3.2 [Sub-task description 3.2] ``` ## Interaction Model @@ -201,7 +249,7 @@ After user confirmation in Phase 3, update the file with this complete structure 3. **No Auto-progression:** Never automatically proceed to sub-tasks or implementation **Example interaction:** -> "I have analyzed the spec and generated [X] parent tasks that represent demoable units of work. Each task includes demo criteria and proof artifacts. Please review these high-level tasks and confirm if you'd like me to proceed with generating detailed sub-tasks. Respond with 'Generate sub tasks' to continue." 
+> "I have analyzed the spec and generated [X] parent tasks that represent demoable units of work. Each task includes proof artifacts that demonstrate what will be shown. Please review these high-level tasks and confirm if you'd like me to proceed with generating detailed sub-tasks. Respond with 'Generate sub tasks' to continue." ## Target Audience @@ -211,7 +259,7 @@ Write tasks and sub-tasks for a **junior developer** who: - Is familiar with the existing codebase structure - Needs clear, actionable steps without ambiguity - Will be implementing tasks independently -- Relies on demo criteria to verify completion +- Relies on proof artifacts to verify completion - Must follow established repository patterns and conventions ## Quality Checklist @@ -219,7 +267,7 @@ Write tasks and sub-tasks for a **junior developer** who: Before finalizing your task list, verify: - [ ] Each parent task is demoable and has clear completion criteria -- [ ] Demo Criteria are specific and measurable +- [ ] Proof Artifacts are specific and demonstrate clear functionality - [ ] Proof Artifacts are appropriate for each task - [ ] Tasks are appropriately scoped (not too large/small) - [ ] Dependencies are logical and sequential @@ -239,7 +287,7 @@ Once this task list is complete and approved, instruct the user to run `/manage- 2. Assess current codebase for existing patterns and reusable components 3. Generate high-level tasks that represent demoable units of work (adjust count based on spec complexity) and save them to `./docs/specs/[NN]-spec-[feature-name]/[NN]-tasks-[feature-name].md` 4. **CRITICAL**: Stop after generating parent tasks and wait for "Generate sub tasks" confirmation before proceeding. -5. Ensure every parent task has specific Demo Criteria and Proof Artifacts +5. Ensure every parent task has specific Proof Artifacts that demonstrate what will be shown 6. Identify all relevant files for creation/modification 7. Review with user and refine until satisfied 8. 
Guide user to the next workflow step (`/manage-tasks`) diff --git a/prompts/manage-tasks.md b/prompts/manage-tasks.md index c357c4a..044cd89 100644 --- a/prompts/manage-tasks.md +++ b/prompts/manage-tasks.md @@ -29,13 +29,12 @@ This implementation phase serves as the **execution engine** for the entire SDD **Critical Dependencies:** - **Parent tasks** become implementation checkpoints and commit boundaries -- **Demo criteria** guide implementation verification and user acceptance -- **Proof artifacts** become the evidence source for `/validate-spec-implementation` +- **Proof artifacts** guide implementation verification and become the evidence source for `/validate-spec-implementation` - **Task boundaries** determine git commit points and progress markers **What Breaks the Chain:** -- Skipping demo criteria → implementation cannot be verified +- Missing or unclear proof artifacts → implementation cannot be verified - Missing proof artifacts → validation cannot be completed - Inconsistent commits → loss of progress tracking and rollback capability - Ignoring task boundaries → loss of incremental progress and demo capability @@ -83,7 +82,7 @@ For each parent task, follow this structured workflow with built-in verification [ ] Locate task file: `./docs/specs/[NN]-spec-[feature-name]/[NN]-tasks-[feature-name].md` [ ] Read current task status and identify next sub-task [ ] Verify checkpoint mode preference with user -[ ] Confirm demo criteria for current parent task +[ ] Review proof artifacts required for current parent task [ ] Review repository standards and patterns identified in spec [ ] Verify required tools and dependencies are available ``` @@ -120,7 +119,7 @@ When all sub-tasks are `[x]`, complete these steps IN ORDER: - **Format**: Use markdown code blocks with clear section headers - **Execute commands immediately**: Capture command output directly in the markdown file - **Verify creation**: Confirm the markdown file exists and contains all required evidence -[ ] **Verify Demo Criteria**: Confirm all demo requirements are met +[ ] **Verify Proof Artifacts**: Confirm all proof artifacts demonstrate required functionality [ ] **Stage Changes**: `git add .` [ ] **Create Commit**: Use repository's commit format and conventions @@ -157,7 +156,7 @@ After each parent task completion, verify: [ ] Proof artifacts exist in correct directory with proper naming [ ] Git commit created with proper format (verify with `git log --oneline -1`) [ ] All tests are passing using repository's test approach -[ ] Demo criteria are satisfied +[ ] Proof artifacts demonstrate all required functionality [ ] Commit message includes task reference and spec number [ ] Repository quality gates pass (linting, formatting, etc.) 
[ ] Implementation follows identified repository patterns and conventions @@ -207,20 +206,20 @@ Each parent task must include artifacts that: For each parent task completion: [ ] **Directory Ready**: `./docs/specs/[NN]-spec-[feature-name]/[NN]-proofs/` exists -[ ] **Review Task Requirements**: Check what demo evidence the task specifically requires +[ ] **Review Task Requirements**: Check what proof artifacts the task specifically requires [ ] **Create Single Proof File**: Create `[spec-number]-task-[task-number]-proofs.md` [ ] **Include All Evidence in One File**: - ## CLI Output section with command results - ## Test Results section with test output - ## Screenshots section with image references - ## Configuration section with config examples - - ## Demo Validation section showing criteria met + - ## Verification section showing proof artifacts demonstrate required functionality [ ] **Format with Markdown**: Use code blocks, headers, and clear organization [ ] **Verify File Content**: Ensure the markdown file contains all required evidence **SIMPLE VERIFICATION**: One file per task, all evidence included **CONTENT VERIFICATION**: Check the markdown file contains required sections -**DEMO VERIFICATION**: Ensure file demonstrates all demo criteria are met +**VERIFICATION**: Ensure proof artifact file demonstrates all required functionality **The single markdown proof file must be created BEFORE the parent task commit** ``` @@ -270,7 +269,7 @@ Before marking parent task as complete: After completing all tasks in the task list: 1. **Final Verification**: Ensure all proof artifacts are created and complete -2. **Demo Validation**: Verify all demo criteria from original spec are met +2. **Proof Artifact Validation**: Verify all proof artifacts demonstrate functionality from original spec 3. **Test Suite**: Run final comprehensive test suite 4. **Documentation**: Update any relevant documentation 5. 
**Handoff**: Instruct user to proceed to `/validate-spec-implementation` @@ -319,7 +318,7 @@ Implementation is successful when: - Proof artifacts exist for each parent task - Git commits follow repository format with proper frequency - All tests pass using repository's testing approach -- Demo criteria are met +- Proof artifacts demonstrate all required functionality - Repository quality gates pass consistently - Task file accurately reflects final status - Implementation follows established repository patterns and conventions diff --git a/prompts/validate-spec-implementation.md b/prompts/validate-spec-implementation.md index ad7f5fb..5b22a33 100644 --- a/prompts/validate-spec-implementation.md +++ b/prompts/validate-spec-implementation.md @@ -30,15 +30,14 @@ This validation phase serves as the **quality gate** for the entire SDD workflow **Critical Dependencies:** - **Functional Requirements** become the validation criteria for code coverage -- **Demo Criteria** guide the verification of user-facing functionality -- **Proof Artifacts** provide the evidence source for validation checks +- **Proof Artifacts** guide the verification of user-facing functionality and provide the evidence source for validation checks - **Relevant Files** define the scope of changes to be validated **What Breaks the Chain:** - Missing proof artifacts → validation cannot be completed - Incomplete task coverage → gaps in spec implementation -- Unclear demo criteria → cannot verify user acceptance +- Unclear or missing proof artifacts → cannot verify user acceptance - Inconsistent file references → validation scope becomes ambiguous ## Your Role @@ -52,7 +51,7 @@ Validate that the **code changes** conform to the Spec and Task List by verifyin ## Context - **Specification file** (source of truth for requirements). -- **Task List file** (contains Demo Criteria, Proof Artifacts, and Relevant Files). +- **Task List file** (contains Proof Artifacts and Relevant Files). - Assume the **Repository root** is the current working directory. - Assume the **Implementation work** is on the current git branch. @@ -80,11 +79,11 @@ If no spec is provided, follow this exact sequence: Map score to severity: 0→CRITICAL, 1→HIGH, 2→MEDIUM, 3→OK. -- **R1 Spec Coverage:** Every Functional Requirement is traceable to code changes. +- **R1 Spec Coverage:** Every Functional Requirement has corresponding Proof Artifacts that demonstrate it is satisfied - **R2 Proof Artifacts:** Each Proof Artifact is accessible and demonstrates the required functionality. - **R3 File Integrity:** All changed files are listed in "Relevant Files" and vice versa. - **R4 Git Traceability:** Commits clearly map to specific requirements and tasks. -- **R5 Evidence Quality:** Evidence includes specific file paths, line numbers, and artifact outputs. +- **R5 Evidence Quality:** Evidence includes proof artifact test results and file existence checks. - **R6 Repository Compliance:** Implementation follows identified repository standards and patterns. ## Validation Process (step-by-step chain-of-thought) @@ -117,13 +116,13 @@ Map score to severity: 0→CRITICAL, 1→HIGH, 2→MEDIUM, 3→OK. For each Functional Requirement, Demoable Unit, and Repository Standard: -1) Pose a verification question (e.g., "Is FR-3 implemented in the changed files?"). +1) Pose a verification question (e.g., "Do Proof Artifacts demonstrate FR-3?"). 
2) Verify with independent checks: - - Search changed files for requirement implementation (glob/grep) - - Test each Proof Artifact (URLs, CLI commands, test references) - - Verify file content matches requirement specifications - - Check repository pattern compliance -3) Record **evidence** (file paths + line ranges, artifact outputs, commit references). + - Verify proof artifact files exist (from task list) + - Test that each Proof Artifact (URLs, CLI commands, test references) demonstrates what it claims + - Verify file existence for "Relevant Files" listed in task list + - Check repository pattern compliance (via proof artifacts, file checks, and commit log analysis) +3) Record **evidence** (proof artifact test results, file existence checks, commit references). 4) Mark each item **Verified**, **Failed**, or **Unknown**. ## Detailed Checks @@ -139,10 +138,10 @@ For each Functional Requirement, Demoable Unit, and Repository Standard: - Test references exist and can be executed - Screenshots/demos show required functionality -3) **Requirement Implementation** - - Functional requirements are present in changed code - - Demo Criteria are satisfied by the implementation - - Code structure follows spec specifications +3) **Requirement Coverage** + - Proof Artifacts exist for each Functional Requirement + - Proof Artifacts demonstrate functionality as specified in the spec + - All required proof artifact files exist and are accessible 4) **Repository Compliance**: Implementation follows identified repository patterns and conventions - Verify coding standards compliance @@ -159,7 +158,7 @@ For each Functional Requirement, Demoable Unit, and Repository Standard: - Missing or non-functional Proof Artifacts - Changed files not listed in "Relevant Files" without justification in commit messages -- Functional Requirements with no implementation evidence +- Functional Requirements with no proof artifacts - Git commits unrelated to spec implementation - Any `Unknown` entries in the Coverage Matrix - Repository pattern violations (coding standards, quality gates, workflows) @@ -181,8 +180,8 @@ Provide three tables (edit as needed): | Requirement ID/Name | Status (Verified/Failed/Unknown) | Evidence (file:lines, commit, or artifact) | | --- | --- | --- | -| FR-1 | Verified | `src/feature/x.ts#L10-L58`; commit `abc123` | -| FR-2 | Failed | No implementation found in changed files | +| FR-1 | Verified | Proof artifact: `test-x.ts` passes; commit `abc123` | +| FR-2 | Failed | No proof artifact found for this requirement | #### Repository Standards @@ -195,27 +194,33 @@ Provide three tables (edit as needed): #### Proof Artifacts -| Demo Unit | Proof Artifact | Status | Evidence & Output | +| Unit/Task | Proof Artifact | Status | Verification Result | | --- | --- | --- | --- | -| Demo-1 | URL: https://... | Verified | Returns "200 OK" with expected content | -| Demo-2 | CLI: command | Failed | Exit code 1: "Error: missing parameter" | +| Unit-1 | Screenshot: `/path` page demonstrates end-to-end functionality | Verified | HTTP 200 OK, expected content present | +| Unit-2 | CLI: `command --flag` demonstrates feature works | Failed | Exit code 1: "Error: missing parameter" | -### 3) Issues (use rubric → severity) +### 3) Validation Issues -For each issue: +Report any issues found during validation that prevent verification or indicate problems. Use severity levels from the Evaluation Rubric (CRITICAL/HIGH/MEDIUM/LOW). 
Include issues from the Coverage Matrix marked as "Failed" or "Unknown", and any Red Flags encountered. -- **Severity:** CRITICAL/HIGH/MEDIUM/LOW -- **What & Where:** concise description + concrete paths/lines -- **Evidence:** minimal diff or command output -- **Root Cause:** spec | task | implementation -- **Impact:** functionality | demo | traceability -- **Recommendation:** precise, actionable steps +**Issue Format:** -> **Few‑shot exemplars** -> -> - *HIGH* — Proof Artifact URL returns 404. Evidence: `curl -I https://example.com/demo` → "HTTP/1.1 404 Not Found". **Impact:** Demo criteria cannot be verified. **Fix:** Update URL or deploy missing endpoint. -> - *CRITICAL* — Changed file `src/auth.ts` not in "Relevant Files". Evidence: `git diff` shows new file but task list only references `src/user.ts`. **Impact:** Implementation scope creep. **Fix:** Update task list or revert changes. -> - *Reject (too vague)* — "Some files are missing." +For each issue, provide: + +- **Severity:** CRITICAL/HIGH/MEDIUM/LOW (based on rubric scoring) +- **Issue:** Concise description with location (file paths from task list or proof artifact references) and evidence (proof artifact test results, file existence checks, coverage gaps) +- **Impact:** What breaks or cannot be verified (functionality | verification | traceability) +- **Recommendation:** Precise, actionable steps to resolve + +**Examples:** + +| Severity | Issue | Impact | Recommendation | +| --- | --- | --- | --- | +| HIGH | Proof Artifact URL returns 404. `task-list.md#L45` references `https://example.com/demo`. Evidence: `curl -I https://example.com/demo` → "HTTP/1.1 404 Not Found" | Functionality cannot be verified | Update URL in task list or deploy missing endpoint | +| CRITICAL | Changed file not in "Relevant Files". `src/auth.ts` created but not listed in task list. Evidence: `git log --name-only` shows file created; task list only references `src/user.ts` | Implementation scope creep | Update task list to include `src/auth.ts` or revert unauthorized changes | +| MEDIUM | Missing proof artifact for FR-2. Task list specifies test file `src/feature/x.test.ts` but file does not exist. Evidence: File check shows `src/feature/x.test.ts` missing | Requirement verification incomplete | Add test file `src/feature/x.test.ts` as specified in task list | + +**Note:** Do not report issues that are already clearly marked in the Coverage Matrix unless additional context is needed. Focus on actionable problems that need resolution. ### 4) Evidence Appendix @@ -224,6 +229,20 @@ For each issue: - File comparison results (expected vs actual) - Commands executed with results +## Saving The Output + +After generation is complete: + +- Save the report using the specification below +- Verify the file was created successfully + +### Validation Report File Details + +**Format:** Markdown (`.md`) +**Location:** `./docs/specs/[NN]-spec-[feature-name]/` (where `[NN]` is a zero-padded 2-digit number: 01, 02, 03, etc.) +**Filename:** `[NN]-validation-[feature-name].md` (e.g., if the Spec is `01-spec-user-authentication.md`, save as `01-validation-user-authentication.md`) +**Full Path:** `./docs/specs/[NN]-spec-[feature-name]/[NN]-validation-[feature-name].md` + ## What Comes Next Once validation is complete and all issues are resolved, the implementation is ready for merge. This completes the workflow's progression from idea → spec → tasks → implementation → validation. Instruct the user to do a final code review before merging the changes.