Generate Scope, Impact, Checklist & Test Cases
Route: Open Quality Assurance
Assignee: QA Engineer 🧪
The Quality Assurance module automatically generates four kinds of QA artifacts from an approved BRD and Implementation Plan: Scope → Impact → Checklist → Test Cases. Each step depends on the previous one — the flow runs in a controlled sequential order.
Interface overview
- Left column: List of QA tickets with status badges and a search box.
- Right area: A 4-step workspace.
Ticket status: CREATED → DRAFTING → WAITING_REVIEW → APPROVED / REJECTED.
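The lifecycle above can be modeled as a small state machine. This is an illustrative sketch only, not the product's actual API; in particular, the assumption that a REJECTED ticket returns to DRAFTING for rework is not stated in this guide.

```python
from enum import Enum

class TicketStatus(Enum):
    CREATED = "CREATED"
    DRAFTING = "DRAFTING"
    WAITING_REVIEW = "WAITING_REVIEW"
    APPROVED = "APPROVED"
    REJECTED = "REJECTED"

# Allowed transitions for the lifecycle described above.
TRANSITIONS = {
    TicketStatus.CREATED: {TicketStatus.DRAFTING},
    TicketStatus.DRAFTING: {TicketStatus.WAITING_REVIEW},
    TicketStatus.WAITING_REVIEW: {TicketStatus.APPROVED, TicketStatus.REJECTED},
    TicketStatus.REJECTED: {TicketStatus.DRAFTING},  # assumption: rejected tickets are reworked
    TicketStatus.APPROVED: set(),                    # terminal state
}

def can_move(current: TicketStatus, target: TicketStatus) -> bool:
    """Return True if the transition is allowed by the lifecycle above."""
    return target in TRANSITIONS[current]
```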
Step 0: Create a QA Ticket
Click "+" in the left column → fill in the fields (Summary, Description, Type, Due Date, Assignee, Labels, Components) → click "Create".
Step 1: Source — Configure analysis sources
The "Source" tab opens by default. The QA Engineer adds all relevant information sources:
| Source | Description |
|---|---|
| BRD Selections | Approved BRD for the feature under test |
| Implementation Plan Selections | Approved Implementation Plan — the source for generating technical test cases |
| Jira Tickets | Related Jira tickets |
| Confluence Pages | Reference documents |
| Uploaded Files | Additional uploaded documents |
| System Features | Impacted systems and features |
Click "Analyse". The system processes every source and shows "AI is processing data..." during analysis. When finished, it shows "✓ Analysis success".
Step 2: Scope & Impact — Coverage and risk
The "Scope & Impact" tab shows two tables side by side.
Scope — Test coverage
The Scope table lists elements that are in scope and out of scope for testing. Each row has the following columns:
| Column | Content |
|---|---|
| Scope ID | Auto-generated code (prefix SCO-) |
| System | Impacted system |
| Component | Specific component |
| Element | Detailed element/feature |
| Scope Description | Description of the element and whether it is In Scope or Out of Scope |
Impact — Impact assessment
The Impact table assesses risk and effects on other modules. Each row has the following columns:
| Column | Content |
|---|---|
| Impact ID | Auto-generated code (prefix IMP-) |
| System | Affected system |
| Component | Affected component |
| Element | Specific element |
| Impact Description | Description of the expected effect and its risk level |
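The two row types share the same shape apart from their ID prefixes. A hypothetical sketch of how such rows might be represented, with field names mirroring the column tables above; the zero-padded "SCO-001"-style numbering is an assumption for illustration, not the documented format.

```python
from dataclasses import dataclass, field
from itertools import count

_scope_seq = count(1)   # sequence backing SCO- codes (illustrative)
_impact_seq = count(1)  # sequence backing IMP- codes (illustrative)

@dataclass
class ScopeRow:
    system: str
    component: str
    element: str
    description: str       # states In Scope / Out of Scope
    scope_id: str = field(default="")

    def __post_init__(self):
        # Auto-generate the SCO- code when none is supplied.
        self.scope_id = self.scope_id or f"SCO-{next(_scope_seq):03d}"

@dataclass
class ImpactRow:
    system: str
    component: str
    element: str
    description: str       # effect and risk level
    impact_id: str = field(default="")

    def __post_init__(self):
        # Auto-generate the IMP- code when none is supplied.
        self.impact_id = self.impact_id or f"IMP-{next(_impact_seq):03d}"
```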
Working with Scope & Impact
Edit:
- Click "Edit" — the table switches to inline edit mode.
- Edit each cell directly.
- Click "Save" to save changes.
Move on to the Checklist:
Once Scope & Impact is complete and saved, click "Generate Checklist" above the table. The system starts generating the Checklist based on the scope just defined. The button shows "Generating..." during processing — it cannot be clicked again until it finishes.
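The disabled "Generating..." button amounts to a simple re-entrancy guard. A minimal sketch of that behavior, assuming a hypothetical `ChecklistGenerator` class; the real generation logic is replaced by a placeholder.

```python
class ChecklistGenerator:
    def __init__(self):
        self.generating = False  # mirrors the disabled "Generating..." button

    def generate(self, scope_rows):
        if self.generating:
            return None  # button disabled: repeated clicks are ignored
        self.generating = True
        try:
            # Placeholder for the AI generation call, which derives one
            # checklist entry per in-scope element.
            return [f"Verify {row}" for row in scope_rows]
        finally:
            self.generating = False  # re-enable the button when finished
```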
Step 3: Checklist — Verification list
The "Checklist" tab shows the list of verification actions to perform. Each row is a checklist item:
| Column | Content |
|---|---|
| Checklist ID | Auto-generated code (prefix CHE-) |
| Type | Verification type |
| Item/Ref ID | Reference to the related Scope or Requirement |
| Description | Description of the specific verification action |
Working with the Checklist
Edit:
- Click "Edit" — the table allows inline editing of every cell.
- Add a new row with the "+" button at the bottom of the table.
- Delete a row with the "X" button on that row.
- Click "Save" to save.
Move on to Test Cases:
Once the Checklist is complete and saved, click "Generate Test Cases". The system generates Test Cases from the Checklist. The button shows "Generating..." and is disabled during processing.
Step 4: Test Case — Test details
The "Test Case" tab is split into two areas:
- Left panel: A hierarchical tree of test cases (Group → Suite → Test Case). Click any test case to view its details.
- Right panel: Details of the selected test case.
Test Cases table
Each test case has the following attributes:
| Column | Content |
|---|---|
| Test Case ID | Auto-generated code (prefix CAS-) |
| Group | Functional group |
| Suite | Test suite |
| Title | Test case title |
| Pre-Condition | Preconditions |
| Test Description | Scenario description |
| Priority | Priority level |
| Complexity | Complexity level |
| Test Type | Test type (functional, integration, ...) |
Test Steps — Execution steps
For each test case, the AI generates a sequential list of Test Steps:
| Column | Content |
|---|---|
| Step # | Step number |
| Step Description | Description of the action to perform |
| Expected Result | Expected result after performing the step |
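The test-case attributes and their numbered steps can be sketched as plain data with add/delete operations matching the "+" and "X" buttons. An illustrative sketch only (the class and method names are assumptions, not the product's API); note the renumbering after a deletion so Step # stays sequential.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestStep:
    number: int
    description: str
    expected_result: str

@dataclass
class TestCase:
    case_id: str        # e.g. a CAS- prefixed code
    group: str
    suite: str
    title: str
    pre_condition: str
    priority: str
    complexity: str
    test_type: str
    steps: List[TestStep] = field(default_factory=list)

    def add_step(self, description: str, expected: str) -> TestStep:
        # "+" button: append a step with the next sequential number.
        step = TestStep(len(self.steps) + 1, description, expected)
        self.steps.append(step)
        return step

    def delete_step(self, number: int) -> None:
        # "X" button: remove the step, then renumber the remainder.
        self.steps = [s for s in self.steps if s.number != number]
        for i, s in enumerate(self.steps, start=1):
            s.number = i
```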
You can edit each step, add a new step with the "+" button at the end of the list, or delete a step with the "X" button on that row.
Create a Test Case manually
Click "Create Test Case" (the + icon) at the bottom of the left panel. A modal opens with the following fields:
- Test Case Title
- Group
- Suite
- Priority
- Complexity
- Test Steps (dynamic list — add/remove steps)
Send for Review
Once the Test Cases are complete:
- Click "Request Review" — the status changes to WAITING_REVIEW.
- The reviewer clicks "Approve" or "Reject".
Scope/Impact → Checklist → Test Cases is a one-way flow. If you change Scope after the Checklist has been generated, you must click "Generate Checklist" again to re-sync. AIPD does not automatically regenerate the Checklist or Test Cases when Scope changes — this avoids losing any hand-edited data.
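The re-sync rule above can be sketched as version tracking: a checklist remembers which Scope version it was generated from and is merely flagged stale (never auto-regenerated) when Scope changes afterwards. The `QAPipeline` class and its fields are hypothetical names for illustration.

```python
class QAPipeline:
    def __init__(self):
        self.scope_version = 1
        self.checklist_from = None  # Scope version the checklist was generated from

    def edit_scope(self):
        # Saving a Scope change bumps its version.
        self.scope_version += 1

    def generate_checklist(self):
        # "Generate Checklist" re-syncs the checklist to the current Scope.
        self.checklist_from = self.scope_version

    @property
    def checklist_stale(self) -> bool:
        # True means the user must click "Generate Checklist" again;
        # nothing is regenerated automatically, so hand edits survive.
        return (self.checklist_from is not None
                and self.checklist_from != self.scope_version)
```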
This module only generates the content of the test cases. Actually running the tests (recording pass/fail results) happens in the separate Test Execution module.