Generate Scope, Impact, Checklist & Test Cases

Route: Open Quality Assurance
Assignee: QA Engineer 🧪

The Quality Assurance module automatically generates four kinds of QA artifacts from an approved BRD and Implementation Plan: Scope → Impact → Checklist → Test Cases. Each step depends on the previous one — the flow runs in a controlled sequential order.


Interface overview

  • Left column: List of QA tickets with status badges and a search box.
  • Right area: A 4-step workspace.

Ticket status flow: CREATED → DRAFTING → WAITING_REVIEW → APPROVED / REJECTED.
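The status lifecycle can be sketched as a simple transition table. This is a hypothetical illustration only; the mapping and function names are assumptions, not the product's API:

```python
# Hypothetical sketch of the ticket status lifecycle described above.
ALLOWED_TRANSITIONS = {
    "CREATED": {"DRAFTING"},
    "DRAFTING": {"WAITING_REVIEW"},
    "WAITING_REVIEW": {"APPROVED", "REJECTED"},
    "APPROVED": set(),   # terminal in this sketch
    "REJECTED": set(),   # terminal in this sketch
}

def can_transition(current: str, target: str) -> bool:
    """Return True if a ticket may move from `current` to `target`."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```

Note that whether a REJECTED ticket can re-enter DRAFTING is not stated in this guide; the sketch treats it as terminal.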


Step 0: Create a QA Ticket

Click "+" in the left column → fill in the fields (Summary, Description, Type, Due Date, Assignee, Labels, Components) → click "Create".


Step 1: Source — Configure analysis sources

The "Source" tab opens by default. The QA Engineer adds all relevant information sources:

  • BRD Selections: Approved BRD for the feature under test
  • Implementation Plan Selections: Approved Implementation Plan; the source for generating technical test cases
  • Jira Tickets: Related Jira tickets
  • Confluence Pages: Reference documents
  • Uploaded Files: Additional uploaded documents
  • System Features: Impacted systems and features

Click "Analyse". The system processes every source and shows "AI is processing data..." during analysis. When finished, it shows "✓ Analysis success".
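The source configuration above can be pictured as a single container of optional inputs. A minimal sketch, assuming the field names (the product's internal model is not documented here):

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisSources:
    """Hypothetical container mirroring the fields on the Source tab."""
    brd_selections: list[str] = field(default_factory=list)
    implementation_plans: list[str] = field(default_factory=list)
    jira_tickets: list[str] = field(default_factory=list)
    confluence_pages: list[str] = field(default_factory=list)
    uploaded_files: list[str] = field(default_factory=list)
    system_features: list[str] = field(default_factory=list)

    def is_ready(self) -> bool:
        # At least one source should be configured before clicking "Analyse".
        return any([self.brd_selections, self.implementation_plans,
                    self.jira_tickets, self.confluence_pages,
                    self.uploaded_files, self.system_features])
```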


Step 2: Scope & Impact — Coverage and risk

The "Scope & Impact" tab shows two tables side by side.

Scope — Test coverage

The Scope table lists elements that are in scope and out of scope for testing. Each row has the following columns:

  • Scope ID: Auto-generated code (prefix SCO-)
  • System: Impacted system
  • Component: Specific component
  • Element: Detailed element/feature
  • Scope Description: Description of the scope item, marked In Scope or Out of Scope

Impact — Impact assessment

The Impact table assesses risk and effects on other modules. Each row has the following columns:

  • Impact ID: Auto-generated code (prefix IMP-)
  • System: Affected system
  • Component: Affected component
  • Element: Specific element
  • Impact Description: Description of the impact and its risk level
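The two tables share the same shape and differ mainly in their ID prefix. A minimal sketch of both row types, assuming the field names and the zero-padded ID format (neither is confirmed by this guide):

```python
from dataclasses import dataclass

@dataclass
class ScopeRow:
    scope_id: str      # e.g. "SCO-001"
    system: str
    component: str
    element: str
    description: str   # marked In Scope or Out of Scope

@dataclass
class ImpactRow:
    impact_id: str     # e.g. "IMP-001"
    system: str
    component: str
    element: str
    description: str   # includes the risk level

def make_id(prefix: str, n: int) -> str:
    """Format an auto-generated row ID; the zero-padding is an assumption."""
    return f"{prefix}-{n:03d}"
```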

Working with Scope & Impact

Edit:

  1. Click "Edit" — the table switches to inline edit mode.
  2. Edit each cell directly.
  3. Click "Save" to save changes.

Move on to the Checklist:

Once Scope & Impact is complete and saved, click "Generate Checklist" above the table. The system starts generating the Checklist based on the scope just defined. The button shows "Generating..." during processing — it cannot be clicked again until it finishes.


Step 3: Checklist — Verification list

The "Checklist" tab shows the list of verification actions to perform. Each row is a checklist item:

  • Checklist ID: Auto-generated code (prefix CHE-)
  • Type: Verification type
  • Item/Ref ID: Reference to the related Scope or Requirement
  • Description: Description of the specific verification action
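Because each item carries a reference back to a Scope row, checklist items can be grouped by the scope they verify. A small sketch, with field names assumed:

```python
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    checklist_id: str  # e.g. "CHE-001"
    type: str          # verification type
    ref_id: str        # related Scope or Requirement ID, e.g. "SCO-003"
    description: str

def items_for_scope(items: list[ChecklistItem], scope_id: str) -> list[ChecklistItem]:
    """All checklist items that reference a given Scope row."""
    return [i for i in items if i.ref_id == scope_id]
```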

Working with the Checklist

Edit:

  1. Click "Edit" — the table allows inline editing of every cell.
  2. Add a new row with the "+" button at the bottom of the table.
  3. Delete a row with the "X" button on that row.
  4. Click "Save" to save.

Move on to Test Cases:

Once the Checklist is complete and saved, click "Generate Test Cases". The system generates Test Cases from the Checklist. The button shows "Generating..." and is disabled during processing.


Step 4: Test Case — Test details

The "Test Case" tab is split into two areas:

  • Left panel: A hierarchical tree of test cases (Group → Suite → Test Case). Click any test case to view its details.
  • Right panel: Details of the selected test case.

Test Cases table

Each test case has the following attributes:

  • Test Case ID: Auto-generated code (prefix CAS-)
  • Group: Functional group
  • Suite: Test suite
  • Title: Test case title
  • Pre-Condition: Preconditions
  • Test Description: Scenario description
  • Priority: Priority level
  • Complexity: Complexity level
  • Test Type: Test type (functional, integration, ...)

Test Steps — Execution steps

For each test case, the AI generates a sequential list of Test Steps:

  • Step #: Step number
  • Step Description: Description of the action to perform
  • Expected Result: Expected result after performing the step
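The left-panel hierarchy (Group → Suite → Test Case) falls out naturally from these attributes. A minimal sketch, assuming the field names and keeping only the fields needed to build the tree:

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    number: int
    description: str
    expected_result: str

@dataclass
class TestCase:
    test_case_id: str  # e.g. "CAS-001"
    group: str
    suite: str
    title: str
    steps: list[TestStep] = field(default_factory=list)

def tree(cases: list[TestCase]) -> dict:
    """Group cases into the Group -> Suite -> [titles] hierarchy of the left panel."""
    out: dict = {}
    for c in cases:
        out.setdefault(c.group, {}).setdefault(c.suite, []).append(c.title)
    return out
```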

You can edit each step, add a new step with the "+" button at the end of the list, or delete a step with the "X" button on that row.

Create a Test Case manually

Click "Create Test Case" (the + icon) at the bottom of the left panel. A modal opens with the following fields:

  • Test Case Title
  • Group
  • Suite
  • Priority
  • Complexity
  • Test Steps (dynamic list — add/remove steps)

Send for Review

Once the Test Cases are complete:

  1. Click "Request Review" — the status changes to WAITING_REVIEW.
  2. The reviewer clicks "Approve" or "Reject".

One-way flow — be careful when regenerating

Scope/Impact → Checklist → Test Cases is a one-way flow. If you change Scope after the Checklist has been generated, you must click "Generate Checklist" again to re-sync. AIPD does not automatically regenerate the Checklist or Test Cases when Scope changes — this avoids losing any hand-edited data.
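To illustrate why a manual re-sync matters: after editing Scope, some checklist items may reference Scope IDs that no longer exist. A purely hypothetical consistency check (AIPD does not expose such a helper; the dictionary keys are assumptions) could find them:

```python
def stale_refs(checklist: list[dict], scope_ids: set[str]) -> list[str]:
    """Checklist IDs whose Item/Ref ID no longer matches any known Scope ID.
    Hypothetical helper illustrating what "Generate Checklist" re-syncs."""
    return [item["checklist_id"] for item in checklist
            if item["ref_id"] not in scope_ids]
```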

Test Cases vs Test Execution

This module only generates the content of the test cases. Actually running the tests (recording pass/fail results) happens in the separate Test Execution module.