STEP 5: Analyze Test Gaps

Typically, this step is performed by the QA Manager and the relevant Engineering Manager / Dev Team Lead / Tech Architect.

Tackle Test Gaps, Achieve "0 Critical": A New Era in Quality Control

Traditionally, Test Analysis meant finding code problems for developers to fix and retesting with existing tests. But with "Test Gaps," the game changes! SeaLights introduces a paradigm shift in your quality workflow with the "0 Critical Test Gaps" KPI. This powerful metric empowers you to:

  1. Identify & Address Missing Tests: During Test Analysis, analyze Untested Code Changes within your targeted code subset. This proactive approach minimizes the risk of escaped defects.

  2. Expand Responsibilities: Beyond identifying fixable code issues, QA, Automation Engineers, and other team members join forces and actively create new tests or modify existing ones to cover critical areas impacted by code changes. This continuous ownership across teams fosters a culture of collective quality.

  3. Prioritize Effectively: Conduct Test Gaps Analysis after each major test cycle, using relevant code labels to prioritize critical gaps for immediate action and keep the decisions data-driven.

The ultimate goal is to deploy the product without any critical test gaps, achieving the "0 Critical Test Gaps" KPI. This ensures readiness for the go/no-go decision and aligns with the Code Changes Coverage KPI.

By striving for zero critical test gaps, you can achieve comprehensive test coverage and significantly reduce the risk of critical defects in your software.
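To make the KPI concrete, here is a minimal sketch of how critical test gaps could be counted from a CSV export of methods, in the spirit of the "Perform a Deep CSV Analysis" step. The column names (method, code_label, modified, covered) and the example critical labels are assumptions for illustration only; the actual SeaLights export format may differ.

```python
import csv

# Example code labels treated as critical; adjust to the labels you defined in STEP 3.
CRITICAL_LABELS = {"payments", "authentication"}

def count_critical_test_gaps(csv_path: str) -> int:
    """Count modified methods carrying a critical label that no test covered.

    Assumes a hypothetical CSV export with columns: method, code_label, modified, covered.
    """
    gaps = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            is_modified = row["modified"].strip().lower() == "true"
            is_covered = row["covered"].strip().lower() == "true"
            is_critical = row["code_label"] in CRITICAL_LABELS
            if is_modified and is_critical and not is_covered:
                gaps += 1
    return gaps

if __name__ == "__main__":
    gaps = count_critical_test_gaps("test_gaps_export.csv")
    print(f"Critical test gaps: {gaps}")  # the release target for this step is 0
```

In this sketch a "gap" is simply a modified method with a critical label that no test reached during the cycle; driving that count to zero is what the "0 Critical Test Gaps" KPI measures.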

Can code labels be added to an existing Test Gaps report?

No. Once a Test Gaps report is created, it cannot be edited to include additional code labels; the only editable aspect of an existing report is the date/build range.

If you have added a new code label and wish to capture data for the code changes associated with it, you will need to create a new report. The new report will then reflect the coverage of the code changes associated with the newly added code label.

Please note that generating the Test Gaps Analysis (TGA) report may take a few minutes to complete. If you plan to use the report during a meeting, we recommend preparing the report data in advance so that the meeting time can focus on discussing the test gaps identified in the report.
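If you prefer to automate that preparation, a small script can trigger report generation ahead of time and poll until it is available. The URL, token, endpoint path, and response fields below are placeholders, not the documented SeaLights API; treat this as a sketch and adapt it to the actual report API in your environment.

```python
import time
import requests

# Placeholder values: the host, token, and endpoint are illustrative only;
# consult the SeaLights documentation for the actual report API.
BASE_URL = "https://example.your-sealights-host.io"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

def wait_for_report(report_id: str, timeout_s: int = 600, poll_s: int = 30) -> bool:
    """Poll a (hypothetical) report-status endpoint until the report is ready."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        resp = requests.get(f"{BASE_URL}/reports/{report_id}/status", headers=HEADERS)
        resp.raise_for_status()
        if resp.json().get("status") == "ready":
            return True
        time.sleep(poll_s)  # report generation can take a few minutes
    return False

if __name__ == "__main__":
    if wait_for_report("tga-report-123"):
        print("TGA report is ready for the meeting.")
    else:
        print("Report not ready yet; check again later.")
```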