Code Labels

Managing a complex codebase can be overwhelming. Enter code labels, your secret weapon for streamlined navigation and efficient prioritization. Benefits at a glance:

  • Meaningful categorization: Assign labels such as teams, functionalities, or user stories to organize your codebase logically.

  • Enhanced collaboration: Foster productive discussions and teamwork by making it easier for everyone (Dev, QA, Product) to find relevant code sections.

  • Targeted testing: Prioritize gaps effectively by filtering based on specific labels, focusing efforts on high-impact areas.

Flexibility is key

  • Label anything: Teams, business transactions, features – the possibilities are endless; tailor labels to your needs.

  • Granularity control: Label entire applications, classes, folders, or even individual files (though file-level labeling is not recommended, since broader rules automatically include newly added files; a hypothetical example follows this list).
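
For illustration only, here is a hypothetical sketch of a label category with folder-level rules. None of these names or this structure come from SeaLights (labels are defined in the UI, not in code); the sketch simply shows why folder-level rules keep newly added files covered while a file-level rule would miss them:

```python
# Hypothetical labeling scheme (all names invented for illustration).
# Folder-level rules label every current and future file under a path,
# whereas a file-level rule covers only the one file it names.
label_categories = {
    "Team": {
        "Payments": {"rule": "starts with", "value": "src/payments/"},  # new files under src/payments/ are labeled automatically
        "Checkout": {"rule": "starts with", "value": "src/checkout/"},
        # "Legacy": {"rule": "exact match", "value": "src/old/Billing.java"},  # file-level: newly added files would be missed
    }
}
```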


Implementing Code Labels

Watch the video tutorial to learn how to label your code, or follow these steps:

  1. Go to Settings > Data Scope > Code Labels.

  2. Click on "Create New Categories/Labels/Rules" and add a new label category.

  3. Under the category, click the "+" button to create a label.

  4. Define the rules to tag the relevant code using options like exact match, starts with, ends with, contains, or label the entire app. Click the "+" button to save the rule (a sketch of how these matching options behave follows these steps).
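
The following minimal sketch shows how the matching options from step 4 might behave, assuming rules are evaluated against file paths. The function name and the rule-type strings are illustrative assumptions, not SeaLights' implementation or API:

```python
# Illustrative sketch (not SeaLights code): how a single rule might decide
# whether a given file path receives a label.
def rule_matches(rule_type: str, pattern: str, path: str) -> bool:
    if rule_type == "entire app":
        return True                      # every file in the application gets the label
    if rule_type == "exact match":
        return path == pattern
    if rule_type == "starts with":
        return path.startswith(pattern)  # typical choice for folder-level labels
    if rule_type == "ends with":
        return path.endswith(pattern)    # e.g. a file-name suffix such as "Controller.java"
    if rule_type == "contains":
        return pattern in path
    raise ValueError(f"unknown rule type: {rule_type}")

# A folder-level rule labels any current or future file under src/payments/.
assert rule_matches("starts with", "src/payments/", "src/payments/refunds/RefundService.java")
```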

Why is it necessary to rerun tests after defining code labels?

After you define code labels, they are applied to the codebase starting from the next run of the app. Coverage and test gaps data captured by SeaLights is saved with the relevant code labels only from that point onward; SeaLights does not label previous builds retroactively. Therefore, to start getting label-based metrics, rerun your tests and generate an updated Test Gaps report that incorporates the new labels. This ensures the data accurately reflects the current codebase and its code label assignments.

Making it a routine to add Code Labels whenever a build is ready for testing keeps your development and testing workflows consistent and efficient.

Generating a Test Gaps report with labels allows you to focus on specific areas of the codebase, reducing the number of test gaps that require attention, improving your planning process, and streamlining your testing efforts.

Remember:

  • Future-proof your labeling: Keep it scalable by avoiding overly specific labels or file-level labeling.

  • Consistency is crucial: Establish clear labeling conventions for everyone to follow.