The most basic indicators you might want to track in the Tests Report relate to the tests included in your QA efforts. They show how your testing has evolved and adapted, tracking the growth of new test cases and the archival of those that have served their purpose. Knowing how these indicators look helps you stay on track, balance priorities, and confidently drive improvements.
| Indicator | What It Tracks | What Signals It Might Bring |
| --- | --- | --- |
| Active Tests | The total number of active tests during the period, which shows the ongoing scope of testing. Automatically compares with the number of tests that were active at the end of the prior period, showing progress or stability in test coverage. | |
| New Tests | How many tests were added during the period. Automatically compares to how many tests were added during the same time frame immediately before the period. | Tests are often added in response to new features, updates, or changes in requirements. |
| Archived Tests | How many tests were archived during the period. Automatically compares to how many tests were archived during the same time frame immediately before the period. | Archived tests provide insight into areas that may no longer need active attention, helping you streamline your testing efforts and focus resources on more critical or changing areas. |
| New vs. Archived Tests | A visual overview of how many new tests have been created and how many have been archived over time, which helps you understand your team's workload and the maintenance of your tests. The timeline is always weekly, so to ensure accurate week-to-week comparisons, make sure your date range starts at the beginning of a week. | This indicator helps demonstrate how the testing scope evolves over time. A healthy balance between new and archived tests shows that while new functionality is being tested, older, stable components are being archived as no longer needed. This shows that the system is under continuous refinement and that testing efforts are focused on both innovation and maintaining a stable foundation. |
| Average Steps per Test | The average complexity of tests, as shown by the average number of steps. | |
| Breakdown by Priority | The distribution of active tests at the end of the period across priority levels. | |
| Automated Tests | The percentage of all tests that have been automated. | |
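The period-over-period logic behind these indicators can be sketched as a simple aggregation. The following Python snippet is purely illustrative and is not the report's actual implementation; the `Test` fields (`created`, `archived`, `steps`) and the date handling are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Test:
    created: date              # hypothetical fields for illustration
    archived: Optional[date]   # None means the test is still active
    steps: int

def period_summary(tests, start, end):
    """Count new, archived, and active tests for the window [start, end)."""
    new = sum(start <= t.created < end for t in tests)
    archived = sum(t.archived is not None and start <= t.archived < end
                   for t in tests)
    # A test is active at period end if created before `end` and not yet archived.
    active = [t for t in tests
              if t.created < end and (t.archived is None or t.archived >= end)]
    avg_steps = sum(t.steps for t in active) / len(active) if active else 0.0
    return {"new": new, "archived": archived,
            "active": len(active), "avg_steps": avg_steps}

def with_prior_comparison(tests, start, end):
    """Compare a period against the equal-length window immediately before it."""
    length = end - start
    current = period_summary(tests, start, end)
    prior = period_summary(tests, start - length, start)
    return current, prior
```

For example, calling `with_prior_comparison(tests, date(2024, 1, 1), date(2024, 2, 1))` would return January's counts alongside the counts for the equal-length window ending December 31, mirroring the automatic comparisons described above.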
Follow Test Updates
In addition to tests themselves, you can keep track of updates made to tests. You can also filter these indicators for a specific date range and/or specific workspaces.
These indicators highlight how your tests are evolving, ensuring that nothing gets outdated. By tracking updates, you’ll see just how responsive and proactive your testing process is, making sure every test stays sharp and ready for action.
| Indicator | What It Tracks | What Signals It Might Bring |
| --- | --- | --- |
| Updated Tests | The number of tests modified within the period. Automatically compares to how many tests were modified during the same time frame immediately before the period. | Tracking updated tests helps you assess whether your tests are staying relevant and up to date with evolving features or requirements. |
| Avg. Weekly Updates | The average number of updates made to tests per week during the period. Automatically compares to weekly updates during the same time frame immediately before the period. | This metric shows the consistency of test case maintenance over time. |
| Total Updates Made | The cumulative number of updates made to all tests during the period. | Total updates reflect the overall effort the team is putting into maintaining and optimizing the test suite. |
| Updated Tests vs. Total Updates Made | A visual comparison of the number of unique tests that have been modified versus the total updates made within the period, letting you monitor both the breadth (number of tests updated) and the depth (total updates made) of changes. | This chart can help you identify patterns in test maintenance and ensure that your testing process adapts to evolving requirements. |
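The breadth-versus-depth distinction above (unique tests updated vs. total updates made) comes down to counting distinct test IDs versus counting every modification event. This is a hypothetical sketch, assuming an update log shaped as `(test_id, date)` pairs; it is not how the report itself is computed.

```python
from datetime import date

def update_stats(updates, start, end):
    """Summarize update events, each a (test_id, date) pair, in [start, end)."""
    in_period = [(tid, d) for tid, d in updates if start <= d < end]
    total_updates = len(in_period)                      # depth: every modification counts
    updated_tests = len({tid for tid, _ in in_period})  # breadth: unique tests touched
    weeks = max((end - start).days / 7, 1)              # avoid division by zero
    return {"updated_tests": updated_tests,
            "total_updates": total_updates,
            "avg_weekly_updates": total_updates / weeks}
```

A ratio of total updates to updated tests well above 1 would suggest repeated rework of the same tests (depth), while a ratio near 1 suggests changes spread broadly across the suite.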