QA standards

Last updated: October 21, 2024

Overview

This page outlines the QA Standards VFS teams need to meet to launch a product through the Collaboration Cycle. These standards exist to make sure that VFS product teams have met their business goals and that the product's functionality behaves as intended.

VFS teams must provide QA artifacts demonstrating that they've met the QA Standards. These artifacts are used at the Staging Review touchpoint of the Collaboration Cycle to evaluate whether the product meets the QA Standards.

Note: QA artifacts are required for Staging Reviews. If artifacts are not provided 4 business days before the scheduled Staging Review, the Governance Team will cancel the review and it will need to be rescheduled.

Exceptions

If a VFS team is launching a static page (e.g., a page whose content comes exclusively from Drupal), the team won't be held responsible for meeting and upholding the QA Standards. Because static pages have no dynamic functionality, they don't require extensive QA testing.

VA.gov QA Standards

Please note: These standards will likely change over time as tooling evolves and processes mature.

QA1: Regression Test Plan (Launch-blocking)
The product must have a regression test plan that proves the new changes don't break previously integrated functionality.

QA2: Test Plan (Launch-blocking)
The product must have a test plan that describes the method(s) that will be used to verify product changes.

QA3: Traceability Reports (Warning, not launch-blocking)
The product must have a Coverage for References report that demonstrates user stories have been verified by test cases in the test plan, and a Summary (Defects) report that demonstrates the defects discovered during QA testing were found by executing test cases in the test plan.

QA4: E2E Test Participation (Launch-blocking)
The product must have one or more end-to-end (E2E) tests.

QA5: Unit Test Coverage (Launch-blocking)
The overall product must have 80% or higher unit test coverage in each category: Lines, Functions, Statements, and Branches.

QA6: Endpoint Monitoring (Launch-blocking)
All endpoints that the product accesses must be monitored in Datadog. The VFS team must complete a playbook that specifies how the team will handle any errors that fire.

QA7: Logging Silent Failures (Warning, not launch-blocking)
The product code must log all silent failures to StatsD.

How standards are validated

Please note that you will likely need a TestRail account to view many of these artifacts. See "TestRail access" under Request access to tools: additional access for developers for information on getting access.

Regression Test Plan

A Regression Test Plan is a document that maps user stories to tests and records the results of executing those tests, providing a strategy for verifying your product's functionality before the work moves through the Collaboration Cycle. You may provide this information to the Governance Team in whatever format is most convenient, though Platform recommends using TestRail (account required).

Example: You can create a Regression Test Plan under the Test Runs & Results tab of TestRail. For an example regression plan, see the VSP-Facility Locator regression plan (TestRail account required).

Note: VFS Teams not using TestRail are still expected to share a Regression Test Plan at the Collaboration Cycle Staging Review.

Test Plan

A Test Plan is a formal declaration of the approach, objectives, scope, and overall strategy for verifying the new or modified behaviors of a given software product. When creating a test plan, Platform recommends that product stakeholders consider both the potential impact on a product's existing features and the likelihood that those features will be affected. Product teams should execute their test plan and track the results.

Example: You can create a Test Plan under the Test Runs & Results tab of TestRail. For an example test plan, see the Cross-Browser Search test plan (TestRail account required).

Note: VFS Teams not using TestRail are still expected to share a Test Plan at the Collaboration Cycle Staging Review.

Traceability Reports

Traceability Reports have two components:

  • Coverage for References: the percent of the initiative's user stories that are covered by tests in the Test Plan. This can be reported as a chart, a table, or simply a number.

  • Summary (Defects): the percent of defects uncovered during testing that were uncovered by executing tests in the Test Plan, along with which defects are resolved or outstanding. This can be reported as a chart, a table, or simply a number.

Traceability Reports can be automatically generated in TestRail after executing tests in TestRail.

Example: You can create a Traceability Report by clicking on the appropriate link in the Create Report panel on the Reports tab of TestRail. For an example of a Coverage for References report, see this Coverage Report for Search (TestRail account required). For an example of a Summary (Defects) report, see this Summary (Defects) report for Search (TestRail account required).

Note: VFS Teams not using TestRail are still expected to share Traceability Reports at the Collaboration Cycle Staging Review.

E2E Test Participation

VFS teams should create at least one Cypress test spec in a tests/e2e folder in their product's directory, following the conventions described in the Writing an end-to-end test document. See End-to-end testing with Cypress for more information about writing Cypress tests on the platform.
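
Example (illustrative sketch): the file path, URL, and API endpoint below are hypothetical placeholders rather than a real product. A minimal Cypress spec might look like this:

    // Hypothetical file: src/applications/my-app/tests/e2e/my-app.cypress.spec.js
    describe('My App', () => {
      beforeEach(() => {
        // Stub the API call so the test doesn't depend on a live backend
        // (the endpoint and response shape are placeholders).
        cy.intercept('GET', '/v0/my_endpoint', { data: [] });
        cy.visit('/my-app');
      });

      it('loads the introduction page', () => {
        cy.url().should('include', '/my-app');
        cy.get('h1').should('exist');
      });
    });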

Unit Test Coverage

The overall product must have 80% or higher unit test coverage in each category: Lines, Functions, Statements, and Branches. Either provide a link to your product's entry in the Unit Test Coverage Report of the Frontend Dashboard, or generate the coverage report locally by running the following command within vets-website: yarn test:coverage-app {app-name}
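
Example (illustrative sketch): vets-website unit tests are generally written with Mocha and Chai. The formatCurrency helper and its module path below are hypothetical; the point is that exercising every branch of a function is what keeps the Branches category above 80%.

    // Hypothetical test file for a simple formatting helper.
    import { expect } from 'chai';
    import { formatCurrency } from '../utils/helpers'; // hypothetical helper

    describe('formatCurrency', () => {
      it('formats whole dollar amounts', () => {
        expect(formatCurrency(1000)).to.equal('$1,000.00');
      });

      it('formats zero', () => {
        // Covering the zero branch as well counts toward Branch coverage.
        expect(formatCurrency(0)).to.equal('$0.00');
      });
    });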

Endpoint Monitoring

All endpoints that the product accesses need to be monitored in Datadog. Monitors should be configured according to the guidance in this document. To satisfy this standard, VFS teams must complete a playbook that details how they will respond to any errors that fire.
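
Example (illustrative sketch): the metric name, tags, threshold, and Slack handle below are hypothetical placeholders; use the metrics your service actually emits. Monitors are usually defined in the Datadog UI, but creating one through Datadog's v1 monitor API makes the required pieces (query, threshold, and notification routing) explicit:

    // Sketch: create a high-error-rate monitor via Datadog's v1 API (Node).
    // The metric, tags, threshold, and Slack handle are hypothetical.
    const monitor = {
      name: 'My Product: unexpectedly high endpoint errors',
      type: 'metric alert',
      query:
        'sum(last_15m):sum:vets_api.my_endpoint.errors{env:production}.as_count() > 10',
      message: 'Endpoint errors exceeded threshold. @slack-my-team-notifications',
      options: { thresholds: { critical: 10 } },
    };

    fetch('https://api.datadoghq.com/api/v1/monitor', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'DD-API-KEY': process.env.DD_API_KEY,
        'DD-APPLICATION-KEY': process.env.DD_APP_KEY,
      },
      body: JSON.stringify(monitor),
    })
      .then(res => res.json())
      .then(console.log);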

Minimum requirements to meet this standard

  1. Endpoint monitoring must cover the three fundamental types of problems:
    A. Unexpectedly high errors
    B. Unexpectedly low traffic
    C. Silent failures (silent to the end user) of any kind

  2. PII and PHI must never be saved in logs.

  3. Errors that appear in Datadog should be routed to a Slack channel. (You should name this channel something like #(team name)-notifications.)

  4. The Slack channel should be monitored daily by the VFS team.

  5. The team should establish a process for acknowledging and assigning errors to a team member to investigate and resolve.

Behaviors to avoid

  • Do not silence errors if there are too many. Instead, tune your monitoring thresholds so that alerts stay useful.

  • Do not assign monitoring to one person.


Logging Silent Failures

A silent failure occurs when a Veteran or other user of VA.gov takes an action that fails, but is never notified of the failure; for example, a form submission fails downstream and the user is not informed. VFS teams must link directly to the line(s) of code in their products that log silent failures to StatsD. All operations that can produce silent failures should log those failures according to the guidelines in this silent failures guide.
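
Example (illustrative sketch): a minimal Node-style illustration using the hot-shots StatsD client. The endpoint, metric name, and prefix are hypothetical, and your product's actual StatsD client and naming conventions may differ:

    // Sketch: record an otherwise-silent downstream failure in StatsD.
    const StatsD = require('hot-shots');
    const statsd = new StatsD({ prefix: 'my_product.' }); // hypothetical prefix

    async function submitForm(payload) {
      try {
        const res = await fetch('/v0/my_form', {
          method: 'POST',
          body: JSON.stringify(payload),
        });
        if (!res.ok) throw new Error(`Upstream returned ${res.status}`);
        return await res.json();
      } catch (err) {
        // The user may never see this failure, so count it in StatsD;
        // this gives the endpoint monitors (QA6) something to alert on.
        statsd.increment('form_submission.silent_failure');
        throw err;
      }
    }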

FAQ

Do VFS teams need to create new unit tests for each trip through the Collaboration Cycle?

No, VFS teams don't need to create a specific unit test for every iteration of a product they bring through the Collaboration Cycle. Instead, teams should provide updated testing artifacts that reflect the change(s) they're making to their product; a full test plan isn't needed for a small incremental change.

When should VFS teams complete and submit their QA artifacts for review?

All QA materials and artifacts should be completed and linked in the Collaboration Cycle GitHub ticket 4 business days before the scheduled Staging Review touchpoint. There is a specific section of the GitHub ticket for teams to link these artifacts.

[Image: The QA artifacts section of the Collaboration Cycle Request Ticket]

Are VFS Teams required to use TestRail?

No. VFS Teams can use any testing tool (or tool in general) that demonstrates that QA Standards are being met.

Are Regression Test Plans required for a new product?

Yes. Even though the product may be new to VA.gov, it's not necessarily a totally new product. The goal of the regression test plan would be to ensure that the product is still meeting the requirements of the legacy product. For example, a digitized form on VA.gov should meet the requirements of the paper form.

