
Prepare for an accessibility staging review

Last updated: April 14, 2025

Conducting your own accessibility testing allows VFS teams to resolve potential launch-blocking issues before they get flagged during a staging review.

During your staging review, an accessibility specialist will review your testing artifact, complete advanced testing, and conduct additional tests as appropriate for your product. You may receive individual issue tickets documenting accessibility barriers or violations of the VA.gov experience standards, and you may receive general advice related to your accessibility testing as documented in your testing artifact. See What to expect from accessibility specialists during staging review below.

Foundational accessibility tests (required)

Foundational accessibility testing detects common issues and can be completed without a deep background in accessibility or the use of assistive technology. Foundational testing consists of four checks: use of color and color contrast; automated testing with axe; content zoom, reflow, and text resizing; and keyboard navigation.

We recommend that teams be aware of these tests throughout their product lifecycle and consider them as part of their design and development process. Passing these tests will be easier if they’re just validating the accessible choices you’ve already made.

These tests should be completed for every page of your product and for every variation of your product (e.g., unauthenticated vs. authenticated, error states). The accessibility specialist conducting your staging review will attempt to review as much as possible but will prioritize your product’s happy path. You know your product best, so be sure to check every edge case.

Use of color and color contrast

Who: Design, any team member with time to review

When: We recommend that designers consider color accessibility best practices throughout design, that engineers check color contrast while coding, and that color combinations be validated prior to staging review.

Color contrast

Use Who Can Use to check color combinations for contrast. (The ratio math behind these requirements is sketched after the list.) Ensure that:

  • All text of 20px or smaller has a contrast ratio of at least 4.5:1 against its background

  • All text larger than 20px has a contrast ratio of at least 3:1 against its background

  • Non-text elements have a contrast ratio of at least 3:1 against their background and neighboring elements
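
For reference, here is a minimal sketch of the WCAG math behind those ratios, in JavaScript. It assumes six-digit sRGB hex values with no alpha, and the sample colors are hypothetical:

```js
// Relative luminance per WCAG: L = 0.2126*R + 0.7152*G + 0.0722*B,
// where each channel is linearized from its 0-1 sRGB value
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio = (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1
function contrast(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrast('#ffffff', '#205493').toFixed(2)); // hypothetical white-on-blue pair
```

Tools like Who Can Use run this same calculation for you; the sketch only shows where the 4.5:1 and 3:1 thresholds plug in.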

Color combinations that follow Design System guidelines will typically meet contrast requirements. However, pay particular attention to:

  • White text on yellow, orange, or red

  • Black text on darker blues or purples

  • Gray text on white or textured backgrounds

Colorblindness

We recommend that you use the Colorblindly extension for Chrome to check for colorblindness issues. Confirm that:

  • Color is not the only way to distinguish links from other text (e.g., links are underlined)

  • Any charts, maps, infographics, and tables convey all information without relying on color alone

  • Content does not refer to color, especially when providing user instructions (e.g., "Click the blue button")

Dark, contrast, and inverted color modes (mobile app only)

If your team is building for the VA mobile app, check color contrast combinations and colorblindness in all supported dark, contrast, and inverted color modes.

Definition of done

  • Text and non-text elements meet color contrast ratio requirements relative to the background and neighboring elements.

  • All links, buttons, icons, and other elements retain their meaning when color blindness filters are enabled.

  • For VA mobile app features:

    • Text and non-text elements meet color contrast ratio requirements for all supported dark, contrast, and inverted color modes.

    • All links, buttons, icons, and other elements retain their meaning for all supported dark, contrast, and inverted color modes.

If you have unresolved color test issues, make sure these are documented in your accessibility testing artifact.

Automated testing with axe by Deque

Who: Frontend engineer, any team member with time to review

When: As part of daily development

axe by Deque Systems is our required automated testing tool, and is available both as a free browser extension and as a mobile app testing suite. Although automated testing can't detect all accessibility issues, axe is very good at detecting certain common problems in your code.

Steps to test (web)

  1. Install the axe browser extension for Chrome, Firefox, or Edge.

  2. Right click on the page you're testing.

  3. Select Inspect to open your browser's developer tools.

  4. Select the axe DevTools tab.

  5. Select Scan all of my page to run axe.
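
You can also run the axe-core engine directly from the developer console as a quick supplement to the extension. A minimal sketch, assuming a CDN copy of axe-core (the URL is an assumption, and a strict Content Security Policy may block it):

```js
// Load axe-core (assumed CDN path), then run it against the current page
const s = document.createElement('script');
s.src = 'https://cdn.jsdelivr.net/npm/axe-core/axe.min.js';
s.onload = () =>
  axe.run().then(({ violations }) =>
    console.table(
      violations.map((v) => ({ id: v.id, impact: v.impact, nodes: v.nodes.length }))
    )
  );
document.head.appendChild(s);
```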

Steps to test (mobile app)

  1. Install and configure axe DevTools for Mobile (preferably on a physical mobile device, though it should also work with the Xcode simulator).

  2. Follow Deque’s documentation to run a test.

Expected result

Stable code with no violations reported in axe checks.

Definition of done

No violations indicated in axe scan results within the scope of your product. (You can ignore any flagged issues that aren’t your responsibility.)

If axe scans detect issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

axe scans in end-to-end tests

Having automated accessibility checks in place is a launch requirement for Section 508 compliance. Learn more about automated and integrated 508 compliance (GitHub authentication required).

All pages and applications must have axe checks that run on every continuous integration/continuous deployment (CI/CD) build. React applications require these tests to be added as part of the end-to-end test suite. Tests should open modal windows, accordions, and other hidden content, then re-run the axe check for high test coverage.
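
To illustrate, here is a minimal sketch of such a test using the cypress-axe plugin (cy.injectAxe() and cy.checkA11y()). The route and selector are hypothetical, and your repo may expose its own axe wrapper command instead:

```js
// cypress-axe is usually registered in the Cypress support file: import 'cypress-axe';
describe('My form', () => {
  it('has no axe violations, including in hidden content', () => {
    cy.visit('/my-form'); // hypothetical route
    cy.injectAxe();       // inject the axe-core runtime into the page under test
    cy.checkA11y();       // initial scan

    // Reveal hidden content, then re-run the check for higher coverage
    cy.get('.accordion-toggle').first().click(); // hypothetical selector
    cy.checkA11y();
  });
});
```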

Steps to test

  1. Read the basic end-to-end test setup document.

  2. Add axe checks to all of your end-to-end tests.

  3. Run tests on localhost to confirm proper functionality.

Expected result

Stable code with no violations reported in end-to-end tests.

Note: axe checks that result in one or more violations will break branch builds and prevent merging into the main branch until they are resolved.

Definition of done

No violations indicated in end-to-end tests.

If e2e tests result in accessibility violations that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Content zoom, reflow, and text resizing

Who: Design or any team member with time to review

When: We recommend that designers think about how content will resize and reflow in high-zoom situations before handing off to engineers. High-fidelity prototypes at high zoom aren't required, but designers may want to create low-fidelity wireframes for complex page layouts. Once coded, content zoom and reflow should be tested prior to staging review.

All page elements must be readable and usable at 200%, 300%, and 400% zoom. That means:

  • No functionality has been hidden or removed from the page. (Example: a button that is visible on desktop but hidden on a mobile device will likely be hidden at high zoom.)

  • Nothing overlaps. (Example: text getting covered up by other elements.)

  • No horizontal scrolling. (Exceptions: data tables, charts, and images.)

If your team is building for the VA mobile app, all text must remain legible when your device is set to scale text to the maximum possible size.

Steps to test (web)

  • Set browser width to 1280px

    • In Chrome, right-click on any webpage and select Inspect from the menu to open the Developer Tools panel. Drag your browser window narrower or wider until the number in the top right corner of the browser window reads "1280".

    • In Firefox, you must turn on rulers before you can accurately resize your browser window: right-click on a page, select Inspect Element from the menu, and then click the ruler icon (the third icon from the top right of the Developer Tools panel).

  • Zoom your browser to 200%, 300%, and then 400%. (A console check for horizontal overflow is sketched after these steps.)

    • Windows: hold down Ctrl and press + until you reach the desired zoom level

    • Mac: hold down Cmd and press + until you reach the desired zoom level
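
Once zoomed, a quick console sketch like the one below can help locate elements that force horizontal scrolling. Expect intentional matches for content like data tables, charts, and images:

```js
// List elements that extend past the viewport at the current zoom level
const max = document.documentElement.clientWidth;
document.querySelectorAll('body *').forEach((el) => {
  const { left, right } = el.getBoundingClientRect();
  if (right > max + 1 || left < -1) {
    console.warn(`overflows by ${Math.round(Math.max(right - max, -left))}px`, el);
  }
});
```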

Steps to test (mobile app)

  1. In your device's accessibility settings, set text scaling to the maximum possible size.

  2. Review every screen of your feature to confirm that all text remains legible and is not clipped or overlapped.

Definition of done

All page elements are readable and usable at each zoom level. Layouts do not scroll sideways or cut off content at the edges. (Horizontal scrolling is permitted for content like images, maps, diagrams, presentations, and data tables.)

If your team is building for the VA mobile app, text is not clipped, obscured, overlapped, or covered by other text or other elements at the largest possible text size.

If you have content zoom, reflow, or text resizing issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Keyboard navigation

Who: Any team member with time to review

When: At a minimum, prior to your staging review. However, there are many factors to consider when optimizing designs and code for keyboard navigation, so we recommend learning keyboard navigation best practices early on to prevent issues.

All pages and user flows must be navigable using a keyboard, including form controls and inputs, links, buttons, etc.

If your team is building for the VA mobile app, testing for keyboard navigation is still required. Users may connect a Bluetooth or USB keyboard to their device, or they may use alternative input devices that map user actions to certain keystrokes. (You must enable full keyboard access for iOS devices.)

Steps to test

To test keyboard navigation, press [Tab] to move focus to the next element that can receive keyboard focus, and press [Shift] + [Tab] to move focus to the previous element.

  • Confirm that each link, button, form input, checkbox, radio button, select menu, and custom element can receive keyboard focus.

  • Confirm that each link, button, form input, checkbox, radio button, select menu, and custom element responds to expected keys for that type of element.

  • Confirm that all elements under focus have a visible focus indicator.

  • Confirm that the order of [Tab] stops makes sense and is appropriate for completing tasks. In particular:

    • The order of [Tab] stops matches the expected order of tasks for completing the workflow.

    • Users don’t need to press [Tab] a large number of times to move forward or backward to reach the element they want.

Keyboard-only end-to-end testing

In addition to manual testing, teams are highly encouraged to write keyboard end-to-end tests using Cypress. With robust e2e testing in place, manual keyboard navigation testing can be a quick validation rather than an intensive testing process.
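
For example, here is a minimal sketch of a keyboard-only test, assuming the cypress-plugin-tab package (which adds cy.tab()); the route, element IDs, and button text are hypothetical:

```js
// cypress-plugin-tab is usually registered in the Cypress support file:
// import 'cypress-plugin-tab';
it('moves through the form in a sensible tab order', () => {
  cy.visit('/my-form');                                // hypothetical route
  cy.get('#first-name').focus().type('Alex');
  cy.focused().tab();                                  // [Tab] to the next stop
  cy.focused().should('have.attr', 'id', 'last-name'); // verify the tab order
  cy.focused().tab();
  cy.focused().should('contain.text', 'Continue');     // submit button reached
});
```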

Definition of done

There are no issues navigating pages and user flows using only the keyboard.

If you have keyboard navigation issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Advanced accessibility tests (recommended)

Advanced accessibility testing detects accessibility barriers that may be missed by foundational testing and may require more background knowledge or the use of assistive technology. Advanced accessibility testing consists of: WAVE spot checks, Android Accessibility Scanner spot checks, code quality review, mouse-only and touchscreen testing, screen readers, and voice commands.

Teams are encouraged to do as much advanced testing as they can and to document their results, but are not required to submit anything as part of the Collaboration Cycle. Any issues you’re able to detect and resolve prior to your staging review will make it less likely that launch-blocking issues are logged in that review. The accessibility specialist conducting your staging review will also complete all of the advanced accessibility testing as part of their review.

WAVE spot checks (web)

Who: Frontend engineer, QA, anyone reviewing coded product

When: As part of daily development, and prior to staging review

WAVE is an automated testing tool built by WebAIM, available as a free browser extension. It is similar to axe but uses different testing scripts and may detect a different set of issues. Teams should use WAVE to spot-check selected pages of their product to verify the results of their automated testing from axe. In general, a product that is passing all of its axe tests should also pass all of the WAVE tests, and any issues flagged by WAVE should get a closer look.

Steps to test

  1. Install the WAVE browser extension for Chrome, Firefox, or Edge.

  2. When viewing the page you’re testing, click the WAVE icon in your browser toolbar.

  3. Review any items identified as Errors and Alerts.

WAVE will also highlight structural elements, reveal hidden or alternative text, and otherwise visually display useful accessibility annotations alongside your page content. You may find it useful to review those annotations.

Expected result

No errors or alerts detected by WAVE.

Definition of done

No errors or alerts indicated in WAVE scan results within the scope of your product. (You can ignore any flagged issues that aren’t your responsibility.)

If WAVE scans detect issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Android Accessibility Scanner spot checks (mobile app)

Who: Frontend engineer, QA, anyone reviewing coded product

When: As part of daily development, and prior to staging review

Android Accessibility Scanner is an automated testing tool built by Google, available as a free app for Android devices. Like axe and WAVE, it is able to detect a subset of accessibility issues automatically for mobile apps on Android devices. Teams should use Android Accessibility Scanner to spot-check their work in the VA mobile app.

Steps to test

  1. Follow the instructions in Get started with Accessibility Scanner.

  2. When reviewing a single screen in the mobile app, use “scan a snapshot.” When reviewing interactions that involve multiple screens, use “scan a recording.”

  3. Review any items identified by the scanner.

Like WAVE for the web, Android Accessibility Scanner will also highlight structural elements and visually display useful accessibility annotations alongside your screen content. You may find it useful to review those annotations.

Expected result

No issues detected or suggestions provided by Android Accessibility Scanner.

Definition of done

No issues detected or suggestions provided in Android Accessibility Scanner results within the scope of your product. (You can ignore any flagged issues that aren’t your responsibility.)

If Android Accessibility Scanner detects issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Code quality review

Who: Frontend engineer

When: As part of daily development

Semantic HTML is the cornerstone of accessibility, and ARIA attributes must be used with care. Periodic code quality reviews by frontend engineers can prevent incompatibilities with assistive technologies.

Code quality checks

  • Confirm that any Design System components are used as intended, according to Design System documentation.

  • Confirm that all HTML elements are implemented consistent with specifications and best practices.

  • Confirm that all ARIA attributes are implemented according to the W3C’s ARIA Authoring Practices Guide. (A console check for one common mismatch is sketched below.)
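
As one quick console spot check, the sketch below flags elements given role="button" that aren't keyboard focusable, a common symptom of ARIA applied without the matching semantics (the tag list is a simplification):

```js
// Flag role="button" elements that are neither natively focusable nor given a tabindex
document.querySelectorAll('[role="button"]').forEach((el) => {
  const native = ['BUTTON', 'A', 'INPUT', 'SUMMARY'].includes(el.tagName);
  if (!native && el.tabIndex < 0) {
    console.warn('role="button" without keyboard focusability:', el);
  }
});
```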

Definition of done

Design System components, HTML elements, and ARIA attributes are all implemented consistent with documentation and best practices.

If there are code quality issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Mouse-only and touchscreen

Who: Frontend engineer

When: As part of daily development

All tasks must be completable using only a mouse or one-finger gestures. Any fields requiring text input must work with the device’s native on-screen keyboard.

Mouse-only and touchscreen checks

  • Confirm that tasks do not depend on specific keyboard actions (e.g., keyboard shortcuts).

  • Confirm that all touch/click targets are 44 by 44 pixels or larger, excluding links within a block of text. (A console check for undersized targets is sketched after this list.)

  • Confirm that actions do not depend on specific touch/click timings (e.g., click-and-hold).

  • Confirm touch/click activation behavior:

    • Pressing down (touch down or mouse button down) on a button, link, or feature should not trigger it.

    • Releasing (touch up or mouse button up) should trigger it.

  • Using a mobile device, make sure you can use the full functionality of a feature with only one finger to gesture (tapping, swiping, etc.). Gestures may be completed using multiple fingers but should not require multiple fingers. Likewise, gestures must not require a specific path.

    • If you don’t have a mobile device to test with, or you’re testing something you can’t access with your mobile device, use your browser’s developer tools to change your view to mobile (instructions for changing device mode in Chrome). Make sure you can use the full functionality of the feature using only the mouse pointer in this mode.
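
To spot undersized targets quickly, you can run a console sketch like the one below (the selector list is a simplification; links inside blocks of text will appear and can be ignored per the exclusion above):

```js
// List interactive elements rendering smaller than 44x44 CSS pixels
document.querySelectorAll('button, input, select, textarea, a, [role="button"]').forEach((el) => {
  const { width, height } = el.getBoundingClientRect();
  if (width > 0 && height > 0 && (width < 44 || height < 44)) {
    console.warn(`${Math.round(width)}x${Math.round(height)}`, el);
  }
});
```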

Definition of done

All functionality is available to mouse-only users or touchscreen users with one-finger gestures.

If there are tasks that depend on keyboard functionality or multiple-finger gestures that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Screen readers

Who: Any team member with time to review

When: Spot checks during daily development, and a full product run-through before your staging review. Screen reader testing has a bit of a learning curve, and proficiency comes with experience.

Screen readers read the text of a page out loud to users and announce certain other information (e.g., the state of certain form inputs). For VFS teams, we recommend testing with at least one desktop screen reader and one mobile screen reader. See Screen reader testing for more information.

Common screen reader barriers

Pay close attention to:

  • Mismatches between text displayed on screen and text announced by the screen reader. (Extra information is okay when it’s useful. Conflicting, confusing, or needlessly redundant information is bad.)

  • Buttons or links that do not make their function clear (e.g., an “Edit” button that relies on visual context to convey what’s being edited).

  • Text alternatives announced for images, icons, and other visual components, and whether they are appropriate alternatives in this context.

  • Whether the order in which information is announced makes sense for completing tasks.

  • How alerts, status updates, or other information refreshes are announced.

  • Whether (and how) focus changes are announced after a user action.

  • Improper use of headings (a console sketch for auditing the heading outline follows this list), including:

    • Headings that increment by more than one level (e.g., an h2 heading followed by an h4 heading).

    • “Fake” headings that are visually styled as headings but are coded using p, span, or div elements.

    • Headings that don’t reflect the actual content organization.

  • Missing HTML landmarks (e.g., whether a navigation menu is announced as “navigation”).
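
To audit heading structure without a screen reader, a console sketch like this prints the heading outline and flags skipped levels:

```js
// Print the heading outline; flag headings that skip a level (e.g. h2 -> h4)
let prev = 0;
document.querySelectorAll('h1, h2, h3, h4, h5, h6').forEach((h) => {
  const level = Number(h.tagName[1]);
  const flag = prev && level > prev + 1 ? ' <-- skipped level' : '';
  console.log(`${'  '.repeat(level - 1)}${h.tagName}: ${h.textContent.trim()}${flag}`);
  prev = level;
});
```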

Definition of done

All page content is announced by the screen reader, with the appropriate context and ordering to complete tasks.

If there are known screen reader issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Voice commands

Who: Designers, content specialists, information architects, frontend engineers

When: We recommend that designers, content specialists, and information architects consider voice command accessibility before handing off to engineers. As frontend engineers write code, we recommend they check voice command as part of daily development.

All tasks must be completable using voice command software, such as Voice Control for iOS or Voice Access for Android.

Voice command checks

When possible, voice command checks should be completed with spoken commands using voice command software. If your team is building for the VA mobile app, testing with Voice Control for iOS and Voice Access for Android is especially important.

Even if it is not possible to test using voice software, you can check for:

  • Custom interactive components that may not match a standard voice command (anything that’s not a link, button, or common HTML form element).

  • Components whose visual presentation doesn’t match their semantic meaning (e.g., a link styled to look like a button).

  • Interactive elements with no visible label.

  • Interactive elements whose visible labels do not match their accessible name. (A console check for this is sketched after this list.)

  • Interactive elements whose label is hard to say (difficult to pronounce, a tongue-twister, excessively long, or reliant on an acronym or abbreviation).
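
For the label-matching check, a console sketch like the one below flags controls whose aria-label doesn't contain their visible text. It's a heuristic, not a full accessible-name computation:

```js
// Flag controls whose accessible name may not include their visible label,
// which can break "click <label>" voice commands
document.querySelectorAll('a[aria-label], button[aria-label]').forEach((el) => {
  const visible = el.textContent.trim().toLowerCase();
  const name = (el.getAttribute('aria-label') || '').toLowerCase();
  if (visible && !name.includes(visible)) {
    console.warn('Visible label missing from accessible name:', el, { visible, name });
  }
});
```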

Definition of done

All functionality is available to voice command users, verified either through actual testing or the voice command checks above.

If there are known voice command issues that you are unable to resolve prior to your staging review, make sure these are documented in your accessibility testing artifact.

Using the accessibility testing artifact

The accessibility testing artifact is provided to VFS teams to:

  • Help walk you through foundational accessibility testing with easy-to-follow checklists.

  • Create a standard format for reporting your testing results, which helps guide your conversation with the accessibility specialist conducting your staging review.

You are required to submit your accessibility testing artifact at least four days prior to your scheduled staging review, along with the other artifacts required for that touchpoint. Failure to submit your accessibility testing artifact may result in your review being postponed.

If your accessibility testing detects potential barriers, document those detected issues using the accessibility issue template or your team’s preferred issue tracking template. When submitting your testing artifact, please link to any open issues for known barriers.

What to expect from accessibility specialists during staging review

After you request a staging review, the accessibility specialist will review your testing artifact. When the foundational tests have been verified, the specialist will complete any required advanced tests and any other accessibility testing appropriate to your product.

Any accessibility barriers detected as part of the staging review will be reported via GitHub issues and will be discussed at the staging review meeting. Issues are labeled according to the defect severity rubric, and any issues that must be addressed prior to launch will be labeled launch-blocking.

You may also receive comments on your accessibility testing artifact. These comments will be limited to:

  • Responses to any questions or requests you included in your artifact submission.

  • Recommendations for improving your testing process before your next trip through the Collaboration Cycle.

We recommend you work with an accessibility specialist to validate and resolve feedback received at your staging review. If you do not have an accessibility specialist available to work with your team, consider seeking out other sources of accessibility support.

