Accessibility Testing Standards

Standards

As required by the Illinois Information Technology Accessibility Act (PA 095-0307/30 ILCS 587), accessibility testing will be performed to confirm conformance with:

  • WCAG 2.1, Level A and Level AA success criteria

WCAG 2.2 and Level AAA criteria may also be tested but should be identified as such in reports.

Testing Tools

Accessibility testing will be performed with the following tools/techniques:

  • Keyboard-only testing using standard keyboard commands
  • Visual testing using a Windows High Contrast color theme and 200% browser zoom
  • Colour Contrast Analyser (TPGi)
  • Automated testing using the Accessibility Insights or axe DevTools browser extensions

Additional testing may be performed by trained testers using the following assistive technology tools:

  • Screen readers: JAWS and NVDA
  • Screen magnifier: ZoomText
  • Speech recognition: Dragon Professional or Windows Speech Recognition

Impact Ratings

If functional impact ratings are provided, the following scale will be used:

  • Critical – Will prevent some users from completing essential tasks
  • High – Will be very difficult or confusing to some users, but means of completing the task exist
  • Medium – May be difficult or confusing to some users, but means of completing the task exist
  • Low – Violates accessibility standards but is unlikely to significantly affect users

Testing Process

Accessibility testing will include the following processes:

  1. Keyboard Tests
  2. Visual Tests
  3. Automated Tests
  4. Assistive Technology Tests

1. Keyboard Tests

Keyboard testing is performed using standard keyboard commands only – the mouse must not be used. Keyboard testing may be completed by following existing test scripts, substituting keyboard commands for mouse actions. If test scripts are not available, keyboard testing may be performed one screen/page at a time, completing all the functions available on the screen/page.

  1. Browse to the screen to be tested.
  2. If necessary, explore the screen with the mouse to identify all interactive elements.
  3. Click in the browser address bar; then set the mouse aside.
  4. Press the Tab key several times to move focus past any browser toolbars and into the page.
  5. Use the following keyboard commands to move to and operate all interactive elements: 
    • Tab - Move to the next interactive element (link or control). 
    • Shift + Tab - Move backwards to the previous interactive element (if necessary). 
    • Alt + Down Arrow - Open a dropdown list. 
    • Up/Down Arrow - Move vertically through options in a dropdown list, list box, radio button group, menu, or rows in a grid. 
    • Right/Left Arrow - Move horizontally in a menu bar, tab list, or grid. 
    • Spacebar - Check a checkbox or click a button. 
    • Enter - Click a link or select an item in a list box or menu. 
    • Other - For other types of controls, look up and use the Keyboard Interaction guidance in the ARIA Authoring Practices Guide (APG) (https://www.w3.org/WAI/ARIA/apg/patterns).
  6. At each step, check for and document any: 
    • Interactive elements that do not receive keyboard focus. (WCAG 2.1.1)
    • Instances where focus doesn’t follow a logical order. (WCAG 2.4.3)
    • Elements that cause unexpected changes when they receive focus. (WCAG 3.2.1)
    • Elements that do not show a visual indicator (e.g., outline) when focused. (WCAG 2.4.7)
    • Elements that cannot be operated using standard keyboard commands (listed above). (WCAG 2.1.1)
    • Elements that cause unexpected changes when values are entered/changed. (WCAG 3.2.2)
    • Elements that trap focus (do not allow it to move to the next element). (WCAG 2.1.2)
  7. If any of the issues listed above are found, the keyboard test fails. In the ticket, clearly identify the element and which check it failed. Include steps to reproduce any unexpected behavior. 
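
For reference when documenting keyboard failures, the sketch below shows how a custom control can be made to pass the checks above (WCAG 2.1.1). It is a minimal TypeScript/DOM illustration assuming a hypothetical <div role="button" id="save-btn"> element, not a required remediation pattern:

  // Hypothetical custom control: <div role="button" id="save-btn">Save</div>
  const btn = document.querySelector<HTMLElement>('#save-btn');
  if (btn) {
    btn.setAttribute('tabindex', '0'); // allow Tab to move focus to the element (WCAG 2.1.1)
    btn.addEventListener('keydown', (e: KeyboardEvent) => {
      if (e.key === 'Enter' || e.key === ' ') {
        e.preventDefault(); // keep Spacebar from scrolling the page
        btn.click();        // activate the control like a native button
      }
    });
  }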

2. Visual Tests

Visual testing is performed using a combination of a Windows High Contrast color theme and browser Zoom set to 200%:

  1. Set your display resolution or browser window to 1366 px wide.
  2. Press left Alt + left Shift + PrtSc (Print Screen). If prompted to turn on High Contrast, click Yes. By default, Windows will activate a theme with a black background, white text, yellow links, green disabled text, and light blue selection highlights. (If High Contrast does not activate, check Windows Control Panel > Ease of Access Center > Make the computer easier to see, and confirm “Turn on or off High Contrast…” is checked.)
  3. In the browser, open the Settings menu (Alt + F) and set Zoom to 200%.
  4. Browse to the screen to be tested. If there are elements of the screen that are not initially displayed, such as collapsed sections, display as many of them as possible.
  5. Visually review all elements of the screen except logos, decorative (meaningless) images, or images of text that are duplicative of actual text provided elsewhere on the screen. Check for and document:
    • Text that does not take on high-contrast colors. (WCAG 1.4.5)
    • Information that was conveyed by color that is now not shown. (WCAG 1.4.1)
    • Text that did not increase to 200% of its original size. (WCAG 1.4.4)
    • Text that was truncated, overlapped, or otherwise became unreadable. (WCAG 1.4.4)
  6. For any text that did not take on high-contrast colors (except logos, decorative, or duplicative images), use the Colour Contrast Analyser (https://www.tpgi.com/color-contrast-checker) to sample the foreground (text) and background colors (see the computation sketch after this list). Check for and document:
    • Regular-sized text with a contrast ratio less than 4.5:1. (WCAG 1.4.3)
    • Large text (at least 24px, or at least 19px bold) with a contrast ratio less than 3:1. (WCAG 1.4.3)
  7. If any of the issues listed above are found, the visual test fails. In the ticket, clearly identify the element and which check it failed. In the case of color contrast failures, include the hexadecimal color codes of the foreground and background colors and the computed color contrast ratio. 
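
Where a sampled color pair needs to be verified manually, the contrast ratio can be computed from the WCAG relative-luminance formula. The following TypeScript sketch reproduces the calculation that tools such as Colour Contrast Analyser perform on the sampled hexadecimal colors:

  // WCAG relative luminance for one sRGB channel (0-255)
  function channel(c: number): number {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  }

  // Relative luminance of a hex color such as '#767676'
  function luminance(hex: string): number {
    const n = parseInt(hex.replace('#', ''), 16);
    return 0.2126 * channel((n >> 16) & 0xff)
         + 0.7152 * channel((n >> 8) & 0xff)
         + 0.0722 * channel(n & 0xff);
  }

  // Contrast ratio (lighter + 0.05) / (darker + 0.05); ranges from 1:1 to 21:1
  function contrastRatio(fg: string, bg: string): number {
    const [light, dark] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
    return (light + 0.05) / (dark + 0.05);
  }

  contrastRatio('#767676', '#ffffff'); // ≈ 4.54, so regular-sized text passes 4.5:1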

3. Automated Tests

Automated accessibility testing may be included as part of a continuous integration and delivery (CI/CD) pipeline, or performed using a browser extension. Automated testing is typically performed one page/screen at a time. If a screen has multiple states, such as expanding/collapsing sections or pop-up dialogs, it may be necessary to run the automated test multiple times in the different states.
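
As one possible pipeline integration (the package and URL below are assumptions for illustration, not a mandated setup), the axe-core engine that powers axe DevTools can be run from an automated test. This TypeScript sketch uses @axe-core/playwright and fails the build on any detected WCAG A/AA violation:

  import { test, expect } from '@playwright/test';
  import { AxeBuilder } from '@axe-core/playwright';

  test('screen has no detectable WCAG A/AA violations', async ({ page }) => {
    await page.goto('https://example.illinois.gov/screen-under-test'); // hypothetical URL
    const results = await new AxeBuilder({ page })
      .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa']) // limit the scan to WCAG A/AA rules
      .analyze();
    expect(results.violations).toEqual([]); // any violation fails the pipeline
  });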

  1. If necessary, install the Accessibility Insights or axe DevTools browser extension.
  2. Browse to the screen to be tested. If there are elements of the screen that are not initially displayed, such as collapsed sections or pop-up dialogs, display as many of them as possible. (If there are elements that cannot be displayed simultaneously, re-run the test in each state.)
  3. Open the Accessibility Insights or axe DevTools extension and run FastPass > Automated checks or “Scan ALL of my page”, respectively.
  4. Check any failed instances/issues with keyboard, visual, and assistive technology tests to determine their impact and rule out false positives. 
  5. If there are issues regarding ARIA, or any other issues that are unclear or questionable, refer them to the DoIT Office of Information Accessibility (DoIT.Accessibility@Illinois.gov) before reporting.
  6. If there are issues that are not false positives and do not require review by the DoIT Office of Information Accessibility, export the results (Accessibility Insights) or copy and paste the top line issue descriptions (axe) into a single ticket per screen.

4. Assistive Technology Tests

Assistive technology testing should only be performed by testers who have been trained to use assistive technology tools. Testers must be certain not to “cheat,” for example by using the mouse or visually reading information on the screen when testing with a screen reader. Do not test with assistive technology unless you are completely confident in your ability to use it as it would be used by someone with a disability.

Assistive technology testing should be performed after automated, visual, and keyboard testing has been completed and passed. Assistive technology testing is normally performed only on a representative sample of screens/pages. The representative sample should be selected by someone with knowledge of the scope and functions of the application and should include:

  • Screens required for the most essential functions of the application, and
  • Examples of each design pattern or interface component not present on the essential screens.

All findings for a given screen may be reported in a single ticket OR findings may be reported individually, including steps to reproduce, depending on complexity and requirements of the system developer/vendor.

  • Screen Reader - Screen reader testing should be performed using the browser and screen reader most likely to be used by users of the system, i.e., Edge & JAWS for internal and Chrome & NVDA for public applications. For details, see Screen Reader Testing (https://doit.illinois.gov/initiatives/accessibility/testing/screen-reader)
  • Screen Magnifier - Screen magnifier testing should be performed using ZoomText with magnification set to 4x, color enhancement active, and speech disabled.
  • Speech Recognition - Speech recognition testing should be performed using Dragon Professional or Windows Speech Recognition. Voice training should be completed, and recognition accuracy confirmed, before testing. Preference should be given to commands targeting specific elements, such as “click first name.” Mouse movement commands, including mouse grid, should be used only if specific commands do not work.

Documentation

When reporting multiple issues in a single ticket, provide a list of all the issues in a format such as:

Issue | Standard | Impact
Concise description including element(s) affected; Recommendation (optional) | x.x.x (version level) | see scale above

For example:

Issue | Standard | Impact
Language of the page is not specified; add html lang="en" | 3.1.1 (2.0 A) | Low
Country dropdown causes page to reload when selected option is changed; add a ‘select’ button or attach to blur instead of change event | 3.2.2 (2.0 A) | High
Submit button does not receive keyboard focus; remove tabindex="-1" | 2.1.1 (2.0 A) | Critical
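
To illustrate the correction recommended in the second example row, the TypeScript sketch below (element IDs and URL scheme are hypothetical) navigates only on an explicit button activation rather than on the dropdown’s change event, avoiding the unexpected context change (WCAG 3.2.2):

  const country = document.querySelector<HTMLSelectElement>('#country');
  const go = document.querySelector<HTMLButtonElement>('#country-go');
  // Navigate on an explicit activation, never on 'change'
  go?.addEventListener('click', () => {
    if (country) {
      window.location.assign(`/region/${encodeURIComponent(country.value)}`); // hypothetical URL
    }
  });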

When issues must be reported individually, use the following format:

Title: Application - Screen - Issue (YYYY-MM-DD)
URL: URL
OS: Name and version
Browser: Name and version
Testing Tools: Name(s) and version(s)
Screenshot:
Violations: WCAG criteria number, version, level, short name
Impact: see scale above
Steps to Reproduce:
  1.
  2.
  3. ...
Expected Behavior:
  1.
  2.
  3. ...
Notes: Optional, e.g., code snippets and/or recommended corrections

If necessary, a screen recording of the Steps to Reproduce in MP4 format may also be included.
