Accessibility Testing Standards
Revised 9/3/2024
Standards
As required by the Illinois Information Technology Accessibility Act (IITAA) and the Americans with Disabilities Act (ADA), testing is performed to confirm conformance with:
- Web Content Accessibility Guidelines (WCAG) 2.1 Level AA (https://www.w3.org/TR/WCAG21)
- Accessible Rich Internet Applications (ARIA) 1.2 (https://www.w3.org/TR/wai-aria-1.2)
WCAG 2.2 and Level AAA criteria may be included and will be identified in reports.
Testing Tools
Accessibility testing will be performed with the following tools/techniques:
- Accessibility Insights for Web (https://accessibilityinsights.io/downloads)
- Colour Contrast Analyser (https://www.tpgi.com/color-contrast-checker)
- High Contrast & Zoom (https://doit.illinois.gov/initiatives/accessibility/testing/visual)
- Keyboard Commands (https://doit.illinois.gov/initiatives/accessibility/testing/keyboard)
Additional testing may be performed by trained testers using the following assistive technology tools:
- NVDA screen reader (https://www.nvaccess.org)
- JAWS screen reader (https://www.freedomscientific.com/products/software/jaws)
- ZoomText screen magnifier (https://www.freedomscientific.com/products/software/zoomtext)
- Dragon speech recognition (https://www.nuance.com/dragon/business-solutions/dragon-professional)
Impact Ratings
If functional impact ratings are provided, the following scale will be used:
- Critical – Will prevent some users from completing essential task(s).
- High – Will be difficult for some users, but task(s) can be completed.
- Med – May be confusing to some users, but task(s) can be completed.
- Low – Violates accessibility standards, but is unlikely to affect users.
Testing Process
Accessibility testing will include the following processes:
- Automated Tests
- Keyboard Tests
- Visual Tests
- Assistive Technology Tests
1. Automated Tests
Automated accessibility testing is useful but has some significant limitations:
- Current testing tools are able to test fewer than 60% of accessibility criteria, so additional manual testing is necessary.
- Current testing tools are susceptible to false positives, so all findings must be confirmed through manual testing.
Because of these limitations, the State of Illinois only uses automated testing to:
- Provide an initial overview of potential accessibility issues, and
- Identify elements that may need extended manual testing.
For automated testing, Illinois primarily uses Microsoft's Accessibility Insights for Web, as it (like other tools based on the axe-core engine) has a low incidence of false positives and is able to test within the shadow DOM.
To test with Accessibility Insights:
- Browse to the screen to be tested.
- If there are elements of the screen that are not initially displayed, such as collapsed sections or pop-up dialogs, display as many of them as possible. (If there are elements that cannot be displayed simultaneously, re-run the test in each state.)
- Open Accessibility Insights and select FastPass Automated Checks.
- Export or make note of the failed instances for confirmation with manual tests.
- If there are findings regarding ARIA, or any other issues that are unclear, refer them to the DoIT Office of Information Accessibility (DoIT.Accessibility@Illinois.gov) for confirmation.
Note: Do not simply copy and paste failure details directly into an accessibility report. Findings must be confirmed and, once confirmed, clearly documented in terms that developers will understand.
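Exported automated results can be triaged in bulk before manual confirmation. The sketch below assumes the standard axe-core results shape (a "violations" array whose entries carry an "id", an "impact", and "nodes" with CSS "target" selectors); the exact export format from Accessibility Insights may differ, so treat the field names as assumptions to verify against your export.

```python
def summarize_violations(results):
    """Group axe-core-style violations by rule for manual confirmation.

    Assumes the axe-core results shape: a "violations" list whose entries
    have "id", "impact", and "nodes" (each node carrying a "target"
    selector list). Verify these field names against your actual export.
    """
    summary = []
    for v in results.get("violations", []):
        targets = [", ".join(node["target"]) for node in v.get("nodes", [])]
        summary.append({
            "rule": v["id"],
            "impact": v.get("impact", "unknown"),
            "instances": len(targets),
            "targets": targets,
        })
    # Sort so the most widespread issues are confirmed first.
    return sorted(summary, key=lambda s: s["instances"], reverse=True)

if __name__ == "__main__":
    # Hypothetical exported results, for illustration only.
    sample = {
        "violations": [
            {"id": "color-contrast", "impact": "serious",
             "nodes": [{"target": ["#footer a"]}, {"target": [".nav > li"]}]},
            {"id": "image-alt", "impact": "critical",
             "nodes": [{"target": ["img.logo"]}]},
        ]
    }
    for row in summarize_violations(sample):
        print(f'{row["rule"]}: {row["instances"]} instance(s), impact {row["impact"]}')
```

Each grouped finding still requires manual confirmation before it is reported, per the note above.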
2. Keyboard Tests
Keyboard testing is performed using standard keyboard commands only – a mouse must not be used. Keyboard testing may be completed by following existing test scripts, substituting keyboard commands for mouse actions. Otherwise, keyboard testing should be performed by doing the following on each page/screen:
- Browse to the screen to be tested.
- Explore the screen with the mouse to identify and determine the function of all interactive elements.
- Click in the browser address bar and press enter to reload the page; then set the mouse aside.
- If necessary, press the Tab key several times to move focus past any browser toolbars and into the page.
- Once focus is in the page, use the following keyboard commands to move to and operate all interactive elements:
- Tab - Move to the next interactive element.
- Shift + Tab - Move backwards to the previous interactive element (when needed).
- Right or Left Arrow - Move horizontally through items in a menu bar, tab list, or grid.
- Up or Down Arrow - Move vertically through items in a dropdown, list box, radio button group, menu, or grid.
- Alt + Down Arrow - Open a dropdown list or calendar popup.
- Spacebar - Check a checkbox or click a button.
- Enter - Click a link.
- Alt + Left Arrow - After clicking a link, go back to the previous page.
- Other - For any other controls, look up the Keyboard Interactions in the ARIA Authoring Practices Guide (APG) (https://www.w3.org/WAI/ARIA/apg/patterns)
- At each step, check for and document any:
- Interactive elements that do not receive keyboard focus. (WCAG 2.1.1)
- Elements that do not show a visual indicator (e.g., outline) when focused. (WCAG 2.4.7)
- Elements that do not receive focus in a logical order (e.g., left to right, top to bottom). (WCAG 2.4.3)
- Elements that cause unexpected changes (e.g., reload the page) when they receive focus. (WCAG 3.2.1)
- Elements that cannot be operated using standard keyboard commands. (WCAG 2.1.1)
- Elements that cause unexpected changes when values are changed. (WCAG 3.2.2)
- Elements that trap focus (i.e., prevent focus from moving to the next element). (WCAG 2.1.2)
- If any of the issues listed above are found, report a failure. In the report, clearly identify the element involved, which check it failed, and any steps required to reproduce the failure.
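The focus-order check (WCAG 2.4.3) can be partially mechanized if the tester records each focus stop's on-screen position while tabbing. The helper below is an illustrative sketch, not part of the standard: element names and coordinates are hypothetical, and real layouts may have intentional exceptions that a human tester must judge.

```python
def follows_reading_order(focus_sequence):
    """Check that recorded focus stops follow top-to-bottom, left-to-right order.

    focus_sequence: list of (name, x, y) tuples in the order focus was
    received while pressing Tab. Returns the first out-of-order pair of
    element names, or None if the order appears logical.
    (Illustrative only: a human tester must confirm intent.)
    """
    for prev, cur in zip(focus_sequence, focus_sequence[1:]):
        _, px, py = prev
        _, cx, cy = cur
        # A later stop should be on a lower row, or further right on the same row.
        if cy < py or (cy == py and cx < px):
            return (prev[0], cur[0])
    return None

if __name__ == "__main__":
    # Hypothetical focus stops recorded during a keyboard pass.
    stops = [("First name", 10, 10), ("Last name", 200, 10), ("Submit", 10, 50)]
    print(follows_reading_order(stops))  # None: order is logical
```

Any pair flagged by a check like this would still be confirmed manually and reported with steps to reproduce, as described above.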
3. Visual Tests
Visual testing is performed using a combination of a Windows High Contrast color theme and browser Zoom:
- Set your display resolution or browser window to 1280 px wide.
- Press left Alt + left Shift + PrtSc (Print Screen). If prompted to turn on High Contrast, click Yes. By default, Windows will activate a theme with a black background, white text, and blue or yellow links. (If High Contrast does not activate, check Windows Control Panel > Ease of Access Center > Make the computer easier to see, and confirm “Turn on or off High Contrast…” is checked.)
- In the browser, open the Settings menu (Alt + F) and set Zoom to 200%.
- Browse to the screen to be tested.
- If there are elements of the screen that are not initially displayed, such as collapsed sections or pop-up dialogs, display as many of them as possible.
- Visually review all elements of the screen except logos, decorative (meaningless) images, or images of text that are duplicative of actual text provided elsewhere on the screen. Check for:
- Text that did not increase in size by 200%. (WCAG 1.4.4)
- Text that did not take on high-contrast colors. (WCAG 1.4.5)
- Text that was cut off, overlapped, or otherwise became unreadable. (WCAG 1.4.4)
- Information that was conveyed by color that is no longer distinguishable by any means. (WCAG 1.4.1)
- Press left Alt + left Shift + PrtSc to turn off High Contrast.
- Follow the instructions in the Color Contrast Guide to measure the contrast of any color combinations except in logos, decorative images, duplicative images, or disabled elements. Check for and document:
- Regular-sized text with a contrast ratio less than 4.5:1. (WCAG 1.4.3)
- Large text (24px or 19px bold, or larger) with a contrast ratio less than 3:1. (WCAG 1.4.3)
- User interface components (e.g., links, buttons, etc.) or graphics (e.g., icons, charts, etc.) with a contrast ratio less than 3:1 (WCAG 1.4.11)
- If any of the issues listed above are found, report a failure. In the report, clearly identify the element involved, which check it failed, and any steps required to reproduce the failure. For color contrast failures, include the hexadecimal color codes of the foreground and background colors and the computed color contrast ratio.
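The contrast ratios above come from WCAG's relative-luminance definition: each sRGB channel is linearized, luminance is a weighted sum of the channels, and the ratio is (L_lighter + 0.05) / (L_darker + 0.05). A tool such as Colour Contrast Analyser computes this for you; the sketch below shows the underlying math for reports that need the computed ratio alongside the hex codes.

```python
def srgb_to_linear(channel):
    """Linearize one sRGB channel (0-255) per the WCAG relative-luminance formula."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance of a hex color such as "#767676"."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def contrast_ratio(foreground, background):
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = relative_luminance(foreground), relative_luminance(background)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

if __name__ == "__main__":
    # Black on white is the maximum possible ratio, 21:1.
    print(round(contrast_ratio("#000000", "#FFFFFF"), 2))
```

For example, gray #767676 on white computes to roughly 4.5:1, right at the Level AA threshold for regular-sized text.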
4. Assistive Technology Tests
Assistive technology testing should only be performed by testers who have been trained to use assistive technology tools. Testers must be certain not to “cheat,” for example by using the mouse or visually reading information on the screen when testing with a screen reader. Do not test with assistive technology unless you are completely confident in your ability to use it as it would be used by someone with a disability.
Assistive technology testing should be performed after automated, keyboard, and visual testing has been completed and passed. Assistive technology testing is normally performed only on a representative sample of screens/pages. The representative sample should be selected by someone with knowledge of the scope and functions of the application and should include:
- Screens required for the most essential functions of the application, and
- Examples of each design pattern or interface component not present on the essential screens.
All findings for a given screen may be reported in a single ticket OR findings may be reported individually, including steps to reproduce, depending on complexity and requirements of the system developer/vendor.
- Screen Reader - Screen reader testing should be performed using the browser and screen reader most likely to be used by users of the system, i.e., Edge with JAWS for internal and Chrome with NVDA for public applications. For details, see Screen Reader Testing (https://doit.illinois.gov/initiatives/accessibility/testing/screen-reader)
- Screen Magnifier - Screen magnifier testing should be performed using ZoomText with magnification set to 4x, color enhancement active, and speech disabled.
- Speech Recognition - Speech recognition should be performed using Dragon Professional and/or Windows Speech Recognition. Voice training should be completed and recognition accuracy confirmed before testing. Preference should be given to commands targeting specific elements, such as “click first name.” Mouse movement commands, including mouse grid, should be used only if specific commands do not work.
Documentation
When reporting multiple issues in a single ticket, provide a list of all the issues in a format such as:
# | Issue | WCAG | Impact |
---|---|---|---|
# | Clear, concise identification of the specific element(s) affected and the problem with the element(s). Optionally, a recommendation for fixing the problem. | x.x.x | See scale above |
For example:
# | Issue | WCAG | Impact |
---|---|---|---|
1 | The page title "Untitled" does not indicate the topic of the page. | 2.4.2 | Med |
2 | The "Office Supply" logo image does not have a text alternative. The image should have alt="Office Supply". | 1.1.1 | High |
3 | The "Submit" button does not receive keyboard focus and cannot be operated using keyboard commands. The tabindex="-1" attribute should be removed. | 2.1.1 | Critical |
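A multi-issue table in the format above can be generated from structured findings rather than assembled by hand. The helper below is a hypothetical convenience, not part of the standard; the dictionary field names ("issue", "wcag", "impact") are illustrative.

```python
def format_issue_table(findings):
    """Render findings as the pipe-delimited multi-issue table format above.

    findings: list of dicts with "issue", "wcag", and "impact" keys
    (field names are illustrative, not part of the standard).
    """
    lines = ["# | Issue | WCAG | Impact |", "---|---|---|---|"]
    for n, finding in enumerate(findings, start=1):
        lines.append(
            f'{n} | {finding["issue"]} | {finding["wcag"]} | {finding["impact"]} |'
        )
    return "\n".join(lines)

if __name__ == "__main__":
    print(format_issue_table([
        {"issue": 'The page title "Untitled" does not indicate the topic of the page.',
         "wcag": "2.4.2", "impact": "Med"},
    ]))
```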
When issues must be reported individually, use the following format:
Title | Application Name - Screen Name - Issue (YYYY-MM-DD) |
---|---|
URL | URL |
OS | Name and version |
Browser | Name and version |
Testing Tools | Name(s) and version(s) |
Screenshot | 1280 x 720 resolution, including browser address bar. Include additional screenshots to show lower parts of the page if necessary. Do not show automated testing tool results or highlights. |
Violations | WCAG criteria number and short name |
Impact | See scale above |
Steps to Reproduce | |
Expected Behavior | |
Notes | Optional, e.g., code snippets and/or recommended corrections |
If necessary, a screen recording of the Steps to Reproduce in MP4 format may also be included.