Web Accessibility Testing: A Practical Guide

Make your website usable for everyone with systematic accessibility testing

More than one billion people worldwide live with a disability. If your website isn't accessible, you're excluding roughly 15% of the global population and potentially facing legal liability: accessibility lawsuits in the US increased about 350% between 2020 and 2025. But accessibility isn't just about compliance; it improves usability for everyone. Captions help non-native speakers, keyboard navigation helps power users, and high contrast helps people reading in bright sunlight. This guide shows you how to test your site and keep it accessible.

Understanding WCAG Standards

WCAG (Web Content Accessibility Guidelines) is the international standard for web accessibility. It defines three conformance levels and four core principles (POUR).

The four POUR principles:

  • Perceivable: Users must be able to perceive information. Provide text alternatives for images, captions for videos, ensure sufficient color contrast.
  • Operable: Users must be able to operate the interface. Ensure keyboard navigation, provide enough time to interact, avoid seizure-inducing content.
  • Understandable: Content and operation must be understandable. Use clear language, predictable navigation, help users avoid/correct errors.
  • Robust: Content must work with current and future technologies, including assistive technologies. Use semantic HTML, valid code.

WCAG conformance levels:

  • Level A (minimum): Basic accessibility. Failure to meet Level A means significant barriers for users with disabilities. Examples: Provide alt text, ensure keyboard access.
  • Level AA (target): Recommended standard for most organizations. Required by many laws (ADA, Section 508, EU accessibility directive). Examples: 4.5:1 color contrast, multiple ways to find pages, clear focus indicators.
  • Level AAA (enhanced): Highest level. Often impractical to achieve across an entire site. Examples: 7:1 color contrast, sign language interpretation for videos. Consider meeting select AAA criteria on critical pages (checkout, forms).

Legal requirements by region:

  • USA: The ADA applies to businesses open to the public; Section 508 covers federal sites. The ADA doesn't name a specific WCAG level, but courts commonly treat WCAG 2.1 AA as the standard.
  • European Union: The Web Accessibility Directive requires WCAG 2.1 AA for public-sector sites, and the European Accessibility Act extends accessibility requirements to many private businesses.
  • UK: Equality Act 2010 requires accessible websites. WCAG 2.1 AA is the standard.
  • Canada: Accessible Canada Act and provincial laws. WCAG 2.0 AA minimum.

Automated Accessibility Testing Tools

Automated tools catch 30-40% of accessibility issues—things like missing alt text, color contrast failures, and invalid HTML. They're a crucial first step but can't replace manual testing.

Essential automated tools:

  • Lighthouse (Chrome DevTools, free): Built into Chrome. Audits accessibility, performance, SEO. Generates score and specific issues. Run: DevTools > Lighthouse > Accessibility.
  • axe DevTools (browser extension, free): Comprehensive automated testing. Highlights issues directly on page. Provides fix guidance. Most accurate automated tool. Available for Chrome, Firefox, Edge.
  • WAVE (browser extension, free): Visual feedback tool. Shows icons on page for errors, alerts, features. Great for learning what to fix. Available for Chrome, Firefox.
  • Pa11y (CLI tool, free): Command-line testing for CI/CD integration. Test multiple pages automatically on deploy. Great for developers.
  • Accessibility Insights (Microsoft, free): Comprehensive tool with guided assessments. Includes automated and manual testing workflows.

Common issues automated tools detect:

  • Missing alt attributes on images
  • Insufficient color contrast (text vs. background)
  • Missing form labels
  • Invalid HTML (unclosed tags, duplicate IDs)
  • Missing page language attribute (<html lang="en">)
  • Headings out of order (H1 → H4, skipping H2-H3)
  • Missing link text ("click here" vs. "read our accessibility policy")
  • Images of text instead of actual text
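Most of these findings have one-line fixes in markup. A sketch of common before/after corrections (file names, ids, and copy are illustrative):

```html
<!-- Declare the page language so screen readers pick the right voice -->
<html lang="en">

<!-- Informative image: describe the content, not the file -->
<img src="chart-q3.png" alt="Q3 revenue grew 12% over Q2">

<!-- Decorative image: empty alt so screen readers skip it entirely -->
<img src="divider.png" alt="">

<!-- Every input gets a programmatically associated label -->
<label for="email">Email address</label>
<input id="email" type="email" name="email">

<!-- Link text describes the destination -->
<a href="/accessibility">Read our accessibility policy</a>
```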

Running automated tests:

  • Test multiple pages: Don't just test homepage. Test forms, article pages, product pages, checkout flow—every template type.
  • Test in authenticated states: Login and test dashboard/account pages. Many issues hide behind authentication.
  • Test dynamic content: Modals, dropdowns, tooltips, tabs—interactions that change page content.
  • Integrate into CI/CD: Run Pa11y or axe-core in your deployment pipeline. Fail builds with critical accessibility errors.
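As a sketch of the CI/CD bullet above, assuming you use pa11y-ci (the CI wrapper around Pa11y), a minimal `.pa11yci` config might look like this; the URLs are placeholders for your own template types:

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000
  },
  "urls": [
    "https://example.com/",
    "https://example.com/signup",
    "https://example.com/checkout"
  ]
}
```

Running `npx pa11y-ci` in the pipeline exits non-zero when errors are found, which fails the build.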

Manual Testing and What Automation Misses

Automated tools can't judge if alt text is meaningful, if focus order is logical, or if content makes sense to screen reader users. Manual testing is essential and catches the majority of accessibility issues.

Keyboard navigation testing:

  • Tab through entire page: Can you reach all interactive elements (links, buttons, form fields) using Tab key? Are any elements unreachable?
  • Check focus indicators: Is there a visible focus ring/outline? Does it have sufficient contrast (3:1 minimum)? Never set outline: none; without providing custom focus styles.
  • Logical tab order: Does tab order follow visual order? Can you complete tasks (fill form, checkout) without mouse?
  • Skip links: Does page have "Skip to content" link for keyboard users to bypass navigation? Test by pressing Tab on page load.
  • Modal focus trapping: When modal opens, does focus move to modal? Does Tab stay within modal? Can you close with Escape key?
  • Dropdown menus: Can you open with Enter/Space? Navigate options with arrow keys? Select with Enter?
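A minimal skip link, as described above, is visually hidden until it receives keyboard focus (class names are illustrative):

```html
<style>
  .skip-link {
    position: absolute;
    left: -9999px;        /* hidden off-screen by default */
  }
  .skip-link:focus {
    left: 8px;            /* revealed when reached via Tab */
    top: 8px;
  }
</style>
<a class="skip-link" href="#main">Skip to content</a>
<nav>…site navigation…</nav>
<!-- tabindex="-1" lets focus land on main when the link is activated -->
<main id="main" tabindex="-1">…page content…</main>
```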

Keyboard testing checklist:

  • Tab, Shift+Tab: Navigate forward/backward
  • Enter: Activate links and buttons
  • Space: Activate buttons, toggle checkboxes
  • Arrow keys: Navigate dropdowns, radio groups, sliders
  • Escape: Close modals, cancel actions
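Several of these keys come together in a modal dialog. A minimal sketch of focus trapping in plain JavaScript; a production dialog needs more (restoring focus on close, inerting the background), and the selector list is simplified:

```html
<div id="modal" role="dialog" aria-modal="true" aria-labelledby="modal-title" hidden>
  <h2 id="modal-title">Confirm order</h2>
  <button id="confirm">Confirm</button>
  <button id="cancel">Cancel</button>
</div>
<script>
  const modal = document.getElementById('modal');
  const focusables = () =>
    modal.querySelectorAll('a[href], button, input, select, textarea, [tabindex]:not([tabindex="-1"])');

  modal.addEventListener('keydown', (e) => {
    if (e.key === 'Escape') {          // Escape closes the modal
      modal.hidden = true;
      return;
    }
    if (e.key !== 'Tab') return;
    const items = focusables();
    const first = items[0];
    const last = items[items.length - 1];
    // Wrap Tab / Shift+Tab so focus stays inside the dialog
    if (e.shiftKey && document.activeElement === first) {
      e.preventDefault();
      last.focus();
    } else if (!e.shiftKey && document.activeElement === last) {
      e.preventDefault();
      first.focus();
    }
  });

  function openModal() {
    modal.hidden = false;
    focusables()[0].focus();           // move focus into the dialog on open
  }
</script>
```

Where browser support allows, the native `<dialog>` element opened with `showModal()` provides focus containment and Escape handling without custom code.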

Content and structure testing:

  • Heading structure: Use browser extension to view heading outline. Do headings create logical hierarchy? Is there one H1 per page? No skipped levels?
  • Alt text quality: Do images have meaningful alt text? Decorative images should have empty alt (alt=""), not missing alt. Complex images (graphs, diagrams) need longer descriptions.
  • Link text: Do links describe destination? "Click here" and "Read more" without context fail accessibility. Should be "Read our privacy policy" or "Learn more about pricing."
  • Form labels: Every input must have associated label. Placeholder text alone doesn't count. Use <label for="..."> or aria-label.
  • Error messages: Are form errors clearly announced? Associated with fields? Provide guidance on how to fix?
  • Landmarks: Is page divided into semantic regions (header, nav, main, aside, footer)? Screen readers use these for navigation.
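The label, error, and landmark checks above correspond to markup patterns like these (ids and copy are illustrative):

```html
<header>…logo, site title…</header>
<nav aria-label="Main">…</nav>
<main>
  <form>
    <label for="card">Card number</label>
    <!-- aria-describedby ties the error to the field; aria-invalid flags it -->
    <input id="card" inputmode="numeric" aria-describedby="card-error" aria-invalid="true">
    <p id="card-error" role="alert">Card number must be 16 digits.</p>
  </form>
</main>
<footer>…</footer>
```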

Screen Reader Testing

Screen readers are assistive technology that reads page content aloud and provides keyboard navigation. Testing with screen readers reveals how blind and low-vision users experience your site.

Popular screen readers by platform:

  • NVDA (Windows, free): Most popular free screen reader. Used by 40%+ of screen reader users. Pairs with Firefox or Chrome.
  • JAWS (Windows, paid): Most feature-rich. Used by 50%+ of professionals. Expensive ($1,000+) but offers trial. Pairs with Chrome or Edge.
  • VoiceOver (macOS/iOS, free): Built into Apple devices. 20%+ of screen reader users. Activate: Cmd+F5 on Mac; on iOS, triple-click the side (or home) button.
  • TalkBack (Android, free): Built into Android. Growing market share. Activate in Settings > Accessibility.

Basic screen reader testing steps:

  • Learn basic commands: Each screen reader has different shortcuts. NVDA/JAWS use Insert key, VoiceOver uses Ctrl+Option. Spend 30 minutes learning basics before testing.
  • Navigate by headings: Press H (NVDA/JAWS) or VO+Cmd+H (VoiceOver) to jump between headings. Can users skim content this way?
  • Navigate by landmarks: Press D (NVDA) or R (JAWS), or use the rotor with VO+U (VoiceOver), to navigate by regions. Are all page sections accessible?
  • Tab through links: Does screen reader announce link purpose clearly? Are link destinations obvious?
  • Fill out forms: Can you complete forms without seeing screen? Are labels announced? Are errors clear?
  • Test dynamic content: When modal opens, does screen reader announce it? Are updates to page content announced (via ARIA live regions)?

Common screen reader issues:

  • Unlabeled buttons/links: Icon-only buttons without aria-label are announced as "button" with no context.
  • Images read as filename: Missing alt text causes screen reader to read "IMG_2847.jpg" instead of meaningful description.
  • Form inputs without labels: Screen reader can't announce field purpose, making forms impossible to complete.
  • Focus moving unexpectedly: JavaScript that moves focus without user action is disorienting.
  • Content updates not announced: Dynamic content changes (filtering results, adding to cart) happen silently unless ARIA live regions are used.
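The last issue above is typically fixed with an ARIA live region: a container whose text changes are announced automatically. A sketch, with illustrative ids:

```html
<!-- polite: announced when the screen reader is idle; use assertive sparingly -->
<p id="cart-status" aria-live="polite"></p>
<script>
  function addToCart(item) {
    // ...update the cart model here...
    document.getElementById('cart-status').textContent =
      item + ' added to cart';   // the text change triggers the announcement
  }
</script>
```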

Color and Visual Testing

Color contrast and visual presentation affect users with low vision, color blindness, and anyone viewing in bright sunlight or on poor displays.

Contrast ratio testing:

  • Minimum standards: 4.5:1 for normal text, 3:1 for large text (18pt+ or 14pt+ bold), 3:1 for UI components.
  • Tools: WebAIM Contrast Checker (manual), Chrome DevTools (shows ratios when inspecting elements), axe DevTools (scans entire page).
  • Common failures: Gray text on white (#999 on #fff fails). White text on light brand colors. Links that don't have sufficient contrast with surrounding text.
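The ratio itself is straightforward to compute from the WCAG 2.x definition: the relative luminance of the lighter color plus 0.05, divided by that of the darker plus 0.05. A minimal sketch in JavaScript:

```javascript
// WCAG relative luminance of an sRGB hex color like "#999999"
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize each sRGB channel per the WCAG 2.x formula
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio between two colors: 1 (identical) up to 21 (black on white)
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio('#000000', '#ffffff').toFixed(2)); // "21.00"
console.log(contrastRatio('#999999', '#ffffff').toFixed(2)); // ~2.85, fails 4.5:1
```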

Color blindness testing:

  • Don't rely on color alone: If error states are only shown in red, colorblind users can't distinguish them. Add icons or text labels.
  • Use simulators: Browser extensions like "Colorblind - Dalton" show how your site looks to users with different types of color blindness.
  • Test graphs/charts: Pie charts and graphs that rely solely on color fail. Use patterns, labels, or direct labeling.

Zoom and reflow testing:

  • Zoom to 200%: WCAG requires content to be usable at 200% zoom without horizontal scrolling. Test: Ctrl/Cmd + to zoom in browser.
  • Check mobile reflow: Does content reflow properly on narrow screens? Fixed-width layouts often fail.
  • Text spacing: Users should be able to adjust line height, letter spacing, word spacing via browser extensions. Test with "Stylus" extension to apply custom CSS.
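The text-spacing criterion (WCAG 1.4.12) specifies exact values to test with. Apply CSS like this via a user-styles extension and confirm no text clips or overlaps:

```css
/* WCAG 1.4.12 text-spacing overrides */
* {
  line-height: 1.5 !important;        /* 1.5x font size */
  letter-spacing: 0.12em !important;  /* 0.12x font size */
  word-spacing: 0.16em !important;    /* 0.16x font size */
}
p {
  margin-bottom: 2em !important;      /* spacing after paragraphs: 2x font size */
}
```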

Building an Accessibility Testing Workflow

Accessibility testing should be continuous, not a one-time audit. Integrate it into your development process.

Development phase:

  • Design reviews: Review mockups for color contrast, clear focus indicators, touch target sizes (minimum 44x44px) before development.
  • Component testing: Test each UI component (button, form, modal) for accessibility before integrating into pages.
  • Automated tests in CI/CD: Run Pa11y or axe-core on pull requests. Fail PRs with critical errors.
  • Code reviews: Check for semantic HTML, ARIA attributes, keyboard event handlers alongside functional code.

QA phase:

  • Automated scan: Run full-site scan with axe DevTools or WAVE. Create tickets for all issues.
  • Manual keyboard testing: Tab through all pages and flows. Document any unreachable elements or broken focus order.
  • Screen reader spot checks: Test critical paths (homepage, sign-up, checkout) with NVDA or VoiceOver.
  • Contrast validation: Scan all pages for contrast failures. Fix before launch.

Post-launch:

  • Monthly audits: Run automated scans to catch regressions from new features.
  • User feedback: Provide accessibility feedback mechanism. Users with disabilities will find issues you miss.
  • Regular manual testing: Quarterly keyboard and screen reader testing on key flows.
  • Stay updated: WCAG updates periodically. Subscribe to A11y Weekly newsletter for latest guidance.

Prioritizing accessibility fixes:

  • Critical (fix immediately): Keyboard traps, contrast failures on body text, forms impossible to complete, missing alt on informative images.
  • High (fix within sprint): Missing focus indicators, poor heading structure, unlabeled form inputs, confusing link text.
  • Medium (schedule): Contrast failures on non-essential elements, missing skip links, minor ARIA issues.
  • Low (backlog): Enhancements beyond WCAG AA, AAA level improvements, nice-to-haves.

Accessibility Statement and Documentation

An accessibility statement demonstrates commitment, provides transparency about limitations, and offers users a way to report issues.

What to include in accessibility statement:

  • Conformance level: "We aim to conform to WCAG 2.1 Level AA."
  • Known issues: Be transparent about current limitations and timeline for fixes.
  • Feedback mechanism: Email or form for users to report accessibility barriers.
  • Testing methods: Automated tools used, assistive technologies tested.
  • Last updated date: Show statement is actively maintained.
  • Contact information: Who to reach for accessibility concerns.

Accessibility statement template resources: W3C provides templates at w3.org/WAI/planning/statements/. Adapt to your organization.

Need Help with Accessibility Testing?

Accessibility testing requires expertise, the right tools, and systematic processes. We conduct comprehensive accessibility audits, provide detailed remediation guidance, and help build accessibility into your development workflow.

Get Accessibility Audit