Accessibility Tool Vendor Scorecard

TLDR

Accessibility vendors vary wildly in what they actually detect, how they fix issues, and what they charge. This scorecard gives you a structured way to compare them before committing to a contract or annual plan.

How to Use This Scorecard

Rate each vendor on a 1-5 scale across six categories. A vendor that scores 5 in detection but 1 in remediation approach might create more problems than it solves. The goal is finding the tool that best fits your team’s technical ability, budget, and compliance timeline.

This scorecard works whether you’re evaluating tools for a single site or comparing vendors for an agency managing dozens of client properties. Adjust the weight of each category based on what matters most to your situation; a small weighted-scoring sketch follows the scale below.

Scoring scale:

  • 5 — Best in class, no significant gaps
  • 4 — Strong, minor limitations
  • 3 — Adequate, notable trade-offs
  • 2 — Weak, significant gaps
  • 1 — Fails to deliver on this category
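To make the weighting concrete, here is a minimal TypeScript sketch of the weighted average. The weights and the last two category names are illustrative assumptions: this excerpt names detection coverage, remediation approach, reporting, and pricing, so "support" and "compliance" are placeholders for whatever your remaining two categories are.

```typescript
// Hypothetical category names and weights — substitute your own six.
type Category =
  | "detection" | "remediation" | "reporting"
  | "pricing" | "support" | "compliance";

const weights: Record<Category, number> = {
  detection: 3,   // High weight
  remediation: 3, // High weight
  reporting: 2,
  pricing: 2,
  support: 1,     // placeholder category
  compliance: 2,  // placeholder category
};

// Ratings use the 1-5 scale defined above.
function weightedScore(ratings: Record<Category, number>): number {
  let total = 0;
  let weightSum = 0;
  for (const cat of Object.keys(weights) as Category[]) {
    total += ratings[cat] * weights[cat];
    weightSum += weights[cat];
  }
  return total / weightSum; // normalized back onto the 1-5 scale
}

console.log(weightedScore({
  detection: 5, remediation: 1, reporting: 4,
  pricing: 3, support: 4, compliance: 3,
}).toFixed(2)); // "3.23"
```

Dividing by the weight sum keeps the result on the same 1-5 scale as the individual ratings, and it makes the trade-off above visible: a vendor scoring 5 in detection but 1 in remediation still lands mid-pack.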

Category 1: Detection Coverage (Weight: High)

Detection is the foundation. A tool that misses issues gives you false confidence, which is worse than having no tool at all.

What to evaluate:

  • Automated rule coverage: How many WCAG 2.2 AA success criteria does automated scanning cover? Most tools detect 30-40% of WCAG criteria automatically. Any vendor claiming higher than 50% automated coverage should explain their methodology.
  • Manual test guidance: Does the tool flag issues that need human review and provide instructions for how to verify them? Color contrast in images, reading order, and cognitive clarity all require human judgment.
  • Page coverage: Can it scan your full site, or only pages you manually submit? JavaScript-rendered content (SPAs, React/Angular apps) is especially tricky. Test the scanner on your most complex page.
  • Component-level detection: Does it identify which specific HTML element has the issue, or just flag the page? Element-level targeting saves hours in remediation.
  • False positive rate: Run the tool on pages you’ve already manually audited. Issues it flags that aren’t real problems waste remediation time; issues it misses that you know exist undercut its coverage claims. The scanning sketch after this list shows one way to run this comparison.
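One low-cost way to probe JS-rendering support and false positives yourself is to drive the open-source axe-core engine through Playwright and diff its findings against your manual audit. A sketch, assuming the @axe-core/playwright package and a placeholder URL:

```typescript
import { chromium } from "playwright";
import { AxeBuilder } from "@axe-core/playwright";

async function scan(url: string) {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle" }); // let the SPA finish rendering

  // Restrict to WCAG 2.x A/AA rules; axe tags each rule with the criteria it maps to.
  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa", "wcag21aa", "wcag22aa"])
    .analyze();

  for (const violation of results.violations) {
    // Element-level targeting: each node carries a CSS selector for the failing element.
    console.log(violation.id, violation.impact, violation.nodes.map((n) => n.target));
  }
  await browser.close();
  return results.violations;
}

// Diff these findings against your manual audit to estimate false positives
// (flagged but not real) and false negatives (real but missed).
scan("https://example.com/your-most-complex-page").catch(console.error);
```

The same baseline also makes vendor demos easier to judge: any tool that finds fewer element-level issues on your most complex page than a free engine does should have a good explanation.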

Score 5 if: Automated scanning covers 40%+ of WCAG 2.2 AA success criteria, handles JS-rendered content, flags manual-review items with clear guidance, and keeps the false positive rate low.

Score 1 if: It checks only a handful of rules, can’t handle SPAs, produces many false positives, and offers no manual test guidance.

| Vendor | Automated Rules | JS Rendering | Manual Guidance | False Positive Rate | Score (1-5) |
|---|---|---|---|---|---|
| accessiBe | 30-40% of criteria | Yes (overlay) | Limited | Moderate | 2 |
| AudioEye | ~50% of issues | Yes (overlay) | Expert-supplemented | Moderate | 3 |
| UserWay | 30-40% of criteria | Yes (overlay) | Limited | Moderate | 2 |
| EqualWeb | 30-40% (self-acknowledged) | Yes (overlay) | Limited | Moderate | 2 |
| Deque axe | 57% of issues, 80% with IGTs | Yes | Extensive (IGTs) | Low | 5 |
| Siteimprove | ~30% of criteria | Yes | Enterprise guidance | Low | 4 |
| WAVE | ~30% of criteria | No (manual only) | Visual overlay | Low | 3 |
| Pope Tech | ~30% of criteria (WAVE engine) | No | Limited | Low | 3 |
| Level Access | ~30% + extensive manual | Yes | Expert-led | Low | 4 |
| A11yProof | 80%+ detectable issues (3-pass AI) | Yes | AI-generated fixes | Low | 4 |
