TLDR
Most agencies do accessibility auditing inconsistently — some projects get a Lighthouse run, some get nothing, and clients get varying quality of documentation. A repeatable workflow fixes this: automated scan at project start, triage by severity, remediation tracking through development, and a final scan-to-report delivery.
- Accessibility audit
- A systematic evaluation of a website against WCAG (Web Content Accessibility Guidelines) criteria, identifying barriers that prevent users with disabilities from accessing content or completing tasks. Audits combine automated scanning (which catches structural failures) with manual testing (which catches interaction and judgment issues automated tools cannot evaluate).
- WCAG 2.1 AA
- The Web Content Accessibility Guidelines version 2.1 at conformance level AA. This is the standard most commonly referenced in ADA-related accessibility requirements, settlements, and policy frameworks. AA conformance includes both Level A (the most critical barriers) and Level AA requirements.
- Remediation
- The process of fixing accessibility issues identified in an audit. Remediation may include code changes (adding alt text, fixing form labels, correcting heading structure), design changes (improving color contrast), or content changes (rewriting ambiguous link text). Remediation is done in source code, not via runtime overlays.
- Triage
- The process of reviewing identified accessibility issues and prioritizing them for remediation by severity (critical issues that block users from completing key tasks vs. advisory issues that are minor friction). Effective triage ensures developer effort addresses the most impactful issues first.
Building an accessibility practice in a web agency starts with having a repeatable process. Without one, audits are inconsistent, clients get different quality levels depending on who ran the project, and accessibility becomes a source of scope creep rather than a packaged service.
This guide describes a workflow that agencies can implement systematically. It is built around automated scanning as the foundation — not because automated tools catch everything, but because they establish a consistent baseline and produce the structured output that makes everything else (triage, reporting, remediation tracking) manageable at scale.
Step 1: Establish Scope Before Scanning
Before scanning anything, document what you are auditing. For a full site audit: which pages, which user flows, which interactive components. For a maintenance retainer: which pages are in scope each month.
Scope creep in accessibility auditing is common. A site with 500 pages cannot be fully audited at launch pricing. Agencies should define what constitutes a representative sample — key user flows, high-traffic pages, transactional pages — and document that scoped definition for the client. This prevents disputes about whether an obscure archived page was in scope for the audit.
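A scoped definition is easiest to enforce when it is recorded as data rather than buried in an email thread. As a minimal sketch (in Python; the field names and example values are illustrative, not any tool's schema), the agreed scope might be captured like this:

```python
# Illustrative sketch: record the agreed audit scope as a single structure
# that can be shared with the client and checked against later.
# All field names and values here are hypothetical examples.

def define_scope(pages, flows, components):
    """Bundle the agreed audit scope into one record."""
    return {
        "pages": sorted(set(pages)),   # representative sample, not the full site
        "user_flows": flows,           # key transactional journeys
        "components": components,      # interactive widgets needing manual testing
    }

scope = define_scope(
    pages=["/", "/pricing", "/checkout", "/contact"],
    flows=["signup", "checkout"],
    components=["nav menu", "date picker"],
)
```

Deduplicating and sorting the page list up front means the same record can double as the crawl list for the scanner in Step 2 and as the scope appendix in the client report.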
Step 2: Run the Automated Scan
Automated scanning against WCAG 2.1 AA criteria catches the structural failures that are most consistent across sites: missing alternative text, improper form label associations, broken heading hierarchy, low color contrast, missing landmark regions, and some detectable focus management issues.
Set the scanner to crawl the scoped pages and return issues with:
- Element-level identification (which specific HTML element is failing)
- WCAG criterion reference (which rule the element fails, e.g., 1.4.3 Contrast)
- Severity level (critical, serious, moderate, advisory)
A11yProof returns this structure by default. axe DevTools and WAVE return similar formats. Lighthouse is oriented toward summary scores with thinner element-level detail — useful as an overview, not as a remediation guide.
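Whatever scanner produces the raw output, a small normalization step turns it into the element-level rows described above. The sketch below is in Python; the input shape loosely mirrors axe-core-style JSON results, and the output field names are this guide's own convention, not a standard:

```python
# Sketch: flatten scanner output into one row per failing element.
# Input shape loosely mirrors axe-core-style results (a list of violations,
# each with an id, an impact level, and the nodes that fail it).
# Output field names are illustrative, not a standard.

def flatten_results(page_url, violations):
    """Return one row per failing element: page, selector, criterion, severity."""
    rows = []
    for violation in violations:
        for node in violation["nodes"]:
            rows.append({
                "page": page_url,
                "selector": node["target"][0],   # first CSS selector in the chain
                "criterion": violation["id"],    # WCAG rule reference
                "severity": violation["impact"], # critical / serious / moderate / minor
            })
    return rows
```

One row per failing element, rather than one row per rule, is what makes the later triage and developer task list steps mechanical.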
Step 3: Triage and Prioritize
Raw scanner output is not a client report. Triage converts it into prioritized developer tasks.
Sort issues into three buckets:
Critical: Issues that completely block task completion for users with disabilities. Examples: form fields with no label (screen reader user cannot identify the field), images with no alt text in a context where the image communicates content, interactive elements that cannot receive keyboard focus.
Serious: Issues that create significant barriers but not complete blockers. Examples: low contrast text that fails WCAG 1.4.3 thresholds, heading hierarchy problems that disrupt screen reader navigation, missing error identification in forms.
Advisory: Issues that are worth fixing but have lower user impact. Examples: redundant alt text, non-optimal (but functional) landmark structure, minor focus order issues on non-critical pages.
Fix critical issues before launch. Address serious issues before launch where possible, otherwise in the first sprint after. Track advisory items and fold them into ongoing maintenance.
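The three-bucket sort is simple enough to automate. In the sketch below (Python; the mapping from scanner impact levels to buckets is a judgment call each agency should tune, not a fixed standard), issues are grouped by remediation priority:

```python
# Sketch of three-bucket triage. The mapping from scanner impact levels
# to priority buckets is an example policy, not a standard; tune it to
# your own severity definitions.

BUCKETS = {
    "critical": "critical",
    "serious": "serious",
    "moderate": "advisory",
    "minor": "advisory",
}

def triage(issues):
    """Group issues by remediation priority."""
    grouped = {"critical": [], "serious": [], "advisory": []}
    for issue in issues:
        bucket = BUCKETS.get(issue["severity"], "advisory")  # unknown -> advisory
        grouped[bucket].append(issue)
    return grouped
```

Automating the first pass does not replace human triage — a "moderate" contrast issue on a checkout button may still be a practical blocker — but it gives reviewers a consistent starting point.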
Step 4: Create the Developer Issue List
Triage output becomes a developer task list. The format that works: one issue per row, including the page, the element selector or description, the WCAG criterion, the severity, and a plain-language description of what needs to be fixed.
Developers do not need to understand WCAG 2.1 in depth to fix most issues. “Add an aria-label to the search input field because it has no associated label” is more actionable than “Fails WCAG 1.3.1: Info and Relationships.”
Keep the issue list in the project management tool where development tasks already live, not in a separate accessibility spreadsheet that doesn’t get looked at.
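Translating rows into plain-language tasks can also be partly templated. The sketch below (Python; the hint text and rule IDs are illustrative examples, not a maintained mapping) shows the idea:

```python
# Sketch: turn a triaged issue row into a one-line task for the project
# management tool. The hint texts and rule IDs are illustrative examples.

PLAIN_LANGUAGE = {
    "label": "Add a programmatic label (e.g. aria-label) to this field",
    "color-contrast": "Increase text/background contrast to meet WCAG 1.4.3",
}

def to_task(issue):
    """One issue row -> one plain-language developer task line."""
    hint = PLAIN_LANGUAGE.get(issue["criterion"],
                              f"See WCAG criterion {issue['criterion']}")
    return f"[{issue['severity'].upper()}] {issue['page']} {issue['selector']}: {hint}"
```

The fallback ("See WCAG criterion ...") matters: a template can cover the common rules, but unusual failures still need a human-written description before they reach a developer's queue.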
Step 5: Remediation and Rescan
Developers fix issues and mark them complete. Before declaring remediation done, rescan the affected pages. Fixes sometimes introduce new failures (e.g., alt text that is not descriptive, or a markup change that breaks another element's label association).
A rescan also generates the “before and after” evidence that makes for a credible audit report. Clients who want compliance documentation want to see that issues were found and confirmed fixed, not just that a scan was run.
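The before/after evidence falls out of a set comparison between the two scans. In this sketch (Python; issues are keyed by page, selector, and criterion, following the row format used earlier in this guide), keys present before but absent after are confirmed fixed, and new keys are regressions introduced during remediation:

```python
# Sketch: compare before/after scan results for rescan evidence.
# Each issue is keyed by (page, selector, criterion); the row format
# follows this guide's convention, not a standard.

def compare_scans(before, after):
    """Classify issues as fixed, newly introduced, or still outstanding."""
    key = lambda i: (i["page"], i["selector"], i["criterion"])
    before_keys = {key(i) for i in before}
    after_keys = {key(i) for i in after}
    return {
        "fixed": before_keys - after_keys,        # confirmed remediated
        "new": after_keys - before_keys,          # regressions to triage again
        "outstanding": before_keys & after_keys,  # still failing
    }
```

Anything in the "new" set goes back through triage; anything left "outstanding" either blocks sign-off or gets documented as a known item in the final report.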
Step 6: Deliver the Client Report
The client report is the project artifact. It contains:
- Executive summary: Overall compliance status, number of issues found by severity, number remediated, outstanding items if any
- Methodology note: What was scanned, what tools were used, what is not covered by automated scanning
- Issue log: Before and after, with remediation confirmation
- Recommendations: Outstanding advisory items, suggested ongoing monitoring cadence
The report should be agency-branded, not a vendor’s dashboard export. White-label reporting from a tool like A11yProof covers this; axe DevTools and Lighthouse output require manual reformatting.
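The executive-summary numbers are simple aggregates over the issue rows. A minimal sketch (Python; field names follow the row format used earlier in this guide and are illustrative, not a reporting standard):

```python
# Sketch: compute the executive-summary figures (issues by severity,
# remediated vs. outstanding). Row fields follow this guide's convention.

from collections import Counter

def summarize(found, remediated_keys):
    """Summary counts for the client report."""
    by_severity = Counter(i["severity"] for i in found)
    remediated = sum(
        1 for i in found
        if (i["page"], i["selector"], i["criterion"]) in remediated_keys
    )
    return {
        "total": len(found),
        "by_severity": dict(by_severity),
        "remediated": remediated,
        "outstanding": len(found) - remediated,
    }
```

Generating these numbers from the same issue rows used for triage and rescanning keeps the report consistent with the developer task list — no hand-counted figures that drift out of sync.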
Building This Into Retainers
The same workflow applies to ongoing maintenance retainers, on a scheduled cadence. Monthly or quarterly automated scans, triage of any new issues introduced by content or code changes, developer fix tickets for critical and serious issues, and a summary report to the client.
This is the accessibility service product: a recurring cycle of scanning, fixing, and reporting that clients pay for as a maintenance retainer. It is most credible when the process is documented and consistent, which requires having defined it in the first place.
Q&A
What is a standard accessibility audit workflow for web agencies?
A repeatable agency audit workflow: (1) automated scan of the site, (2) triage issues by severity and impact, (3) generate client report and developer issue list, (4) developer remediation, (5) rescan to confirm fixes, (6) final report delivery. Steps 3-5 may repeat in iterations for complex projects.
Q&A
What tools do agencies use for accessibility auditing?
Common agency toolkit: an automated scanner for portfolio-level scanning and client reports (A11yProof, or Lighthouse/axe for manual scans), axe browser extension for developer-level review during remediation, and a screen reader (NVDA, VoiceOver) for manual testing of interactive components. Automated tools catch structural failures; manual testing catches interaction and cognitive accessibility issues.
Q&A
How should agencies handle clients who ask for ADA compliance certification?
Agencies cannot certify ADA compliance — legal compliance is ultimately a legal determination. What agencies can deliver is: WCAG 2.1 AA conformance documentation (a scan report showing zero critical issues and a list of identified issues that were remediated), a VPAT (Voluntary Product Accessibility Template) for clients who need formal documentation, and a recommendation that the client consult legal counsel for compliance questions specific to their risk profile.