
Manual vs Automated Accessibility Testing: Which Do You Need?

Every AODA compliance audit involves three kinds of testing: automated scanning, manual evaluation, and assistive technology (AT) testing. They are not alternatives; each finds issues the others miss, and a credible WCAG audit uses all three.


The number that changes how people think about this

Automated accessibility tools, at their most capable, detect approximately 30–40% of WCAG 2.0 Level AA failures. Whether alt text is meaningful, whether a modal is navigable by keyboard in a way that makes sense, whether a screen reader announces form errors coherently — these require human judgment. No tool can provide it.

🤖

What automated testing does

Automated accessibility testing uses tools — browser extensions, command-line scanners, or CI/CD integrations — to analyse a page’s HTML, CSS, and ARIA against a set of predefined rules. When a rule is violated, the tool flags it. The process is fast, consistent, and scalable.
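The mechanics can be illustrated with a minimal sketch, assuming nothing beyond the Python standard library. Real tools such as axe apply hundreds of rules; this toy scanner applies just two machine-checkable ones to show the pattern of "parse the markup, fire a rule when a condition is violated".

```python
# Minimal sketch of a rule-based accessibility scanner using only
# the standard library. Illustrative only -- real tools apply
# hundreds of rules with far more sophisticated checks.
from html.parser import HTMLParser

class MiniScanner(HTMLParser):
    """Flags two machine-checkable WCAG failures:
    an <img> without an alt attribute (1.1.1) and
    an <html> element without a lang attribute (3.1.1)."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.violations.append("1.1.1: <img> missing alt attribute")
        if tag == "html" and "lang" not in attrs:
            self.violations.append("3.1.1: <html> missing lang attribute")

scanner = MiniScanner()
scanner.feed('<html><body><img src="logo.png"></body></html>')
print(scanner.violations)  # both rules fire on this markup
```

Because the rules are fixed and deterministic, the same input always produces the same findings, which is exactly what makes this approach fast, consistent, and scalable.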

What automated tools cannot detect

Automated tools are binary: a rule either fires or it does not. They cannot evaluate context, intent, or user experience. This creates a class of WCAG failures that are genuinely invisible to automated tools.
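The binary nature of rule-based checking is easy to demonstrate. In this sketch (standard library only, illustrative file names), a rule requiring an alt attribute flags the image that has none, but happily passes alt text that conveys nothing to a screen reader user:

```python
# Why rule-based checks are binary: the rule "every <img> must have
# an alt attribute" fires on a missing attribute, but passes alt
# text that is present yet meaningless.
from html.parser import HTMLParser

class AltPresenceRule(HTMLParser):
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.failures.append(self.get_starttag_text())

rule = AltPresenceRule()
rule.feed('<img src="chart.png" alt="IMG_4032.jpg">')  # meaningless alt: passes
rule.feed('<img src="chart.png">')                     # missing alt: fails

# Only the second image is flagged. Judging whether "IMG_4032.jpg"
# actually describes the chart requires a human reviewer.
print(len(rule.failures))  # 1
```

Every failure of this kind — technically present but semantically wrong — lands in the category only manual testing can catch.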

What manual testing does

Manual testing is where a trained accessibility specialist evaluates a website against WCAG criteria that cannot be checked algorithmically: reading alt text and asking whether it conveys the same information as the image, or navigating a page by keyboard and judging whether the experience makes logical sense.

What manual testing covers that automated tools cannot

🔊

What assistive technology testing adds

Assistive technology (AT) testing means using a screen reader to navigate and interact with a website as a blind user would. It is the most revealing form of accessibility testing because it reproduces the actual experience of the users who face the most significant barriers.

AT testing frequently uncovers issues that both automated and manual testing miss.

AT testing is not the same as manual testing

Some auditors describe their manual testing process as including 'screen reader checks' when they mean they have reviewed ARIA markup and heading structure visually. Visual review of ARIA is manual testing — it is not the same as navigating a site using a screen reader with speech output. Real AT testing means operating the screen reader as a blind user would, including browsing by heading, by landmark, by link list, and by interacting with forms and components using AT keyboard commands.

📊

WCAG criterion coverage: automated vs manual vs AT testing

The table below maps key WCAG 2.0 Level AA criteria to the testing approach that detects them. ✓ = reliably detected, △ = partially detected or context-dependent, ✗ = not detectable by this method.

| WCAG | Description | Automated | Manual | AT testing |
|---|---|---|---|---|
| 1.1.1 | Non-text content (alt text) | △ | ✓ | ✓ |
| 1.2.2 | Captions (pre-recorded) | ✗ | ✓ | △ |
| 1.3.1 | Info and relationships | △ | ✓ | ✓ |
| 1.3.2 | Meaningful sequence | △ | ✓ | ✓ |
| 1.3.3 | Sensory characteristics | ✗ | ✓ | △ |
| 1.4.1 | Use of colour | ✗ | ✓ | ✗ |
| 1.4.3 | Contrast (minimum) | ✓ | △ | ✗ |
| 1.4.4 | Resize text | ✗ | ✓ | ✗ |
| 2.1.1 | Keyboard accessible | ✗ | ✓ | ✓ |
| 2.1.2 | No keyboard trap | ✗ | ✓ | ✓ |
| 2.4.1 | Bypass blocks (skip nav) | △ | ✓ | ✓ |
| 2.4.3 | Focus order | ✗ | ✓ | ✓ |
| 2.4.4 | Link purpose | △ | ✓ | ✓ |
| 2.4.6 | Headings and labels | △ | ✓ | ✓ |
| 3.1.1 | Language of page | ✓ | △ | ✓ |
| 3.3.1 | Error identification | △ | ✓ | ✓ |
| 3.3.2 | Labels or instructions | △ | ✓ | ✓ |
| 4.1.1 | Parsing (valid HTML) | ✓ | △ | ✗ |
| 4.1.2 | Name, role, value | △ | △ | ✓ |

Key: ✓ Reliably detected · △ Partially / context-dependent · ✗ Not detectable

Where automated tools show △ (partial), the tool can detect the technical attribute but cannot evaluate whether it is correct or meaningful. For example, automated tools detect the presence of an alt attribute but cannot assess whether its content is accurate.
⚖️

Automated vs manual: head-to-head

| Automated testing | Manual testing |
|---|---|
| Fast: scans a page in seconds | Slow: a thorough manual review of a complex page takes hours |
| Consistent: same rules applied identically every time | Variable: quality depends on tester skill and experience |
| Scalable: can run across thousands of pages | Labour-intensive: depth of coverage is limited by tester capacity |
| Integrated: can run automatically in CI/CD pipelines | Scheduled: must be planned and resourced separately |
| Catches 30–40% of WCAG failures | Catches the majority of the remaining 60–70% |
| Cannot evaluate context or meaning | Evaluates whether content and interactions make sense |
| No expertise required to run | Requires a trained accessibility specialist to be meaningful |
| Result: a starting point | Result: a complete audit picture (with AT testing) |
⚠️

When to use automated testing, manual testing, and AT testing

| Situation | Approach | Rationale |
|---|---|---|
| Developer building a new feature | Automated (axe in browser DevTools or CI/CD) | Catches technical failures as code is written. Fast feedback loop prevents issues from accumulating. |
| Content editor publishing a new page | Automated (WAVE for visual check) + manual alt text and link text review | Editors need immediate, visual feedback. Automated gives a quick check; manual ensures content quality. |
| Pre-launch QA for a new website | Automated + manual + AT testing | Full audit before launch. Identifies all three categories of issue before real users encounter them. |
| Annual AODA compliance review | Automated + manual + AT testing | Compliance evidence requires demonstrating that all testable WCAG criteria have been evaluated. |
| Post-complaint investigation | Manual + AT testing (automated as supplementary) | Complaints usually relate to usability issues, exactly what automated tools miss. Manual and AT testing are primary. |
| Ongoing monitoring between formal audits | Automated scanning (scheduled or CI/CD) | Catches new issues introduced by content updates or code changes between full audits. |
| Filing an AODA compliance report | Automated + manual + AT testing | A compliance report is a legal declaration. It requires a full audit, not automated scanning alone. |
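For the CI/CD scenarios above, a common pattern is to have the pipeline fail when a scan reports high-impact violations. The sketch below is a hypothetical gate script; the results structure mirrors axe's published JSON output (a "violations" list with per-rule "impact" and affected "nodes"), but treat the exact shape as an assumption for your tooling.

```python
# Hypothetical CI gate: count build-blocking violations from an
# automated scan's JSON results. Field names follow axe's results
# format ("violations", "impact", "nodes") -- verify against the
# tool you actually use.
BLOCKING = {"serious", "critical"}

def gate(results: dict) -> int:
    """Return the number of build-blocking violations, printing each."""
    blocking = [v for v in results.get("violations", [])
                if v.get("impact") in BLOCKING]
    for v in blocking:
        print(f"{v['id']}: {v['impact']} ({len(v['nodes'])} nodes)")
    return len(blocking)

# Example scan result with one serious and one moderate violation.
scan = {"violations": [
    {"id": "color-contrast", "impact": "serious", "nodes": [{}, {}]},
    {"id": "region", "impact": "moderate", "nodes": [{}]},
]}
count = gate(scan)
print("FAIL" if count else "PASS")  # FAIL
```

Gating on impact rather than raw violation count keeps the pipeline strict about blockers without failing builds over minor issues that are better handled in scheduled remediation.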
🔄

Building a combined testing process

The most effective accessibility testing programmes use all three approaches in sequence, with each building on the last.

A combined testing workflow
  • Step 1 — Automated scan: Run axe DevTools across all scoped pages. Export results. This gives you high-volume technical failures fast.
  • Step 2 — Triage automated results: Review flagged items. Some will be straightforward failures; some will be "needs review" items that require manual assessment. Remove any false positives (rare with axe).
  • Step 3 — Manual testing: Work through the WCAG criteria that automated tools cannot assess. Keyboard navigation, alt text quality, heading logic, link text, form usability, colour use, motion controls.
  • Step 4 — AT testing: Navigate key pages and user journeys using NVDA with Firefox and VoiceOver with Safari. Complete the most important tasks as a blind user would. Note where the experience is confusing, broken, or missing.
  • Step 5 — Consolidate and document: Combine findings from all three testing phases into a single issues list, classified by severity and WCAG criterion.
  • Step 6 — Report: Produce an actionable audit report with prioritized findings, screenshots, code examples, and remediation recommendations.
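Step 5, consolidating findings from the three phases, can be sketched as a simple merge-and-sort. The field names and severity scale here are illustrative assumptions, not a prescribed schema:

```python
# Sketch of Step 5: merge findings from the automated, manual, and
# AT testing phases into one list ordered by severity, then by
# WCAG criterion. Field names are illustrative.
SEVERITY_ORDER = {"critical": 0, "serious": 1, "moderate": 2, "minor": 3}

def consolidate(*phases):
    issues = [issue for phase in phases for issue in phase]
    return sorted(issues,
                  key=lambda i: (SEVERITY_ORDER[i["severity"]], i["wcag"]))

automated = [{"wcag": "1.4.3", "severity": "serious",  "source": "axe"}]
manual    = [{"wcag": "2.4.4", "severity": "moderate", "source": "manual"}]
at        = [{"wcag": "4.1.2", "severity": "critical", "source": "NVDA"}]

report = consolidate(automated, manual, at)
print([i["wcag"] for i in report])  # ['4.1.2', '1.4.3', '2.4.4']
```

Keeping a "source" field on each issue preserves which testing phase found it, which is useful evidence when documenting that all three methods were applied.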

Frequently asked questions

Is automated testing sufficient for AODA compliance?
  • No. Automated testing catches 30–40% of WCAG 2.0 Level AA failures. The remaining 60–70% require manual testing and screen reader evaluation. An AODA compliance audit that relies only on automated tools will miss the majority of real accessibility barriers — including the ones most likely to prevent disabled users from completing core tasks.

How do I find the issues automated testing missed?
  • There is no automated way to know what automated testing missed — that is the nature of the limitation. The only way to find the issues that automated tools cannot detect is to conduct manual testing and AT testing. A professional accessibility auditor will work through the full set of WCAG success criteria, evaluating each one using the appropriate testing method.

Can my own developer do the manual testing?
  • A developer with WCAG knowledge can conduct useful manual testing — particularly keyboard testing and code review. However, manual testing is most effective when conducted by someone with practical experience evaluating web content for accessibility, including experience using assistive technologies. The gap between a technically correct implementation and one that works in practice for AT users is often only visible to someone who uses AT regularly.

How much does accessibility testing cost?
  • Automated testing tools are typically low cost or free (axe DevTools free, WAVE free, Lighthouse built into Chrome). Manual testing and AT testing require specialist time. A full manual + AT audit of a small website typically costs between $1,500 and $4,000. Automated scanning alone costs a fraction of this but provides a fraction of the compliance picture.

Should automated or manual testing come first?
  • Always run automated tests first. Automated scanning identifies high-volume technical failures quickly. Fixing these before manual testing ensures manual testing time is spent on issues that automated tools genuinely cannot assess, rather than on issues that tools should have caught.

Do I need to test with more than one screen reader?
  • For a professional AODA audit, yes. NVDA and VoiceOver handle ARIA and HTML differently in significant ways. A component that works correctly with NVDA may fail with VoiceOver, or vice versa. For most Ontario business websites, NVDA with Firefox (Windows) and VoiceOver with Safari (macOS and iOS) cover the most important AT combinations.
Get a full audit — automated, manual, and AT testing combined

Our AODA compliance audits use all three testing methods in sequence: automated scanning with axe and WAVE, manual WCAG testing by a trained specialist, and screen reader testing with NVDA and VoiceOver. The result is a complete picture — not the 30–40% that automated tools alone provide.