Accessibility – Test & testing with automated test tools

[email protected]

Kristian Munter Simonsen – Universal design specialist

• Originally a web developer

• Accessibility expert

• Certified Professional in Accessibility Core Competencies (CPACC)

• Master's degree in universal design of ICT

What does this graph show?

[Graph: Methods Employed by Norwegian Professionals in Universal Design of IT]

But what is missing…? Automated testing.

Begnum, M. E. N. (2016). Methodology for Universal Design of ITs; Epistemologies Among Norwegian Experts. International Conference on Computers Helping People with Special Needs, LNCS, Vol. 9758, pp. 121–128. Springer.

Benefits of automated tools

• Automated tools are a great starting point
• Can highlight repeating errors
• Cover a large amount of code
• Low effort, high reward
• Low barrier to entry
• Actionable results linking to WCAG requirements
• However: developers tend to overemphasize the results

Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131.

Pitfalls of automated testing tools

• Even the best tools only cover about 40% of WCAG requirements
• WCAG is not «humanly testable»
• …and not completely machine testable either
• Avoid using results from automated tools as the only source

But wait… not humanly testable?!

• A study comprising 22 experts and 27 non-experts shows that approximately 50% of success criteria fail to meet the 80% agreement threshold; experts produce 20% false positives and miss 32% of the true problems.

Brajnik, G., Yesilada, Y., & Harper, S. (2010). Testability and Validity of WCAG 2.0: The Expertise Effect. Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, 43–50. https://doi.org/10.1145/1878803.1878813

A comparison

• Shows how different tools give different results
• Some tools are better at specific WCAG requirements
• Robustness is the easiest principle to test
• Understandable is the hardest
• It might be an idea to focus on the 3.x.x requirements during user and expert testing

Vigo, M., Brown, J., & Conway, V. (2013). Benchmarking Web Accessibility Evaluation Tools: Measuring the Harm of Sole Reliance on Automated Tests. Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility (W4A), Rio de Janeiro, Brazil, 13–15 May 2013. ACM, New York, NY, USA.

Axe / axe-core

LINK: https://github.com/dequelabs/axe-core

Licence
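
To make this concrete, here is a minimal sketch of running axe-core in the browser, assuming the package has been installed from npm and bundled into the page under test; axe.run resolves with a results object whose violations array links each finding back to its rule and WCAG documentation.

```typescript
// Minimal sketch: run axe-core against the current page and log violations.
// Assumes `npm install axe-core`, a browser (or jsdom) environment,
// and esModuleInterop enabled for the default import.
import axe from 'axe-core';

axe.run(document)
  .then((results) => {
    for (const violation of results.violations) {
      // Each violation carries a rule id, a short description, and a
      // help URL that maps the finding back to the WCAG requirement.
      console.log(`${violation.id}: ${violation.help} (${violation.helpUrl})`);
    }
  })
  .catch((error) => console.error('axe-core run failed:', error));
```

The same engine also ships in wrappers such as @axe-core/playwright and @axe-core/puppeteer, which makes it straightforward to run these checks inside an end-to-end test suite.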

pa11y

• Runs in the command line or JS
• NPM package
• Can run in most JS projects (a minimal sketch follows below)

LINK: https://github.com/pa11y/pa11y
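
As a rough sketch of the programmatic use, assuming pa11y is installed from npm and with https://example.com standing in for the page under test; by default pa11y tests against the WCAG 2 AA ruleset and resolves with a list of issues.

```typescript
// Minimal sketch: run pa11y programmatically against a URL.
// Assumes `npm install pa11y` (plus @types/pa11y and esModuleInterop
// for TypeScript); the URL below is a placeholder.
import pa11y from 'pa11y';

async function main(): Promise<void> {
  const results = await pa11y('https://example.com');
  for (const issue of results.issues) {
    // Each issue includes the WCAG technique code, a message,
    // and a CSS selector pointing at the offending element.
    console.log(`${issue.code}: ${issue.message} -> ${issue.selector}`);
  }
}

main().catch((error) => {
  console.error('pa11y run failed:', error);
  process.exit(1);
});
```

The command-line form is just as simple: `pa11y https://example.com` prints the same issues to the terminal.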

User testing is subjective

• People have different expectations
• These expectations might differ and be in direct opposition to each other
• Typically low diversity in user testing

Automated tests
Pros:
• Quick and simple
• Covers a large amount of content
Cons:
• Low test-criteria coverage
• Low test result quality

User involvement
Pros:
• Listen to the users
• Uncover issues early
Cons:
• Time demanding and expensive
• Danger of being too subjective

Expert inspections
Pros:
• Qualitative
• Supports growth of competence
Cons:
• WCAG is not humanly testable
• Challenging to put oneself in the position of the diverse set of users

Triforce of UD

[Diagram: automated testing, user testing, and expert evaluation together make up universal design]

Other tools

• Hundreds of accessibility / WCAG tools
• W3C WAI maintains a list

LINK: https://www.w3.org/WAI/ER/tools/

There are no average users.