
bruesli briefing
Road to a11y: testing, auditing, and iteration

[Header image: a black-and-white illustration of a turtle drawing on a graphics tablet while typing on an old-fashioned keyboard, in front of big letters spelling "a11y" in light green on a dark green background.]

If you really want to make sure a website comes with as few barriers as possible, it needs to be tested in many different ways – and by many different people.

A few months ago, we turned our website process upside down – and as part of that, we also started building a testing setup for digital accessibility. Our goal: to be able to run some tests ourselves, but also to have more meaningful conversations with people who actually rely on assistive technologies.

Iterative testing cycles instead of one big audit

We moved away pretty quickly from the idea of a single, giant audit – simply because iterative cycles of testing and bug fixing saved us a lot of time. Right now, our setup looks something like this, with each testing round followed by a bug-fixing round:

  • Defining the scope (conformance level, relevant pages, interactive components, user flows)
  • Testing with automated tools (see the sketch after this list)
  • Desktop testing with different screen reader/browser combinations
  • Mobile testing with different screen reader/browser combinations
  • Testing with other assistive technologies
  • User testing
  • User feedback (post-launch)
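
To make the automated step a bit more concrete, here's a minimal sketch of what such a check can look like – assuming a Playwright test setup with the @axe-core/playwright package; the URL and WCAG tags below are just placeholders, not a description of our actual configuration:

```typescript
// Minimal sketch of an automated accessibility check.
// Assumes Playwright plus @axe-core/playwright; the URL and tags are placeholders.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('homepage has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa']) // scope to the target conformance level
    .analyze();

  // Automated tools only catch a subset of issues, so an empty list is a floor, not a pass.
  expect(results.violations).toEqual([]);
});
```

Automated tools only find a fraction of possible barriers, which is why the screen reader rounds and the user testing above carry most of the weight.
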
[Screenshot: a video call with a shared spreadsheet titled "conformance test", rows marked in red and green, and four participant portraits on the right.]

Still plenty to do – discussing a website audit in the a11y jour fixe

Testing for both accessibility and usability

The end goal is for as many people as possible to be able to use digital products comfortably and effectively. That’s why we don’t just test against WCAG success criteria, but also for general usability – like making sure it’s possible to navigate the site quickly using voice or keyboard input, or that labels and alt text are complete and make sense when read out loud by a screen reader. The test documentation can then be translated into an accessibility statement. That’s also where you note which requirements are not yet met – and what’s planned for the future.
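
As a rough illustration of what part of that usability layer can look like in an automated test – again assuming Playwright, with the URL, selectors, and expected texts made up for the example – a keyboard and alt-text check might be sketched like this:

```typescript
// Hypothetical sketch: keyboard reachability and accessible names, assuming Playwright.
// The URL, selectors, and expected texts are illustrative, not taken from a real project.
import { test, expect } from '@playwright/test';

test('keyboard users get a skip link and images carry alt text', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL

  // The first Tab stop should be a "skip to content" link for keyboard users.
  await page.keyboard.press('Tab');
  await expect(page.locator(':focus')).toHaveText(/skip to (main )?content/i);

  // Every <img> should have an alt attribute (empty alt is fine for decorative images).
  for (const img of await page.locator('img').all()) {
    expect(await img.getAttribute('alt'), 'image without alt attribute').not.toBeNull();
  }
});
```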

Keeping at it after launch

The accessibility statement is also a great place to invite users to share feedback after (re)launch. That way, the site can keep improving and be optimized for use cases that weren’t considered before. In general, regular technical maintenance should also include accessibility checks. Legal requirements change, as do assistive technologies and browser interfaces – which means small updates will keep popping up over time.

Questions? Thoughts? Drop us a line: info@diebrueder.com