Web Accessibility Audit: What Gets Tested and Why It Matters
Running a web accessibility audit is one of the most practical steps any organisation can take to understand how well their website works for people with disabilities. Whether your site serves the public, supports patients or generates commercial enquiries, accessibility gaps create real barriers for real users. Working with a specialist team that provides website accessibility services for UK businesses gives you a structured, thorough review of where your site stands and what needs to change. The point of an audit isn’t to produce a long list of failures for the sake of it. It’s to identify the issues that affect how people experience your website, prioritise them and give your team a clear route to fixing them.
Legal requirements push some companies towards accessibility audits, but others just want their websites to work for everyone. The Equality Act 2010 requires service providers to make reasonable adjustments so disabled people don’t encounter substantial disadvantages. Public sector organisations face stricter demands under the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018, which require WCAG 2.2 AA compliance. Strip away the legal framework though and you’re looking at something much simpler: broken websites shut people out and audits reveal exactly where that’s happening.
What a web accessibility audit covers
Any worthwhile accessibility audit uses WCAG as its testing foundation. WCAG 2.2 AA is the target for most organisations, built around four principles: perceivable, operable, understandable and robust.
Automated scanners won’t give you a proper audit on their own. They’ll spot certain issues at scale but miss whole categories of problems across your site. Testing that relies purely on automation catches less than half of all WCAG failures, according to accessibility specialists who’ve studied this repeatedly. Manual testing by someone who understands how disabled users navigate websites becomes essential for everything else.
| Testing method | What it catches | Limitations |
|---|---|---|
| Automated scanning | Missing alt text, colour contrast failures, missing form labels, broken ARIA attributes | Can’t assess context, meaning or real user experience |
| Manual expert review | Keyboard traps, focus order problems, unclear link text, logical heading structure | Time-intensive, requires trained accessibility specialists |
| Assistive technology testing | Screen reader compatibility, voice navigation issues, switch access barriers | Requires access to multiple tools and expertise in using them |
Mix automated scanning with expert manual reviews and real assistive technology testing. Screen readers, voice recognition software and switch devices all need to be part of the process. You’ll end up with reports that reflect genuine user experiences rather than just technical code violations, which makes all the difference when you’re trying to fix what’s broken.
The key areas that get tested
Every web accessibility audit covers the same core areas. We might be looking at different pages and components on your site, but the main categories don’t change from one WCAG review to another.
Navigation and keyboard access. Can someone reach every interactive element without touching a mouse? Links, buttons, form fields, dropdown menus, modal dialogs and custom components all need to work with keyboards, switch devices and voice commands. Focus visibility has to be clear and the tab order can’t jump around randomly. Keyboard traps, where users get stuck inside a component with no way out, are absolute killers.
Images and non-text content. Alt text needs to exist and make sense for images that carry information. Decorative images get marked so screen readers skip right over them. We check everything from charts and infographics to icons, making sure any visual content with meaning gets proper descriptions that help people understand what’s there.
Colour and visual design. Normal text needs 4.5:1 contrast ratio and large text needs 3:1 under WCAG 2.2 AA standards. But proper contrast goes beyond just meeting minimums because people with low vision or colour vision deficiencies struggle to read anything that doesn’t have enough contrast. Never use colour as the only way to show error states or important information either.
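Those AA thresholds come from a defined formula: convert each colour to its relative luminance, then compare the lighter value against the darker one. Here is a minimal sketch in Python of the WCAG contrast calculation; the grey value is illustrative, not a real brand colour:

```python
# Contrast ratio between two sRGB colours, per the WCAG 2.x
# definitions of "relative luminance" and "contrast ratio".

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour given as '#rrggbb'."""
    channels = []
    for i in (1, 3, 5):
        c = int(hex_colour[i:i + 2], 16) / 255
        # Linearise the gamma-encoded channel value.
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # black on white: 21.0
print(round(contrast_ratio("#767676", "#ffffff"), 2))  # mid grey on white: 4.54, just over the 4.5:1 AA line
```

Running brand colours through a check like this during design, rather than after launch, is far cheaper than re-specifying a palette later.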
Forms and interactive elements. Every form field needs its label properly connected and error messages have to be clear for screen readers. Validation can’t block people who use different interaction methods either. When forms break down, real damage happens because someone can’t book that appointment or complete their purchase. Even the best web design work can miss accessibility problems that only show up when you test with assistive technologies.
> “The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect.” Sir Tim Berners-Lee, W3C Director
Content and messaging
Content structure and headings. Screen reader users jump between headings to navigate your pages, so broken heading hierarchy destroys their experience completely. H3 elements before H2s or missing section headings leave users stranded without a mental map of your content.
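The basic version of this check is easy to script. The sketch below, using only Python’s standard library, flags any heading that jumps more than one level deeper than the one before it; the sample markup is illustrative:

```python
# Flag skipped heading levels (e.g. an <h3> straight after an <h1>)
# using the standard-library HTML parser.
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.previous_level = 0
        self.problems = []

    def handle_starttag(self, tag, attrs):
        # Only look at h1..h6 tags.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if level > self.previous_level + 1:
                self.problems.append(f"<{tag}> follows <h{self.previous_level}>: level skipped")
            self.previous_level = level

checker = HeadingChecker()
checker.feed("<h1>Services</h1><h3>Audits</h3><h2>Training</h2>")
print(checker.problems)  # ['<h3> follows <h1>: level skipped']
```

A real audit also checks that the headings describe their sections, which no script can do; this only catches the structural half of the problem.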
Most websites fail their first accessibility audit in predictable ways. We see the same issues crop up again and again, which means you can get a decent sense of what needs fixing before we even start looking at your site.
Alt text problems show up everywhere. Images get uploaded with no alternative text at all or someone’s just copied the filename into the alt attribute. Decorative images that should be hidden from screen readers aren’t marked properly and you’ll find alt text that tells you absolutely nothing useful about what’s in the image.
Colour contrast failures are almost as common. That lovely light grey text your designer chose looks elegant on white backgrounds but becomes completely unreadable for people with visual impairments. Brand colours cause real headaches when they haven’t been tested for contrast ratios and don’t get us started on pale placeholder text in contact forms.
Custom components break keyboard navigation more than anything else. Someone builds a fancy dropdown menu or modal window using divs and JavaScript, but forgets that keyboard users can’t operate the thing. Standard HTML buttons work perfectly, but styled divs need proper event handlers and ARIA roles to function for everyone.
Beyond those headline problems, a handful of other issues turn up on almost every first audit:
- Missing form labels or labels not programmatically associated with their fields
- Links with vague text such as “click here” or “read more” that mean nothing out of context
- Videos without captions or transcripts
- PDF documents that aren’t tagged for accessibility
- Auto-playing media without controls to pause or stop it
- Touch targets that are too small for users with motor impairments
- Time limits on forms or sessions without the option to extend
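Some of the issues above, like unlabelled form fields and clickable divs that keyboard users can’t reach, can be caught with a simple static check. A minimal sketch using only Python’s standard library (the markup and function name are illustrative, and a real audit tool does far more):

```python
# Flag two common failures: inputs with no associated <label for="...">
# and clickable <div>s missing the role/tabindex that keyboard and
# screen reader users depend on.
from html.parser import HTMLParser

class FormAndKeyboardCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.labelled_ids = set()   # ids referenced by <label for="...">
        self.input_ids = []         # id (or None) of each visible input
        self.findings = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label" and "for" in attrs:
            self.labelled_ids.add(attrs["for"])
        elif tag == "input" and attrs.get("type", "text") not in ("hidden", "submit"):
            self.input_ids.append(attrs.get("id"))
        elif tag == "div" and "onclick" in attrs:
            # A clickable div needs a role and a tabindex to be operable
            # without a mouse.
            if "role" not in attrs or "tabindex" not in attrs:
                self.findings.append("clickable <div> without role/tabindex")

    def close(self):
        super().close()
        for input_id in self.input_ids:
            if input_id is None or input_id not in self.labelled_ids:
                self.findings.append(f"input without an associated label (id={input_id})")

checker = FormAndKeyboardCheck()
checker.feed(
    '<label for="email">Email</label><input id="email" type="email">'
    '<input id="phone" type="text">'
    '<div onclick="submitForm()">Send</div>'
)
checker.close()
print(checker.findings)
```

Checks like this belong in a build pipeline, but they only prove a label exists, not that it says anything useful.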
Each year the WebAIM Million report examines accessibility across the top million home pages and the findings stay depressingly consistent. Most sites fail basic WCAG requirements with the same issues cropping up repeatedly: missing alt text, contrast problems, unlabelled forms, empty links and pages without proper language declarations.
Why automated tools aren’t enough on their own
Tools like axe and WAVE will catch obvious violations quickly and they’re brilliant for spotting issues while you’re building. Automated testing alone means you’re only seeing part of the problem though and that’s risky territory.
Automated checkers spot whether an image has an alt attribute but they’re clueless about whether that text helps anyone. You’ll get images tagged “graph.png” that technically pass while screen reader users remain completely in the dark about what data they’re missing.
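That limitation is easy to see in practice. Below is a rough sketch of an automated alt-text check using only Python’s standard library: it can flag a missing alt attribute, or a filename pasted into one, but it has no way to judge whether genuine alt text actually describes the image. The markup and filenames are illustrative:

```python
# Flag images with no alt attribute and alt text that is just a
# filename. alt="" is legitimate for decorative images, so it is
# deliberately not flagged.
import re
from html.parser import HTMLParser

class AltTextScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_startendtag(self, tag, attrs):
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        alt = attrs.get("alt")
        src = attrs.get("src", "?")
        if alt is None:
            self.findings.append(f"{src}: missing alt attribute")
        elif re.fullmatch(r"[\w-]+\.(png|jpe?g|gif|svg|webp)", alt.strip(), re.IGNORECASE):
            self.findings.append(f"{src}: alt text is a filename ('{alt}')")

scanner = AltTextScanner()
scanner.feed(
    '<img src="graph.png" alt="graph.png">'
    '<img src="hero.jpg">'
    '<img src="divider.svg" alt="">'
)
print(scanner.findings)
```

Note that `alt="Quarterly sales chart"` would sail through this check even if the chart showed something else entirely, which is exactly the gap manual review exists to close.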
Professional audits from trained specialists fill in the massive gaps that scanners miss completely. Human judgement matters when you’re evaluating whether content makes sense, whether keyboard navigation flows logically or whether your custom dropdown behaves the way someone using assistive technology would expect it to. Automated scanners catch less than half of real accessibility problems, yet too many organisations run a quick scan, see mostly green ticks and assume they’re sorted while massive barriers still block their users.
The audit report and what to do with it
A good audit report breaks down every problem by how severe it is, which WCAG criteria it fails and exactly where it happens on your site. The best reports don’t just dump a list of issues on you though. They explain why each problem matters to real people trying to use your website, tell you exactly how to fix it and rank everything so your developers know what to tackle first.
Critical issues stop users dead in their tracks while high-priority problems create serious barriers but people might find ways around them. Medium and low-priority issues make things harder without completely blocking access.
Start with the critical issues that affect your most important pages and user flows. Don’t try fixing everything simultaneously because you’ll burn out your team and probably break something else. A development team experienced in WordPress development can knock out common fixes at the template level, which means one change fixes the problem across hundreds of pages.
| Priority level | Description | Example |
|---|---|---|
| Critical | Users can’t access content or complete key tasks | Keyboard trap in main navigation, form can’t be submitted via keyboard |
| High | Significant barrier with no easy workaround | Missing labels on contact form fields, no skip navigation link |
| Medium | Barrier that affects experience but has workarounds | Colour contrast just below threshold, heading hierarchy skips a level |
| Low | Minor issue with limited user impact | Decorative image has redundant alt text, link opens in new tab without warning |
You can’t just audit once and forget about it. Websites change constantly as new content gets published, features are added and designs get updated and each change can introduce new accessibility barriers that weren’t there before. Regular audits combined with ongoing monitoring are the only way to keep your website consistently accessible over time.
The organisations seeing real results weave accessibility considerations into every single stage of their digital workflow, right from content creation through design, development and quality assurance. Treating accessibility like a yearly box-ticking exercise gets you nowhere fast.
Content teams need proper training on writing accessible content. That means learning practical skills like writing meaningful alt text, using heading structures correctly, creating descriptive link text and making sure content doesn’t rely on instructions like “click the red button” to make sense. The GOV.UK accessibility blog gives you practical guidance for creating accessible digital content.
Fixing accessibility problems after development starts will drain your budget fast. You’ll save weeks by sorting out colour contrast, focus states and touch target sizes while you’re still sketching wireframes. Smart teams bring in people who know SEO and accessibility together because the structural changes help both search rankings and real users.
Automated tools in your deployment pipeline will spot the obvious before it goes live, which helps. Manual testing with actual keyboards and screen readers during QA makes the difference though.
An audit tells you what’s broken right now and gives you a fix list. But here’s what matters: making accessibility part of your daily workflow instead of something you check once a year. Teams that build it into everything they do don’t spend their time panicking about compliance. The others just keep hoping their annual scan won’t find anything terrible.
Want to know how accessible your site really is? Get a proper audit and you’ll have the data, priorities and action plan your team needs to build something that works for everyone.
FAQs
What is the difference between automated and manual accessibility testing?
Automated scanning tools catch obvious issues like missing alt text, poor colour contrast and broken ARIA attributes, but research consistently shows they miss more than half of all WCAG failures. Manual expert review is needed to assess things like whether navigation makes sense to a screen reader user, whether keyboard focus moves logically through the page and whether someone can actually complete a form using only a keyboard. A thorough accessibility audit combines automated scanning, manual expert review and testing with real assistive technologies to get the full picture.
What are the most common accessibility issues found during a website audit?
Missing or inadequate alt text is consistently one of the most frequent findings, along with colour contrast failures where brand colours do not meet the minimum 4.5:1 ratio against their backgrounds. Custom JavaScript components that break keyboard navigation are another major issue, particularly modals, carousels and dropdown menus that lack proper ARIA roles and focus management. Form labels that are not programmatically associated with their fields and vague link text like “click here” also appear on the majority of sites audited for the first time.
Do private sector businesses in the UK need to comply with web accessibility standards?
Yes. While the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 apply specifically to government departments and publicly funded bodies, the Equality Act 2010 covers any organisation serving the public online, including private businesses. The Act requires reasonable adjustments so that disabled users are not disadvantaged when accessing your digital services. Beyond legal obligations, customers increasingly expect accessible websites, procurement teams often demand WCAG compliance and the reputational impact of excluding users continues to grow.