WCAG Audit Guide: What Gets Tested and What to Fix First


Want to know if your website actually works for everyone? A WCAG audit cuts straight through the guesswork and shows you exactly where you stand with the Web Content Accessibility Guidelines. This isn’t just about ticking a compliance box (though it does that too). You’re getting a proper breakdown of every barrier that stops disabled users from navigating your site. And forget the idea that one quick automated scan will sort you out: a real audit means automated testing, manual checks and assistive technology walkthroughs all working together.

Some organisations don’t get a choice in this.

What a WCAG Audit Involves

If you’re covered by the Public Sector Bodies Accessibility Regulations 2018 or bidding for contracts that demand WCAG AA compliance, you’re already committed. But here’s the thing, even if you’re not legally required to do this, it makes business sense. Inaccessible sites push away customers, leave you exposed under the Equality Act 2010 and usually signal broader usability problems that frustrate everyone who visits.

Running Axe or WAVE and calling it job done? That’s not going to cut it. Yes, automated tools give you a starting point, but research shows they only catch about 25% to 40% of actual WCAG failures. The rest need a human brain to spot them, which is exactly why proper audits combine three different testing approaches rather than relying on software alone. Missing alt text, unlabelled form fields, dodgy colour contrast and pages without language declarations, that’s the stuff automated tools excel at finding. They’ll rip through hundreds of pages quickly and flag the obvious problems. WordPress sites can even run these checks from inside the CMS, so you catch issues before they go live instead of discovering them weeks later.
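To make the “binary checks” idea concrete, here’s a toy sketch of the kind of yes/no rules scanners apply. This isn’t any real tool’s API: elements are plain objects rather than a live DOM, and the rule names merely echo the style of axe-core’s rule IDs.

```javascript
// Toy versions of the binary checks automated scanners run.
// Real tools walk the rendered DOM; here elements are plain objects.
function findBasicFailures(elements) {
  const failures = [];
  for (const el of elements) {
    if (el.tag === "img" && !el.alt) {
      failures.push({ el, rule: "image-alt" }); // missing alt text
    }
    if (el.tag === "input" && !el.label) {
      failures.push({ el, rule: "label" }); // unlabelled form field
    }
    if (el.tag === "html" && !el.lang) {
      failures.push({ el, rule: "html-has-lang" }); // no page language
    }
  }
  return failures;
}

// Example: a page with no language declaration and an unlabelled input
const page = [
  { tag: "html" },                        // no lang declared → flagged
  { tag: "img", alt: "Office exterior" }, // fine
  { tag: "input" },                       // no label → flagged
];
console.log(findBasicFailures(page).map((f) => f.rule)); // ["html-has-lang", "label"]
```

Each rule is a mechanical yes/no question, which is exactly why software answers them so quickly and consistently.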

Where do automated tools hit their limit? That’s where manual testing steps in. We get a trained auditor to work through each page using nothing but keyboard navigation, no mouse allowed. They’re checking that every button, link and form field can be reached and actually used. Does the focus order make logical sense as you tab through? Do modal windows trap focus properly instead of letting it escape into the background? Custom components like accordions and dropdown menus get the full treatment too. And they’re not just testing functionality, they’re asking whether headings create a document outline that actually makes sense, whether link text means something useful when read out of context and whether the reading order matches what you’d expect from looking at the page.
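The modal focus-trap behaviour auditors check for boils down to one rule: Tab from the last focusable element wraps back to the first, and Shift+Tab from the first wraps to the last, so focus never escapes into the background page. A minimal sketch of that wrapping logic (the element lookup and event wiring are assumed, not shown):

```javascript
// Index of the element that should receive focus next inside a modal.
// current: index of the currently focused element
// count:   number of focusable elements in the modal
// shift:   true when Shift+Tab (backwards) was pressed
function nextFocusIndex(current, count, shift) {
  const step = shift ? -1 : 1;
  return (current + step + count) % count; // wrap instead of escaping the modal
}

// Tab from the last of three elements wraps to the first…
console.log(nextFocusIndex(2, 3, false)); // 0
// …and Shift+Tab from the first wraps to the last.
console.log(nextFocusIndex(0, 3, true)); // 2
```

If tabbing off the end of a dialog lands you on the page behind it, that’s the failure a manual keyboard pass will catch and a scanner usually won’t.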

Then comes assistive technology testing, the third layer that catches what everything else misses.

The POUR Principles and What They Cover


POUR, that’s how WCAG organises everything. Four principles with a memorable acronym and every single guideline fits under one of these headings. Once you understand them, audit findings start making sense and you’ve got a proper framework for talking accessibility improvements with your developers and content teams.

Perceivable content means users can actually perceive it (shocking, right?). But seriously, if something exists only as a visual element with no text alternative, blind users can’t access it at all. This covers the obvious stuff like alt text for images and captions for videos, but also adaptable layouts that play nice with assistive technologies and colour contrast that doesn’t make your eyes bleed. Most automated tool findings cluster here because missing alt text and poor contrast ratios are dead easy to detect programmatically.

Can you navigate your entire website using just a keyboard? That’s what Operable is really asking. Every button, link, form field and menu needs to work without a mouse and we’re not just talking about hitting Tab until your finger goes numb. Users need enough time to actually read things, content can’t strobe like a nightclub (seizure triggers are a real concern) and there should be multiple ways to find what you’re looking for. JavaScript-heavy sites trip up here constantly because developers build shiny components that completely ignore keyboard events.
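The classic failure here is a div styled as a button: it gets a click handler but never a key handler, so keyboard users can’t activate it. A native button fires on both Enter and Space for free; a custom component has to reproduce that itself, roughly like this (a hedged sketch, with illustrative names, not a drop-in component):

```javascript
// Native buttons activate on Enter and Space; custom widgets must copy that.
function isActivationKey(key) {
  return key === "Enter" || key === " "; // " " is the KeyboardEvent value for Space
}

// Wiring for a div-based button. The markup also needs tabindex="0" and
// role="button" so the element is focusable and announced correctly.
function makeKeyboardOperable(el, onActivate) {
  el.addEventListener("keydown", (event) => {
    if (isActivationKey(event.key)) {
      event.preventDefault(); // stop Space from scrolling the page
      onActivate(event);
    }
  });
}
```

Skip the keydown wiring and the component works perfectly with a mouse while being completely dead to the keyboard, which is why these failures survive casual testing.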

Understandable means your site doesn’t leave people scratching their heads. Screen readers need to know what language you’re using so they don’t butcher the pronunciation, forms need proper labels and instructions and error messages should actually tell people what went wrong and how to fix it. But here’s the thing, consistency matters just as much. Navigation that jumps around between pages? That fails. Form submissions that suddenly redirect without warning? Also fails.

Think of Robust as future-proofing for assistive tech.

Common Failures Ranked by Severity

So how do we make sense of audit findings? WCAG splits everything into conformance levels and Level A is your bare minimum, fail these and you’re literally blocking people from using your site. Level AA covers the middle ground and happens to be what UK law demands for public sector sites. Level AAA exists but expecting full compliance across an entire website is frankly unrealistic. Most companies building an accessible website aim for AA conformance because that’s where the sweet spot sits between legal compliance and practical development.

Priority | WCAG Level | Common Failures | User Impact
Critical (fix immediately) | A | Missing alt text, no keyboard access to interactive elements, no form labels, missing page language | Users completely blocked from accessing content or completing tasks
High (fix within first sprint) | A / AA | Insufficient colour contrast, missing skip navigation link, auto-playing media without controls, focus not visible | Significant difficulty for users with visual or motor impairments
Medium (fix in planned cycle) | AA | Inconsistent navigation, missing error suggestions, content not reachable via multiple paths, text cannot be resized to 200% without loss | Reduced usability and frustration for assistive technology users
Lower (address in ongoing maintenance) | AA / AAA | Missing abbreviation definitions, no sign language interpretation for video, reading level too high for intended audience | Affects specific user groups in specific contexts

Every site is different though (that table’s just a starting point). When you’ve got a contact form missing proper labels versus some decorative blog image without alt text, which one’s actually stopping people from getting things done? Context matters here. You can’t just work down a findings list like it’s your weekly shopping, you need to think about which pages matter most and what your users are actually trying to accomplish.

How to Prioritise Fixes After an Audit

Hundreds of findings from a WCAG audit? Welcome to large site reality.

Fix Level A failures on your busiest pages first. And start with the user journeys that matter most to your business. Contact form that keyboard users can’t navigate? That’s blocking your main conversion path right there. Homepage navigation that won’t work without a mouse? Every screen reader user hits that wall on every single visit. Go after the problems that are creating the biggest barriers for the most people, that’s where you’ll make the biggest difference.

Once you’ve sorted Level A problems on critical pages, tackle Level AA issues on those same pages before you start expanding outward. This way your most important pages (the ones getting the most traffic and driving the most value) reach AA conformance first. But there’s another benefit here, stakeholders can see real progress happening quickly, which keeps everyone committed to the remediation work when things get tough.

Bundle similar problems together when you can. You’ve got 50 images missing alt text? That’s one job for your content team, not 50 tiny tickets cluttering up the backlog. Same goes for those missing keyboard focus indicators, if it’s a site-wide CSS issue where someone’s thrown outline: none into the stylesheet, one fix sorts the lot. This kind of batching stops your developers from ping-ponging between endless micro-tasks and actually gets things done.
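For the outline: none case just mentioned, the site-wide fix is usually a few lines in the global stylesheet. A sketch (the colour value is a placeholder; use one from your own palette):

```css
/* Remove the old blanket reset…
   :focus { outline: none; }   ← delete this */

/* …and give keyboard users a clearly visible indicator instead.
   :focus-visible targets keyboard focus without styling mouse clicks. */
:focus-visible {
  outline: 2px solid #1a5fb4; /* placeholder colour */
  outline-offset: 2px;
}
```

One stylesheet change like this can close out dozens of individual “focus not visible” findings in a single ticket.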

  • Fix all Level A blockers on key conversion pages (contact forms, quote requests, booking flows) before anything else
  • Address site-wide issues through global template and stylesheet changes rather than page-by-page fixes
  • Tackle colour contrast failures in your design system so all components benefit from a single set of changes
  • Update reusable components (navigation, footer, header, sidebar widgets) once so every page inherits the fix
  • Schedule content-level fixes (alt text, link text, heading structure) as a separate workstream that content teams can handle alongside developers

Don’t treat accessibility fixes like some special side project.

What AA Conformance Means in Practice

Getting to WCAG AA conformance doesn’t mean your site’s accessibility nirvana, it means you’ve hit a specific set of testable benchmarks that tackle the biggest barriers users face. The WCAG 2.2 specification lists 86 success criteria split across A, AA and AAA levels and AA conformance means nailing all 31 Level A requirements plus the 24 Level AA ones. So that’s 55 individual boxes to tick.

Testing some of these is dead simple. Colour contrast? You can measure that down to decimal points. Alt text present or missing? Binary choice. Language declaration on the page? Yes or no, job done. But then you hit the tricky stuff where human judgement comes in: is that alt text actually useful for the image’s context, does the heading structure make logical sense when you read through it, are your error messages telling users what they actually need to know to fix their mistakes?

Why does accessibility feel like painting the Forth Bridge? Because your website never stops changing. Content gets updated, new pages appear, templates evolve and those third-party widgets you installed last month might be causing havoc with screen readers. Pass an audit in January and you could be failing spectacularly by June if nobody’s thinking about accessibility during day-to-day updates. And here’s the kicker: building accessibility into your design process from day one costs a fraction of fixing everything later.

Automated Testing Tools and Their Limitations


Testing tools have exploded in recent years.

Tools like these work by scanning your rendered DOM against WCAG-derived rules. Brilliant for catching the black-and-white stuff that code can definitively answer. Does this form input have a proper label? Yes or no. Has someone forgotten the alt text on this image? Clear answer. Does this text meet the minimum 4.5:1 contrast ratio? The maths don’t lie. For these binary decisions, automated tools will beat any human auditor on speed and consistency every time.
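That 4.5:1 figure comes from WCAG’s relative-luminance formula, which is simple enough to compute directly. A sketch in plain JavaScript:

```javascript
// WCAG relative luminance of an sRGB colour (components 0–255).
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255;
    // Linearise the sRGB value before weighting the channels
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter colour on top.
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
// Mid-grey (#767676) on white sits right at the 4.5:1 AA threshold.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2)); // "4.54"
```

Because the answer is a number against a fixed threshold, this is precisely the kind of check that belongs in software rather than in a human auditor’s day.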

But automation hits a wall when context matters. Sure, a tool can spot that your image has alt text, but can it tell you whether “image of a thing” actually describes what’s in the picture? It’ll confirm your button has an accessible name, though it won’t judge whether “click here” makes any sense to someone using a screen reader. And whilst it can verify you’ve got headings on the page, it can’t work out if they create a logical structure that actually helps users navigate your content.

Missing form labels, low contrast text, images without alternatives, these basic accessibility failures show up on most websites according to the WebAIM Million study. And here’s the frustrating part: they’re among the simplest problems to solve. The annual analysis of the top million sites proves that technical difficulty isn’t what’s stopping progress. It’s awareness and process that create the real barriers.

WordPress sites need accessibility thinking at every decision point, theme choice, plugin selection, daily content updates. Your content team gets a head start with a properly built WordPress site that includes semantic markup and keyboard-friendly components. But no theme can stop an editor uploading images without alt text.

Building Accessibility into Development Workflows

Forget the audit-then-fix approach, it doesn’t work for WCAG conformance. You need accessibility baked into every stage of development and content creation instead.

Why wait until after development work to discover accessibility problems? Check colour contrast before you finalise palettes. Define keyboard behaviour for tabs and modals upfront (along with their ARIA roles). Map out how screen reader users will complete each step in your user flows. Catching issues during design saves serious money later on.

Automated testing needs to be baked right into your build process. We’re talking about tools like Axe-core that’ll actually fail your builds when someone introduces new accessibility violations (which happens more often than you’d think). And don’t forget the manual stuff either, keyboard testing should be on every QA checklist wherever interactive elements are involved. Screen reader testing on complex components before they get merged? That’s non-negotiable.
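As a rough illustration, a CI step built on @axe-core/cli might look like the following. The URL, job name and npm scripts are placeholders, and you should check the CLI’s own documentation for current flags before relying on this:

```yaml
# Illustrative GitHub Actions job: fail the build on new axe violations.
accessibility:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - run: npm ci
    - run: npm run start &   # serve the site locally (placeholder script)
    - run: npx @axe-core/cli http://localhost:3000 --exit
      # --exit makes the CLI return a non-zero code when violations
      # are found, which is what fails the build
```

The point isn’t this exact configuration, it’s that the scan runs on every change rather than once a year.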

Here’s the thing about accessibility audits, they show you exactly where you stand right now, but your development process decides whether you’ll improve or just slide backwards again.

Content teams can’t wing it when it comes to accessibility. They need proper guidelines that cover the practical stuff, writing alt text that actually makes sense for different image types, crafting link text that tells you where you’re going, structuring content with headings instead of just making things look pretty, adding captions and transcripts for videos. A practical set of dos and don’ts that editors can actually use beats a massive policy document that nobody reads.

Why wait for problems to pile up? Regular spot-checks every quarter (or after major changes) catch issues before they snowball into bigger headaches. You don’t need to audit every single page either, just sample your key templates, new content and anything that’s been recently updated. Combine this with automated monitoring that runs all the time and you’ve got a system that keeps accessibility front and centre instead of something you only think about when complaints start rolling in or contract renewals demand proof of compliance.

FAQs

What exactly does a WCAG audit involve and how is it different from automated testing?

A WCAG audit combines three testing approaches: automated tools, manual keyboard testing and assistive technology walkthroughs. Automated tools only catch 25% to 40% of actual WCAG failures, so proper audits need human experts to spot issues like confusing navigation, poor heading structure and unusable custom components that software can’t detect.

Which accessibility issues should I fix first after getting my audit results?

Start with Level A failures on your most important pages like contact forms and checkout processes, as these completely block users from completing key tasks. Focus on problems affecting your main conversion paths first, then tackle site-wide issues like missing keyboard focus indicators that can be fixed once but benefit every page.

Can I just use automated tools like WAVE or Axe instead of paying for a full accessibility audit?

Automated tools are brilliant for catching obvious problems like missing alt text and colour contrast issues, but they miss most accessibility barriers that require human judgement. You’ll need manual testing to check if your site actually works with keyboard navigation and whether your content makes sense to screen reader users.

What does WCAG AA conformance actually mean for my website?

WCAG AA conformance means your site meets 55 specific accessibility requirements that tackle the biggest barriers disabled users face. It’s the legal standard for UK public sector sites and covers everything from keyboard navigation to proper form labels, though reaching conformance doesn’t mean you can stop thinking about accessibility.

How long does WCAG AA conformance last once I've fixed all the issues?

Accessibility isn’t a one-time fix because websites constantly change with new content, updates and features that can introduce fresh barriers. You need accessibility built into your ongoing development and content processes, or you could pass an audit in January and be failing badly by June.

Co-Founder at Priority Pixels

Paul leads on development and technical SEO at Priority Pixels, bringing over 20 years of experience in web and IT. He specialises in building fast, scalable WordPress websites and shaping SEO strategies that deliver long-term results. He’s also a driving force behind the agency’s push into accessibility and AI-driven optimisation.
