We Scanned 30 SaaS Pricing Pages for Accessibility. 70% Failed.


Nine out of 30 SaaS pricing pages had zero WCAG 2.1 AA violations when we ran them through axe-core this week. Figma, Netlify, Twilio, Zendesk, Calendly, Loom, Miro, Grammarly, Webflow. Clean results across the board.

The other 21 didn’t come close.

What we tested and how

We pointed axe-core 4.11 at the pricing pages of 30 SaaS products — not their homepages, not their docs, specifically their pricing pages. The standard was WCAG 2.1 AA, and we ran every scan in a consistent headless Chromium environment with no browser extensions or user scripts that might interfere.
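For readers who want to reproduce this kind of scan, here's a minimal sketch of a per-page run, assuming the @axe-core/puppeteer wrapper. Our actual harness, URL list, and result storage aren't shown here, and the function name is illustrative; running it requires puppeteer and a Chromium download.

```javascript
// Sketch: scan one pricing page with axe-core via @axe-core/puppeteer.
// Assumes `puppeteer` and `@axe-core/puppeteer` are installed; the
// harness around this (URL list, retries, storage) is not shown.
const puppeteer = require('puppeteer');
const { AxePuppeteer } = require('@axe-core/puppeteer');

async function scanPricingPage(url) {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });

  // Restrict results to WCAG 2.1 A/AA rules, matching the standard used here.
  const results = await new AxePuppeteer(page)
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  await browser.close();
  return results.violations; // array of { id, impact, nodes, ... }
}
```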

Why pricing pages? Because they’re where the money changes hands. They tend to get heavy custom design treatment: comparison tables with intricate layouts, toggle switches between monthly and annual billing, gradient backgrounds behind plan names, animated feature lists. All of that increases the surface area for accessibility failures in ways that a simpler marketing page might not.

The scan completed on April 12, 2026. Every result below comes directly from axe-core output — no manual evaluation, no subjective judgment calls.

The numbers

Across 30 sites, axe-core flagged 65 total violations touching 548 DOM nodes. The average was 2.2 violations per site, but that average hides a wide spread. Nine sites had nothing. Three sites — Linear, Render, and Intercom — had five violations each.
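The headline numbers are simple aggregations over the per-site axe-core output. A sketch of that arithmetic, assuming each site's result has the shape axe-core returns (a `violations` array whose entries each carry a `nodes` array); the fixture data is illustrative, not the real scan output:

```javascript
// Aggregate per-site axe-core results into cohort-level numbers:
// total violations, total affected DOM nodes, and violations per site.
function summarize(siteResults) {
  let violations = 0;
  let nodes = 0;
  for (const site of siteResults) {
    violations += site.violations.length;
    for (const v of site.violations) nodes += v.nodes.length;
  }
  return {
    violations,
    nodes,
    perSite: Math.round((violations / siteResults.length) * 10) / 10,
  };
}

// Illustrative two-site fixture:
const demo = [
  { violations: [{ id: 'color-contrast', nodes: [{}, {}, {}] }] },
  { violations: [] },
];
console.log(summarize(demo)); // { violations: 1, nodes: 3, perSite: 0.5 }
```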

Color contrast was the most common single violation by a wide margin, appearing on 12 of the 30 pricing pages (40%). That tracks with what we see in basically every cohort we scan, but the prevalence on pricing pages is worth noting. These aren’t obscure blog posts with inherited styles. Pricing pages get direct design attention, and still, 40% of them had text that didn’t meet minimum contrast ratios against its background.

The second most common violation was list — malformed list structures — appearing on 8 sites. This one’s a pricing page special. When you build a feature comparison table or a bulleted list of what’s included in each tier, it’s easy to use <div> elements styled to look like lists without actually being lists. Screen readers can’t parse the structure, and the user loses the ability to navigate between items.
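A hypothetical version of that pattern, and the semantic fix — class names here are made up, but the structure is what axe-core's list checks care about:

```html
<!-- Looks like a list, isn't one: screen readers see loose text -->
<div class="features">
  <div class="feature"><span class="check" aria-hidden="true">✓</span> Unlimited projects</div>
  <div class="feature"><span class="check" aria-hidden="true">✓</span> Priority support</div>
</div>

<!-- Real semantics: announced as "list, 2 items", navigable item by item -->
<ul class="features">
  <li><span class="check" aria-hidden="true">✓</span> Unlimited projects</li>
  <li><span class="check" aria-hidden="true">✓</span> Priority support</li>
</ul>
```

With `list-style: none` on the `<ul>`, the two render identically; only the second one survives contact with a screen reader.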

After that: aria-allowed-attr on 5 sites, link-name and button-name each on 4 sites, and then a longer tail of ARIA-related issues.

Who had the most violations

Linear’s pricing page had 5 violations affecting 14 DOM nodes, including a critical aria-required-parent issue and color contrast failures across 3 elements. The list markup was malformed too — <li> elements outside proper list containers, which affected 8 nodes.

Render also had 5 violations, but its node count was the highest in the entire cohort: 82 affected DOM elements. The bulk of that was 45 nodes failing color contrast and 34 buttons without accessible names. When axe-core flags 34 buttons on a single page without discernible text, that usually points to an icon button pattern where the icons are decorative and no aria-label was added.
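That icon-button pattern, sketched hypothetically — the class and label are invented, but this is the shape of fix that clears button-name:

```html
<!-- Flagged by the button-name rule: no accessible name anywhere -->
<button class="plan-expand">
  <svg viewBox="0 0 16 16" aria-hidden="true"><path d="…"/></svg>
</button>

<!-- Fixed: aria-label names the button; the icon stays decorative -->
<button class="plan-expand" aria-label="Expand plan details">
  <svg viewBox="0 0 16 16" aria-hidden="true"><path d="…"/></svg>
</button>
```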

Intercom rounded out the top three with 5 violations and 28 affected nodes. Three of those violations were critical severity, including 17 elements with ARIA roles that lacked required parent roles and 6 buttons without accessible names.

Below the top three, Vercel, PlanetScale, Stripe, SendGrid, HubSpot, Monday, and Asana each had 4 violations. Asana’s case is interesting — it only had 4 violation types, but 73 DOM nodes were affected, nearly all of them from invalid ARIA roles (70 nodes). That’s likely a single component pattern replicated across every feature row in their pricing table.

Slack had just 2 violations, but one of them — aria-command-name — hit 96 DOM nodes. Sometimes a low violation count masks a large blast radius.

Who got it right

The nine clean sites: Figma, Netlify, Twilio, Zendesk, Calendly, Loom, Miro, Grammarly, Webflow.

What do they have in common? Honestly, not as much as you’d hope for a neat narrative. They span different industries (design tools, communications, scheduling, writing, web hosting). They use different tech stacks. Some have elaborate pricing pages with feature grids; others keep it simple.

If there’s a through-line, it might be that several of these companies have publicly stated accessibility commitments. Figma has talked about accessibility in their design tool itself. Webflow literally sells website building and has a vested interest in demonstrating that their own output is accessible. Twilio and Zendesk both operate in spaces where enterprise customers with accessibility requirements are a significant part of their revenue.

But I’m speculating. The data just says they passed. Drawing causal conclusions from nine data points would be overreach.

Why pricing pages specifically

We’ve scanned other cohorts before — landing pages, documentation sites, forms. Pricing pages consistently perform worse, and we think there are a few structural reasons.

Pricing pages are marketing pages that behave like application interfaces. They have interactive elements (toggles, sliders, accordions for FAQ sections, comparison table filters) layered on top of heavy visual design. That combination creates accessibility failure modes that a static marketing page wouldn’t have.

There’s also the custom-build problem. When a product team builds a comparison table for their three pricing tiers, they’re often working from a one-off Figma design rather than pulling from a shared accessible component system. Custom means untested. Untested means aria-allowed-attr violations slip through.

And then there’s the visual hierarchy pressure. You want the recommended plan to stand out. You want the CTA button to pop. That pressure toward visual emphasis creates exactly the conditions where contrast ratios get sacrificed — a light gray “per user/month” label against a white card background, or a pastel-colored “most popular” badge that doesn’t meet the 4.5:1 ratio.
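The 4.5:1 threshold comes from WCAG's contrast-ratio formula, which compares the relative luminance of the two colors. A sketch of the computation (hex parsing is simplified to `#rrggbb` input):

```javascript
// WCAG 2.1 contrast ratio between two hex colors.
// ratio = (L_lighter + 0.05) / (L_darker + 0.05), where L is relative
// luminance computed from linearized sRGB channels.
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio('#000000', '#ffffff').toFixed(2)); // "21.00"
// A light gray label on a white card fails the 4.5:1 AA threshold.
console.log(contrastRatio('#999999', '#ffffff').toFixed(2)); // "2.85"
```

That `#999999`-on-white pairing is exactly the "per user/month" label scenario: perfectly readable to a designer with good vision on a bright monitor, and roughly a third of the required ratio.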

The list violations tell the same story from a different angle. Feature lists on pricing pages are almost never actual <ul> elements. They’re styled <div> stacks with checkmark icons, because that’s what looks good in the design. The visual result is fine. The semantic result is invisible to assistive technology. If you’re new to ARIA and semantic markup, this is one of the most common patterns to watch for.

What this doesn’t tell you

Automated scanning catches a specific category of issues. axe-core is good at finding color contrast failures, missing labels, malformed ARIA, and structural problems. It’s not good at evaluating whether a screen reader user can actually complete the task of understanding and comparing pricing tiers. It can’t tell you whether the tab order makes sense, or whether the plan toggle between monthly and annual actually announces its state change.
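For example, a billing toggle that announces its state needs explicit ARIA plumbing, and only manual testing confirms the announcement actually fires. A hypothetical sketch (element id and wiring are invented for illustration):

```html
<!-- A billing toggle that exposes its state to assistive technology.
     role="switch" + aria-checked announce on/off; the script must keep
     aria-checked in sync, which no automated scan verifies. -->
<button role="switch" aria-checked="false" id="billing-toggle">
  Annual billing
</button>
<script>
  const toggle = document.getElementById('billing-toggle');
  toggle.addEventListener('click', () => {
    const on = toggle.getAttribute('aria-checked') === 'true';
    toggle.setAttribute('aria-checked', String(!on));
  });
</script>
```

axe-core can confirm the switch has a role and a name; it can't confirm that clicking it flips `aria-checked`, or that a screen reader user hears the change.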

So the nine sites with zero violations aren’t necessarily fully accessible. And the 21 sites with violations aren’t necessarily unusable. What the data does tell you is the minimum bar — these are issues that an automated tool can catch in under 10 seconds per page, and 70% of well-funded SaaS companies haven’t cleared that bar on one of their most important pages.

We’ll run this cohort again in a few months and see what changes.