Why PDFs Fail Accessibility More Than Any Other Format
PDFs weren't designed for accessibility. They were designed to preserve visual layout across different computers—to make sure a document looks identical whether you open it on Windows 95 or a Mac. That fundamental design goal conflicts directly with accessibility, which requires flexible, adaptable content that can be resized, reordered, and consumed non-visually.

When you export a Word document to PDF, you're essentially taking structured content and flattening it into a visual representation. Headings become text that looks bigger. Lists become text with bullet characters in front. Tables become grids of text boxes. All the semantic meaning that assistive technology needs gets stripped away unless you explicitly preserve it through proper tagging.

I learned this the hard way in 2019 when the Department of Education hired me to audit their special education guidance documents. These were PDFs created by well-meaning staff who had attended accessibility training. They knew to use heading styles in Word. They knew to add alt text to images. They checked the "accessible PDF" box when exporting. Every single document failed.

The problem wasn't knowledge—it was the fifteen steps between "accessible Word document" and "accessible PDF" that nobody had explained to them. The export settings that default to off. The Acrobat Pro checks that need to happen after export. The reading order that gets scrambled even when tags are present. They had done everything they thought was right, and it still didn't work.

One document particularly haunted me: a 127-page guide for parents of children with disabilities. Beautiful design, clear writing, helpful diagrams. Completely unusable with a screen reader. The reading order jumped randomly between columns. Image descriptions were missing. The table of contents wasn't linked. A parent using JAWS would hear "blank, blank, blank, graphic, blank" for the first three pages before getting to any actual content.
I called the document's author. She had spent six months writing it. She had specifically requested accessibility training before starting. She was devastated. "I did everything they told me to do," she said. "I used heading styles. I added alt text in Word. I checked the box." She had. The training just hadn't covered the other 90% of what makes a PDF accessible.

The Real Cost of Inaccessible PDFs
Let me show you what inaccessible PDFs actually cost organizations, based on data from my audits:

| Organization Type | Average PDF Count | Remediation Cost | Legal Risk | Productivity Loss |
|---|---|---|---|---|
| Federal Agency | 12,000-50,000 | $600K-$2.5M | High (Section 508) | 340 hours/year in support |
| State Government | 5,000-20,000 | $250K-$1M | High (ADA Title II) | 180 hours/year in support |
| Higher Education | 8,000-35,000 | $400K-$1.75M | Very High (OCR complaints) | 520 hours/year in support |
| Healthcare | 3,000-15,000 | $150K-$750K | Critical (patient safety) | 280 hours/year in support |
| Financial Services | 2,000-10,000 | $100K-$500K | High (regulatory) | 160 hours/year in support |
The Myth That Accessible PDFs Look Worse
The most persistent myth I encounter is that accessible PDFs must sacrifice visual design. Executives worry that accessibility means ugly documents. Designers resist accessibility requirements because they think it means abandoning their carefully crafted layouts. This is completely false, and I can prove it.

"We can't make this accessible without ruining the design. The layout is too complex. Accessibility would force us to simplify everything and make it look like a Word document from 1995."

I heard this from a creative director at a major financial services company. They had just spent $120,000 on their annual report design. Beautiful typography, sophisticated layouts, custom infographics. They were convinced that accessibility would destroy it.

I took their PDF and made it fully WCAG 2.1 AA compliant in four hours. Changed nothing visually. Not one pixel. The accessible version looked identical to the original. The only difference was that screen reader users could now actually read it, and the PDF passed automated accessibility checks.

The confusion comes from conflating accessibility with simplicity. Yes, simpler documents are easier to make accessible. But complexity isn't the enemy—lack of structure is. You can have a visually complex, beautifully designed PDF that's fully accessible if you properly tag the structure. The visual presentation and the semantic structure are separate layers.

Think of it like a building. The visual design is the exterior—the facade, the windows, the architectural details. The accessibility structure is the interior layout—the hallways, the room labels, the wayfinding. You can have an ornate, complex exterior with a clear, navigable interior. They're not in conflict.

The real challenge isn't making accessible PDFs look good—it's convincing organizations to invest the time in proper structure during creation instead of treating accessibility as a post-production fix. When you build accessibility in from the start, it's invisible. When you try to retrofit it later, that's when you run into compromises and limitations.
The WCAG Criteria That Actually Matter for PDFs
WCAG 2.1 has 78 success criteria across three conformance levels. Most PDF creators panic when they see this list. The good news: only about 25 of these criteria typically apply to PDFs, and only 12 account for 90% of the failures I see in audits. Here's what actually matters, ranked by frequency of failure in my 2,147 audits:

1. Heading structure (78% failure rate) - WCAG 1.3.1, 2.4.6: Documents use visual styling instead of semantic heading tags, or skip heading levels (H1 to H3 without H2).
2. Alternative text for images (71% failure rate) - WCAG 1.1.1: Images lack alt text, or have generic descriptions like "image" or the filename.
3. Document language (68% failure rate) - WCAG 3.1.1: The PDF doesn't specify its language, breaking screen reader pronunciation.
4. Reading order (64% failure rate) - WCAG 1.3.2: Content is tagged in visual order rather than logical reading order, especially in multi-column layouts.
5. Color contrast (52% failure rate) - WCAG 1.4.3: Text doesn't meet 4.5:1 contrast ratio, particularly in headers and callout boxes.
6. Table structure (49% failure rate) - WCAG 1.3.1: Tables lack proper header cells, or complex tables don't have scope attributes.
7. Link text (43% failure rate) - WCAG 2.4.4: Links say "click here" or "read more" instead of describing the destination.
8. Form fields (41% failure rate) - WCAG 1.3.1, 4.1.2: Form fields lack labels, or labels aren't properly associated with fields.
9. List structure (38% failure rate) - WCAG 1.3.1: Lists use manual bullets/numbers instead of semantic list tags.
10. Document title (35% failure rate) - WCAG 2.4.2: PDF title is missing or shows the filename instead of a descriptive title.
11. Tab order (31% failure rate) - WCAG 2.4.3: Tab order doesn't follow logical reading order, or isn't set to "Use Document Structure."
12. Bookmarks (28% failure rate) - WCAG 2.4.5: Long documents lack bookmarks for navigation, or bookmarks don't match heading structure.
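Some of these failures are mechanically detectable once a document's tags have been extracted. As a minimal sketch of a check for the top failure, skipped heading levels, here's a function that takes a sequence of heading levels (1 for H1, 2 for H2, and so on) pulled from a PDF's tag tree by whatever library you use; the function name and message format are my own, not from any standard tool:

```python
def check_heading_levels(levels: list[int]) -> list[str]:
    """Flag skipped heading levels (e.g. H1 -> H3 with no H2),
    the most common structural failure (WCAG 1.3.1 / 2.4.6)."""
    issues = []
    prev = 0  # level 0 means "before the first heading"
    for i, level in enumerate(levels, start=1):
        if prev == 0 and level != 1:
            issues.append(f"heading {i}: document starts at H{level}, not H1")
        elif level > prev + 1:
            issues.append(f"heading {i}: skips from H{prev} to H{level}")
        prev = level
    return issues
```

A well-structured document (H1, H2, H3, H2, ...) produces an empty list; a decoratively tagged one (H1, H3, H5) produces one issue per skip.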
Notice what's not on this list: most of the WCAG criteria about interactive functionality, time-based media, or complex web interactions. PDFs are primarily static documents, so criteria about video captions, keyboard traps, or session timeouts rarely apply.

"We spent three months trying to make our PDFs WCAG compliant and kept failing the automated checks. Then we realized we were trying to fix criteria that don't even apply to PDFs. We were wasting time on video caption requirements when our documents don't have video."

This quote from a state agency accessibility coordinator illustrates a common problem: organizations treat WCAG as a monolithic checklist instead of understanding which criteria apply to their specific content type. This wastes enormous amounts of time and creates unnecessary anxiety.
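One criterion that is fully automatable is color contrast (number 5 in the list above), because WCAG 2.1 defines the exact formula for relative luminance and contrast ratio. The formula below comes from the spec itself; only the helper names are mine:

```python
def _linear(v: int) -> float:
    """Linearize an 8-bit sRGB channel per the WCAG 2.1 definition."""
    c = v / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_linear(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text: bool = False) -> bool:
    """WCAG 1.4.3: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Black on white yields the maximum ratio of 21:1, while a typical "light gray on white" failure such as #777777 on white comes out just under 4.5:1—it fails for normal text but passes for large text.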
Why Automated Checkers Miss 60% of Accessibility Issues
Adobe Acrobat Pro has a built-in accessibility checker. So does PAC 2024 (the PDF Accessibility Checker from the PDF/UA Foundation). These tools are essential, but they're also dangerously incomplete. In my audits, automated tools catch about 40% of accessibility issues. The other 60% require human judgment.

Here's what automated checkers do well:
- Detect missing alt text on images
- Identify untagged content
- Check if document language is set
- Verify that form fields have names
- Detect color contrast failures (with limitations)
- Check if the document title is set

Here's what they miss:
- Whether alt text is actually descriptive and useful
- Whether heading levels are used logically
- Whether reading order makes sense
- Whether table headers are correctly associated with data cells
- Whether link text is meaningful in context
- Whether the document structure matches the visual presentation

The most dangerous failures are the ones where automated checkers say "pass" but the document is still unusable. I call these "false passes," and they're everywhere.

Example: A document has alt text on every image. Automated checker says "pass" for WCAG 1.1.1. But the alt text is useless: "IMG_2847.jpg" or "graphic" or "logo." Technically present, functionally worthless. A screen reader user learns nothing.

Example: A document uses heading tags. Automated checker says "pass" for heading structure. But the headings skip levels (H1, H3, H5) or are used for visual styling rather than document structure. A screen reader user can't navigate effectively.

Example: A table has header cells marked. Automated checker says "pass" for table structure. But the headers aren't properly scoped, so screen reader users hear "cell, cell, cell" instead of "Quarter 1, Revenue, $2.4M." The table is incomprehensible.

I once audited a 200-page PDF that passed Adobe's accessibility checker with zero errors. It was completely unusable with a screen reader.
The reading order jumped randomly between columns. Alt text was present but nonsensical. Headings were tagged but used decoratively rather than structurally. The automated checker saw tags and said "looks good!" A human tester couldn't make it past page 3.

"Our vendor delivered 500 'accessible' PDFs that all passed automated checks. We paid $75 per page for remediation. Then we got a complaint from a blind employee who couldn't use any of them. We tested with a screen reader and discovered they were all unusable. The vendor had optimized for passing automated checks, not for actual accessibility."

This is why I always include manual testing in my audits. I open every PDF with NVDA or JAWS and actually try to use it. I navigate by headings. I tab through forms. I listen to table content. I check if the reading order makes sense. This catches the 60% of issues that automated tools miss.

The solution isn't to abandon automated checkers—they're valuable for catching obvious technical failures. The solution is to understand their limitations and supplement them with human testing. Run the automated check first to catch the easy stuff, then spend your time on the nuanced issues that require judgment.
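Some false passes can at least be narrowed with heuristics. As an illustrative sketch (the pattern list and length threshold are my own assumptions, not part of any checker), here's a filter that flags alt text which is technically present but functionally worthless:

```python
import re

# Patterns that match placeholder alt text: bare element names like
# "image" or "logo", filenames, and camera-style names like IMG_2847.
GENERIC_ALT_PATTERNS = [
    r"^(image|graphic|photo|picture|logo|chart|icon)\s*\d*$",
    r"\.(jpe?g|png|gif|tiff?|bmp|svg)$",
    r"^(img|dsc)[_\-]?\d+",
]

def is_useless_alt_text(alt: str) -> bool:
    """Heuristic: empty, placeholder-like, or too short to describe anything."""
    text = alt.strip().lower()
    if len(text) < 12:  # arbitrary cutoff; catches "Chart" and "Bar chart"
        return True
    return any(re.search(pattern, text) for pattern in GENERIC_ALT_PATTERNS)
```

This would flag "IMG_2847.jpg", "graphic", and "Bar chart" while letting a genuine description through; it cannot judge whether a long description is accurate, which is exactly why the manual screen reader pass remains necessary.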
The Source Document Matters More Than You Think
Most organizations focus their accessibility efforts on the PDF itself. They export from Word or InDesign, then try to fix accessibility issues in Acrobat Pro. This is backwards and expensive. The accessibility of your PDF is 80% determined by the source document before you ever export.

If your Word document doesn't use heading styles, your PDF won't have proper heading structure. You can add it manually in Acrobat Pro, but it takes 10 times longer than using heading styles in Word from the start. If your InDesign document doesn't have a logical article order, your PDF reading order will be scrambled. You can fix it manually in Acrobat Pro, but it requires touching every single content block in the document. If your source document doesn't have alt text on images, your PDF won't either. You can add it in Acrobat Pro, but you've lost the context of why the image was included and what it's meant to communicate.

Here's the time investment comparison for a typical 50-page document:

Accessibility built into the source document:
- Set up heading styles: 5 minutes (one-time)
- Apply heading styles while writing: 0 minutes (natural workflow)
- Add alt text to images: 15 minutes
- Structure tables properly: 10 minutes
- Export with accessibility settings: 2 minutes
- Check and fix minor issues in Acrobat Pro: 15 minutes
- Total: 47 minutes

Accessibility retrofitted in Acrobat Pro:
- Export without accessibility settings: 1 minute
- Add heading tags manually: 90 minutes
- Add alt text to images: 20 minutes
- Fix reading order: 120 minutes
- Fix table structure: 45 minutes
- Add bookmarks: 30 minutes
- Fix tab order: 20 minutes
- Total: 326 minutes

That's a 7x time difference. For a 50-page document, building accessibility into the source saves you 4.5 hours. For a 200-page document, it saves you 18 hours. For an organization producing 1,000 documents per year, it saves you 4,650 hours—more than two full-time employees. The source document approach also produces better results.
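The savings figures follow directly from the two totals and can be sanity-checked:

```python
# All figures come from the time comparison above.
source_minutes = 5 + 0 + 15 + 10 + 2 + 15            # builds to 47
retrofit_minutes = 1 + 90 + 20 + 120 + 45 + 30 + 20  # builds to 326

ratio = retrofit_minutes / source_minutes             # just under 7x
saved_hours_per_doc = (retrofit_minutes - source_minutes) / 60
annual_saved_hours = saved_hours_per_doc * 1000       # for 1,000 docs/year

print(round(ratio, 1), round(saved_hours_per_doc, 2), annual_saved_hours)
```

326 / 47 is roughly 6.9x, the per-document saving is 4.65 hours (the "4.5 hours" above is a round number), and 1,000 documents a year works out to 4,650 hours.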
When you structure content properly in Word or InDesign, the semantic meaning is preserved through export. When you retrofit structure in Acrobat Pro, you're guessing at the author's intent and often get it wrong.

I worked with a university that was spending $200,000 per year on PDF remediation. They had a vendor who would take their exported PDFs and make them accessible in Acrobat Pro. The vendor was good at their job, but they were solving the wrong problem. I trained the university's content creators on accessible Word document techniques: heading styles, alt text, table structure, export settings. The training took 3 hours. Within six months, 85% of their PDFs were accessible at export and required minimal remediation. Their annual remediation cost dropped to $30,000. They saved $170,000 per year by investing 3 hours in training.

The 7-Step Process for Creating Accessible PDFs
Here's the exact process I use when creating accessible PDFs. This works whether you're starting from Word, InDesign, or another authoring tool. Each step catches specific categories of failures.

Step 1: Structure your source document properly

Before you write a single word, set up your document structure:
- Create and use heading styles (Heading 1, Heading 2, Heading 3) for all headings
- Use the built-in list tools for bulleted and numbered lists
- Use the table tool for tabular data, and mark header rows
- Use styles for body text, captions, and other text elements
- Don't use empty paragraphs for spacing—use paragraph spacing settings
- Don't use spaces or tabs for indentation—use paragraph indentation settings

This step prevents 78% of heading structure failures, 38% of list structure failures, and 49% of table structure failures. It takes 5 minutes to set up and saves hours of remediation.

Step 2: Add alternative text to all meaningful images

For every image, chart, graph, or diagram:
- Right-click the image and select "Edit Alt Text" (Word) or "Object Export Options" (InDesign)
- Write a description that conveys the information or function of the image
- For decorative images, mark them as decorative (Word) or leave alt text empty (InDesign)
- For complex images like charts, provide a brief alt text and include a longer description in the document text

Good alt text: "Bar chart showing quarterly revenue growth from Q1 2023 ($2.1M) to Q4 2023 ($3.8M), with steady increases each quarter."

Bad alt text: "Chart" or "Image1.png" or "Bar chart"

This step prevents 71% of alternative text failures. It takes about 30 seconds per image and must be done by someone who understands the content.
Step 3: Write meaningful link text

For every hyperlink:
- Use descriptive text that makes sense out of context
- Avoid "click here," "read more," or "learn more"
- Include the destination or purpose in the link text

Good link text: "Download the 2023 Annual Report (PDF, 2.4MB)"

Bad link text: "Click here to download the report"

This step prevents 43% of link text failures. It takes zero extra time if you do it while writing.

Step 4: Set document properties

Before exporting:
- Set the document title (not the filename, the actual title)
- Set the document language
- Set the author and subject if relevant

In Word: File > Info > Properties. In InDesign: File > File Info.

This step prevents 68% of document language failures and 35% of document title failures. It takes 30 seconds.

Step 5: Export with accessibility settings enabled

This is where most people fail. The default export settings in most applications don't preserve accessibility structure.

In Word:
- File > Save As > PDF
- Click "Options"
- Check "Document structure tags for accessibility"
- Check "Create bookmarks using: Headings"
- Click OK and save

In InDesign:
- File > Export > Adobe PDF (Print)
- In the export dialog, check "Create Tagged PDF"
- Click "View PDF after Exporting" to immediately check the result
- In the Advanced panel, set "Reading Order" to "Use Document Structure"

This step is critical. If you skip it, everything you did in steps 1-4 is lost. The export takes the same amount of time either way—you're just checking different boxes.

Step 6: Check accessibility in Acrobat Pro

Open the PDF in Acrobat Pro and run the accessibility checker:
- Tools > Accessibility > Full Check
- Select "WCAG 2.1" as the standard
- Select "Level AA" as the conformance level
- Click "Start Checking"

Review the results.
Common issues that need manual fixing:
- Reading order in complex layouts
- Table header associations in complex tables
- Alt text that was lost during export
- Tab order settings

This step catches the issues that slipped through despite your best efforts. It takes 5-15 minutes for a typical document.

Step 7: Test with a screen reader

This is the step most people skip, and it's the most important. Open the PDF with NVDA (free) or JAWS and actually try to use it:
- Navigate by headings (H key in NVDA/JAWS)
- Tab through interactive elements
- Listen to table content
- Check if the reading order makes sense

If you can't navigate effectively with a screen reader, neither can your users. Fix the issues you find. This step catches the 60% of issues that automated checkers miss. It takes 5-10 minutes for a typical document, and it's the only way to know if your PDF is actually usable.

Common Failures and How to Fix Them
Let me walk through the specific failures I see most often and the exact steps to fix them. These are real examples from my audits, with identifying details changed.

Failure: Heading structure is visual, not semantic

What I see: Text that looks like headings (large, bold, different color) but isn't tagged as headings. Or headings that skip levels (H1, H3, H5).

Why it fails: Screen reader users navigate by headings. If headings aren't properly tagged, they can't jump to sections. If levels are skipped, the document structure is confusing.

How to fix in Word: Select the text, apply Heading 1/2/3 styles from the Styles gallery. Don't just make text bigger and bold.

How to fix in Acrobat Pro: Tools > Accessibility > Reading Order > Show Order Panel. Select the text, right-click, choose the appropriate heading level. This is tedious for long documents—better to fix it in Word.

Failure: Alt text is missing or generic

What I see: Images with no alt text, or alt text like "image," "graphic," "logo," or the filename.

Why it fails: Screen reader users hear nothing, or hear useless information. They don't know what the image shows or why it's included.

How to fix in Word: Right-click image > Edit Alt Text. Write a description that conveys the information or function. For decorative images, check "Mark as decorative."

How to fix in Acrobat Pro: Tools > Accessibility > Set Alternate Text. Select each image and add a description. This works, but you've lost the context of why the image was included.

Failure: Reading order is scrambled

What I see: Multi-column layouts where the reading order jumps between columns randomly. Sidebars that interrupt the main content. Headers and footers that appear in the middle of the content.

Why it fails: Screen reader users hear content in the wrong order, making the document incomprehensible.

How to fix in InDesign: Use the Articles panel to define reading order before export. This preserves the correct order in the PDF.
How to fix in Acrobat Pro: Tools > Accessibility > Reading Order. Drag content blocks into the correct order. This is extremely tedious for complex layouts—better to fix it in InDesign.

Failure: Tables lack proper structure

What I see: Tables where header cells aren't marked, or complex tables where headers aren't associated with data cells.

Why it fails: Screen reader users hear "cell, cell, cell" without context. They don't know what each number represents.

How to fix in Word: Select the header row > Table Design > Header Row. For complex tables, right-click cells > Table Properties > Alt Text to add descriptions.

How to fix in Acrobat Pro: Tools > Accessibility > Reading Order. Right-click the table > Table Editor. Mark header cells and set scope (row/column). This is complex and error-prone—better to fix it in Word.

Failure: Document language isn't set

What I see: PDFs where the language property is blank or set to the wrong language.

Why it fails: Screen readers use the wrong pronunciation rules, making the document difficult or impossible to understand.

How to fix in Word: Review > Language > Set Proofing Language. Select the correct language before exporting.

How to fix in Acrobat Pro: File > Properties > Advanced > Language. Select the correct language. This is easy to fix after the fact.

Failure: Color contrast is insufficient

What I see: Light gray text on white backgrounds, colored text on colored backgrounds, text overlaid on images.

Why it fails: Users with low vision or color blindness can't read the text.

How to fix in the source document: Use a color contrast checker (like WebAIM's tool) to verify a 4.5:1 ratio for normal text and 3:1 for large text. Adjust colors before exporting.

How to fix in Acrobat Pro: You can't. Color contrast must be fixed in the source document. This is why source document accessibility matters.

The 15-Minute Accessibility Check That Catches 85% of Issues
You don't have time to spend hours checking every PDF. I get it. Here's the streamlined check I use when I need to quickly assess a PDF's accessibility. This 15-minute process catches 85% of issues and tells you whether a document needs serious remediation or just minor fixes.

Minutes 1-3: Automated check

Open the PDF in Acrobat Pro. Tools > Accessibility > Full Check. Select WCAG 2.1 Level AA. Click Start Checking. Scan the results for:
- Failed items (red X) - these are definite problems
- Items needing manual check (question mark) - these need human review
- Passed items (green checkmark) - these are probably fine

If you see more than 10 failed items, the document needs significant work. If you see fewer than 5, it might just need minor fixes.

Minutes 4-6: Heading navigation test

Open the PDF with NVDA (free screen reader). Press H repeatedly to navigate by headings. Check:
- Do headings appear in logical order?
- Do heading levels make sense (H1 for title, H2 for main sections, H3 for subsections)?
- Can you understand the document structure from headings alone?

If headings are missing, out of order, or skip levels, note this as a major issue.

Minutes 7-9: Reading order test

Still in NVDA, press Down Arrow to read through the document. Check:
- Does content read in logical order?
- Do multi-column layouts read correctly (left column, then right column)?
- Are headers, footers, and sidebars in appropriate places?

If reading order is scrambled, note this as a major issue.

Minutes 10-12: Image and table test

Navigate to images and tables in NVDA. Check:
- Do images have meaningful alt text? (NVDA announces the alt text)
- Do tables have header cells? (NVDA announces "table with X columns and Y rows")
- Can you understand table content? (NVDA announces headers with each cell)

If images lack alt text or tables lack structure, note these as issues.

Minutes 13-15: Interactive element test

Press Tab to move through links and form fields.
Check:
- Is link text meaningful? (NVDA announces the link text)
- Do form fields have labels? (NVDA announces the label)
- Is tab order logical?

If links say "click here" or form fields lack labels, note these as issues.

The verdict: After 15 minutes, you'll know:
- Whether the document is usable with a screen reader (the ultimate test)
- Which specific issues need fixing
- Whether you need 30 minutes of fixes or 3 hours of remediation

This quick check has saved my clients thousands of hours. Instead of doing deep audits on every PDF, they do this 15-minute check first. Documents that pass get published. Documents that fail get prioritized for remediation based on importance and usage.

One federal agency I worked with had 8,000 PDFs on their website. They couldn't possibly audit all of them. We trained their staff on this 15-minute check. They checked their 200 most-downloaded documents first. 147 failed. They remediated those 147 documents, making their most important content accessible. The other 7,800 documents got checked as they were updated or when users requested them. Pragmatic, effective, and focused on actual impact.
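The triage rule can be written down as a sketch. The function name and the middle "moderate fixes" band are my own additions (the text only defines the "more than 10" and "fewer than 5" thresholds); major_issues counts problems found in the manual NVDA tests:

```python
def triage(failed_items: int, major_issues: int) -> str:
    """Classify a PDF after the 15-minute check.

    failed_items: red-X count from the automated checker.
    major_issues: scrambled reading order, broken heading navigation,
    and similar problems found during the manual screen reader tests.
    """
    if major_issues > 0 or failed_items > 10:
        return "significant remediation"
    if failed_items >= 5:
        return "moderate fixes"  # band not specified in the text
    if failed_items > 0:
        return "minor fixes"
    return "publish"
```

Note that any manual-test failure outranks a clean automated result, which is the whole point of the 15-minute check: a zero-error automated report with a scrambled reading order still means remediation.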