Universal Design or Universal Barrier? Rethinking AI Infrastructure on Campus

Many colleges and universities continue to operate with fragmented support systems, where accessibility responsibilities are confined to specific offices or compliance units. This siloed approach inhibits a cohesive strategy for digital inclusion. For instance, instructional design teams may not coordinate with disability services or IT departments to ensure that AI-powered tools align with accessibility standards. As a result, a new advising chatbot or AI-based learning platform might launch without screen reader compatibility or support for text-to-speech, effectively excluding students who rely on those tools. A 2020 survey by EDUCAUSE found that only 38 percent of institutions had a centralized digital accessibility policy, illustrating a widespread lack of institutional alignment around inclusive technology deployment [1].

Outdated digital infrastructure adds another layer of difficulty. Legacy systems, particularly student information systems and learning management platforms, often lack APIs or integration capabilities needed to support adaptive AI tools. When these systems are not interoperable, AI cannot access the data required to personalize learning experiences or respond to support needs in real time. This disproportionately affects nontraditional learners who rely on flexible, tech-mediated pathways. Without updated infrastructure, even well-intentioned AI initiatives may perpetuate existing inequities by default rather than design.

How AI Exacerbates Inaccessibility Without Governance

Artificial intelligence does not operate in a vacuum. It reflects and amplifies the systems it is embedded in. When implemented in environments with inconsistent accessibility standards, AI tools can reinforce exclusion. For example, automated essay scoring tools trained on standard English syntax may penalize multilingual learners or neurodiverse students whose writing deviates from normative patterns. In the absence of human oversight and inclusive training data, these systems risk codifying bias rather than supporting learning [2].

Poor governance compounds the problem. Many institutions lack formal AI governance structures that include accessibility experts, student advocates, and faculty from diverse backgrounds. Without interdisciplinary oversight, decisions about AI deployment may prioritize efficiency or novelty over inclusivity. A study by the Center for Democracy & Technology found that only 26 percent of higher education institutions had evaluated the equity implications of AI tools before implementation [3]. This gap in governance leaves students with disabilities and other marginalized groups exposed to tools that were never designed with them in mind.

Designing Forward with Inclusive Infrastructure

Inclusive AI design begins with universal design principles, which aim to create environments usable by all people, to the greatest extent possible, without the need for adaptation. This means ensuring that all AI interfaces support screen readers, keyboard navigation, and alternative input modes such as voice commands or switch devices. These features must be built in from the outset, not added after accessibility complaints arise. Web Content Accessibility Guidelines (WCAG) 2.1 provide a technical foundation, but institutions must go further by integrating user testing with students from diverse ability and language backgrounds [4].
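To make the WCAG baseline concrete, the sketch below shows what an automated first-pass check on an AI tool's web interface might look like. It flags two common failures, images without alt text and form inputs without associated labels, using only Python's standard library. This is a deliberately simplified illustration (it ignores `aria-label` and many other valid labeling techniques), not a substitute for a full WCAG 2.1 audit or user testing with assistive technologies.

```python
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags two common WCAG 2.1 failures in an HTML fragment:
    images without alt text and form inputs without a <label for=...>.
    A simplified teaching example, not a complete conformance checker."""

    def __init__(self):
        super().__init__()
        self.issues = []        # problems found so far
        self.labeled_ids = set()  # ids referenced by <label for="...">
        self.input_ids = []     # ids of inputs that need labels

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("img missing alt text")
        elif tag == "label" and attrs.get("for"):
            self.labeled_ids.add(attrs["for"])
        elif tag == "input" and attrs.get("type") not in ("hidden", "submit"):
            self.input_ids.append(attrs.get("id"))

    def report(self):
        issues = list(self.issues)
        for input_id in self.input_ids:
            if input_id not in self.labeled_ids:
                issues.append(f"input {input_id!r} has no associated label")
        return issues

def check_fragment(html):
    """Run the checker over an HTML fragment and return a list of issues."""
    checker = AccessibilityChecker()
    checker.feed(html)
    return checker.report()
```

A chatbot widget that renders a chart image without alt text and an unlabeled email field would surface both issues here, giving a procurement or audit team a concrete starting point before manual screen reader testing.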

Multilingual design is equally essential. AI-powered chatbots, tutoring platforms, and advising systems should support multiple languages and dialects, especially in institutions serving immigrant, refugee, and first-generation college populations. Real-time language translation, culturally responsive prompts, and the ability to toggle between languages can significantly expand access. AI tools must also be monitored by human staff trained in inclusive pedagogy to ensure that automated responses align with institutional values and student needs. This hybrid model of automation and human support is key to maintaining quality and equity.

Accessibility as a Driver of Personalized Student Support

When designed inclusively, AI can power innovations in micro-learning, tailored advising, and just-in-time academic interventions. For example, AI-driven platforms can deliver bite-sized learning modules in multiple formats - video, text, audio - allowing students to engage in the mode that suits their learning style or disability. These platforms can also track engagement and suggest follow-up content, creating a dynamic feedback loop that supports retention. However, these benefits only materialize when accessibility is part of the design criteria, not an afterthought.
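One way to make format choice an explicit design criterion, rather than an afterthought, is to store every module in all formats and select against a student's ordered accessibility preferences. The sketch below assumes a simple module record; the field names and fallback policy are illustrative.

```python
# Hypothetical micro-learning module offering the same content
# in three formats, as described above.
MODULE = {
    "title": "Intro to Library Research",
    "formats": {"video": "mod1.mp4", "text": "mod1.html", "audio": "mod1.mp3"},
}

def pick_format(module, preferences):
    """Return the first available format from the student's ordered
    preference list (e.g., drawn from an accessibility profile)."""
    for fmt in preferences:
        if fmt in module["formats"]:
            return fmt, module["formats"][fmt]
    # Fall back to text: the most broadly screen-reader-friendly baseline.
    return "text", module["formats"]["text"]
```

The key design choice is that the student's profile, not the platform's default, drives delivery; the text fallback guarantees no preference combination ever yields nothing.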

Student advising also stands to gain from accessible AI. Virtual advisors that integrate academic data with financial aid, scheduling, and wellness metrics can provide holistic support, but only if all students can use them. That includes those using screen readers, nonvisual interfaces, or mobile-only access. By embedding accessibility into the core of these systems, institutions can reduce advising bottlenecks, improve early alerts for at-risk students, and create more equitable pathways through degree programs. Accessibility is not a constraint on innovation - it is the condition that enables it.

Accessibility Audits and Cross-Functional Task Forces

Institutions committed to equitable AI implementation should begin with an accessibility design audit of their current and planned AI tools. This audit should evaluate not only compliance with legal standards, but also usability across different student populations. Key audit areas include interface compatibility, multilingual support, integration with assistive technologies, and the presence of bias in training data. Teams should involve students with disabilities, multilingual learners, and first-generation students in the audit process to ensure insights reflect real user experiences.
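The audit areas above can be operationalized as a rubric that every tool must be scored against before procurement or pilot approval. The sketch below encodes that idea; the criterion names mirror the audit areas in the paragraph, but the pass/fail scheme is an assumption, not an institutional standard.

```python
# Criterion names follow the audit areas described above;
# the scoring model itself is an illustrative simplification.
AUDIT_CRITERIA = [
    "interface_compatibility",      # screen readers, keyboard-only use
    "multilingual_support",
    "assistive_tech_integration",
    "training_data_bias_review",
]

def audit_summary(tool_name, results):
    """results maps each criterion to True (passes) or False (fails).
    Raises if any criterion was left unassessed; otherwise returns
    the failing criteria so the task force can prioritize remediation."""
    missing = [c for c in AUDIT_CRITERIA if c not in results]
    if missing:
        raise ValueError(f"{tool_name}: unassessed criteria: {missing}")
    return [c for c in AUDIT_CRITERIA if not results[c]]
```

Raising on unassessed criteria is deliberate: a tool should not be able to pass an audit simply because a dimension, such as bias in training data, was never examined.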

To coordinate these efforts, colleges should establish an Accessibility and AI Joint Task Force composed of representatives from academic affairs, IT, disability services, student support, and institutional research. This group can oversee policy development, procurement guidelines, and pilot evaluations, ensuring that inclusive AI design is embedded across institutional functions. Regular consultation with external accessibility experts and participation in national forums such as the EDUCAUSE Accessibility Community Group can further strengthen capacity and accountability [5].

Engaging in the Work of Equitable AI Implementation

Higher education institutions should actively participate in ongoing discussions about equitable AI design and implementation. Organizations such as the Association on Higher Education and Disability (AHEAD) and the National Center for College Students with Disabilities (NCCSD) regularly host webinars, workshops, and policy roundtables where practitioners can learn from each other’s experiences. Engaging in these spaces allows teams to stay current with evolving standards and emerging tools, while building a shared vocabulary around inclusive innovation [6].

Equally important is empowering students to co-create AI solutions. Student advisory boards focused on accessibility and technology can serve as critical partners in identifying priorities, testing prototypes, and shaping feedback loops. When students are treated as collaborators rather than end-users, AI initiatives are more likely to reflect the complexity of real-world needs. Participation ensures that accessibility is not a static checklist but a living practice grounded in community engagement and continuous improvement.

Reinforcing the Core Principle: Access as a Prerequisite

The central premise remains: if students cannot access the tool, the tool has already failed. This is not a philosophical argument - it is a practical one. AI systems in higher education exist to extend learning, streamline support, and personalize the student experience. But when those systems exclude users, they produce the opposite effect, creating new barriers under the guise of innovation. Accessibility is not a luxury or an add-on; it is the structural integrity of any digital transformation.

AI can scale support, but only if every student can reach the doorway. Institutions that build with accessibility as a foundational principle will unlock the full potential of these technologies. Those that delay or delegate this responsibility will find that even the most advanced tools collapse on contact. The path forward requires intentional design, cross-disciplinary collaboration, and an unwavering commitment to equity from the start.

Bibliography

  1. EDUCAUSE. “The Higher Education IT Workforce Landscape, 2020.” EDUCAUSE Review, March 2020. https://er.educause.edu/articles/2020/3/the-higher-education-it-workforce-landscape-2020.

  2. Burrell, Jenna. “How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms.” Big Data & Society 3, no. 1 (2016): 1-12. https://doi.org/10.1177/2053951715622512.

  3. Center for Democracy & Technology. “AI and Student Monitoring in Higher Education: Key Findings and Recommendations.” CDT Research Report, 2021. https://cdt.org/insights/ai-and-student-monitoring-in-higher-education/.

  4. World Wide Web Consortium (W3C). “Web Content Accessibility Guidelines (WCAG) 2.1.” June 2018. https://www.w3.org/TR/WCAG21/.

  5. EDUCAUSE. “Accessibility Community Group.” EDUCAUSE, 2023. https://www.educause.edu/community/accessibility-community-group.

  6. National Center for College Students with Disabilities. “Resources and Events.” NCCSD, 2023. https://www.nccsdonline.org.
