AI-POWERED ACCESSIBILITY
Deaf and Hard of Hearing Access to Intelligent Personal Assistants: Comparison of Voice-Based Options with an LLM-Powered Touch Interface
This study explores the usability of voice-based and LLM-assisted touch interfaces for Intelligent Personal Assistants (IPAs) among Deaf and Hard of Hearing (DHH) individuals, highlighting critical accessibility challenges and potential solutions for enhancing DHH interaction with AI.
Executive Impact & Key Findings
Our analysis reveals the nuanced performance and user perceptions of different IPA interaction methods for DHH users, identifying promising avenues for AI-driven accessibility improvements and areas requiring further development.
Deep Analysis & Enterprise Applications
Study Design: Comparison of Interaction Conditions
| Feature | Natural Deaf Speech | WoZ Facilitated English | LLM-Assisted Touch |
|---|---|---|---|
| Input Type | User's Voice | Facilitator Re-speaking | Touch/Text with LLM |
| ASR Dependence | High (Built-in Alexa ASR) | High (Facilitator as 'Best-case ASR') | Low (Text-to-Speech) |
| Context-Awareness | None | None | High (LLM-powered) |
| Latency | Low (Direct) | High (Facilitator delay) | Medium (LLM processing + TTS) |
| Key User Feedback | Mixed: "impressive" when recognized, but some deaf voices not recognized at all | Accurate, but delays and uncertainty about the facilitator's role | Easy to use, but latency and limited on-screen options |
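The touch condition's low ASR dependence comes from reversing the usual pipeline: the user taps or types, an LLM proposes context-aware command options, and text-to-speech voices the selected command to the assistant, so the user's own voice never has to be recognized. A minimal sketch of that loop, with all names and the stubbed LLM step purely illustrative (the study's actual system is not reproduced here):

```python
from dataclasses import dataclass, field


def to_speech(text):
    # Stand-in for a TTS call that voices the chosen command to the IPA's speaker.
    return f"[TTS] {text}"


@dataclass
class TouchSession:
    """Illustrative loop: touch/text in, LLM-suggested options, TTS out."""
    history: list = field(default_factory=list)

    def suggest(self):
        # Stub for the LLM step: a real system would send the interaction
        # history (and current screen state) to an LLM and parse its
        # suggestions -- that step is what makes the options context-aware.
        if not self.history:
            return ["What's the weather?", "Set a timer for 10 minutes"]
        return [f"Follow up on: {self.history[-1]}"]

    def select(self, option):
        # The chosen option is spoken to the assistant via TTS, so no ASR
        # ever has to handle the user's own voice.
        self.history.append(option)
        return to_speech(option)


session = TouchSession()
print(session.suggest())                              # history-free starter options
print(session.select("Set a timer for 10 minutes"))   # voiced to the IPA via TTS
print(session.suggest())                              # now conditioned on history
```

The stub makes the trade-off in the table concrete: latency moves from ASR to the LLM-suggestion step, which is exactly where participants reported delays.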
| Metric | Natural Deaf Speech | WoZ Facilitated English | LLM-Assisted Touch |
|---|---|---|---|
| Mean SUS Score | 59.6 | 62.5 | 63.5 (highest, though differences were not statistically significant) |
| Mean Adjective Score | 5.0 | 4.8 | 5.15 (highest; all three conditions fell in the "OK" to "Good" range) |
| Observed NPS Score | -5 (much higher than the expected -36, indicating surprising user enthusiasm when recognition worked) | -40 | -10 |
| WER (Natural Deaf Speech only) | 0.61%–30.91% (excluding 2 participants with 100% WER and roughly half with 0%) | N/A | N/A |
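WER in the table is the standard word error rate: the word-level edit distance (substitutions + deletions + insertions) between the ASR transcript and the reference, divided by the reference length. A minimal sketch of the computation (the study's exact scoring pipeline and any text normalization are not reproduced here):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (Levenshtein).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # all deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # all insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / len(ref)


# Hypothetical utterance: "the" dropped and "lights" misheard -> 2 edits / 5 words = 0.4
print(wer("turn on the kitchen lights", "turn on kitchen light"))
```

A WER of 1.0 (100%), as observed for two participants, means the transcript shares no aligned words with the reference; 0.0 means a perfect transcript.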
Mixed Reactions to Voice Input
Participants were split on their opinions regarding voice input. Some expressed "surprise at how well the device understood me" (P17), finding it "impressive" (P6, P9) when Alexa accurately recognized their natural deaf speech. In contrast, others "struggled" due to "my deaf voice wasn't recognized" (P16) or found "voicing is hard for me" (P11). The Wizard-of-Oz method, while offering high "accuracy" (P16), also introduced undesirable "delays" (P7, P24) and led to "uncertainty" (P14) about the facilitator's role.
Positive but Latency-Affected Touch Interface
The LLM-assisted touch interface generally received positive feedback for its ease of use. Participants noted it was "easy to use, and I saw what the options were" (P4, P5), and served as a valuable solution when Alexa "didn't understand" (P11) their speech. However, key concerns included the "latency" (P6, P18) in LLM responses and instances of "limited options" (P19, P10) on the UI. Users also expressed a desire for more "control over data retention" (P7, P20) and the ability to customize or delete interaction history.
Methodological & Sampling Constraints
The study faced several limitations. The Wizard-of-Oz methodology for facilitated speech represents a 'best-case' scenario, with a human interpreter likely outperforming current ASR technologies, potentially skewing usability perceptions. The participant sample, consisting of DHH individuals who use spoken English but also ASL, may not fully represent the diverse non-signing DHH population, necessitating broader sampling for future work. Additionally, the LLM's unpredictability required extensive priming with example tasks, which might not reflect spontaneous real-world use. Future research should also test commercial re-speaking solutions to better isolate system training effects.
Our Phased AI Implementation Roadmap
Our strategic phased approach ensures a smooth, effective, and impactful integration of AI accessibility solutions into your enterprise.
01 Discovery & Assessment
Conduct a comprehensive audit of existing communication workflows and DHH user needs. Define key performance indicators (KPIs) and project scope. Deliverable: Detailed Accessibility Needs Report.
02 Solution Design & Customization
Develop tailored LLM-assisted interfaces and integrate advanced ASR for deaf-accented speech. Focus on multimodal input options (touch, voice, ASL if feasible) and context-aware interactions. Deliverable: Prototype UI/UX & Technical Specification.
03 Pilot Program & User Feedback
Implement the solution in a pilot environment with DHH employees. Gather extensive usability feedback and iterate on design and functionality to optimize for real-world scenarios. Deliverable: Pilot Program Report & Refined Solution.
04 Full-Scale Deployment & Training
Roll out the accessible AI solution across the organization. Provide comprehensive training and support for all users to ensure high adoption rates and maximize impact. Deliverable: Production System & Training Program.
05 Continuous Improvement & Scaling
Monitor performance, gather ongoing feedback, and continuously update the AI models to adapt to evolving user needs and technological advancements. Explore integration with new devices and platforms. Deliverable: Ongoing Performance Reviews & Feature Updates.
Ready to Transform Your Enterprise with AI?
Don't let communication barriers hinder productivity. Partner with us to implement cutting-edge AI solutions that empower all your employees. Book a consultation to discuss a tailored strategy for your organization.