AI and Accessibility - How Can the Use of AI in Digital Collaboration Empower People with Disabilities (PwD)?

Automatic Summary

AI and Accessibility: Empowering People with Disabilities

Hello everyone,
Today, I'm excited to talk about how artificial intelligence can enhance accessibility, particularly in digital collaboration environments. I'm Priyanka, a UX researcher based in Seattle, Washington, passionate about user-centric design. I advocate that accessibility should always be at the forefront of our considerations when designing products, as it enhances usability for all users, not just those with disabilities.

Let's dive in.

What is Accessibility and Why is it important?

Accessibility is removing barriers to ensure that all people, including those with disabilities, can perceive, understand, operate and interact effectively with digital content, products, and environments. It is grounded in principles of inclusivity, usability, and equality. It is about creating a world where everyone has equal access and opportunity, regardless of their abilities or differences.

Accessibility in digital interfaces is guided by four main principles:

  1. Perceivable: Information should be presented in a perceivable format for users.
  2. Operable: Products and interfaces should be usable by individuals with a wide range of abilities.
  3. Understandable: User interfaces should be easy to understand and navigate.
  4. Robust: Digital interfaces should be compatible with a variety of assistive technologies and devices.
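
These four principles (the POUR model from the Web Content Accessibility Guidelines) lend themselves to simple automated checks. As an illustrative sketch only, and not part of any tool mentioned in this talk, here is a stdlib-only Python audit that flags `<img>` elements missing alt text, a basic "perceivable" violation:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack alt text, a basic 'perceivable' audit."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no usable alt attribute

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An absent or empty alt attribute leaves screen-reader users
            # with no description of the image. (Note: intentionally
            # decorative images may legitimately use alt=""; a real audit
            # would distinguish that case.)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<unknown>"))

def find_images_missing_alt(markup: str) -> list[str]:
    checker = MissingAltChecker()
    checker.feed(markup)
    return checker.missing

snippet = '<img src="chart.png"><img src="logo.png" alt="Company logo">'
print(find_images_missing_alt(snippet))  # only chart.png is flagged
```

Checks like this are the kind of thing accessibility linters run automatically; the function and markup here are invented for the example.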

Role of AI in Accessibility

The inclusion of AI in digital collaboration tools has made a significant impact on productivity and efficiency. It has ensured that information, communication, and collaboration opportunities are accessible to all users, regardless of their abilities, thereby revolutionizing digital collaboration.

Let's discuss different ways AI is currently enhancing accessibility:

  • Multimedia Captioning: AI algorithms generate captions for audio and video content in real time.
  • Speech Recognition: These systems analyze and interpret spoken language, converting it to text or executable commands.
  • Image Recognition: AI identifies objects, people, and scenes within images and provides descriptive text or audio descriptions.
  • Screen Readers: These tools convert written text into spoken words, making digital content accessible to those who rely on auditory input.
  • Summarizing Information: AI-powered assistants summarize meetings, messages, or any digital content, making it more accessible and consumable for users.
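
To make the last point concrete, here is a minimal, hypothetical sketch of extractive summarization using word frequencies. Real collaboration assistants use far more capable language models, so treat this only as an illustration of the idea of condensing content for faster consumption:

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> str:
    """Frequency-based extractive summary: keep the sentences whose words
    occur most often overall. A rough stand-in for the LLM-based
    summarization that real collaboration tools provide."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        # Average frequency of the sentence's words across the whole text.
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:max_sentences]
    # Preserve the original order of the chosen sentences.
    return " ".join(s for s in sentences if s in top)
```

For a screen reader user, reading a two-sentence summary instead of a long thread is exactly the time savings described in the talk.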

Enhancing AI Collaboration for Specific Disability Profiles

Given time constraints, let's focus on three disability categories: blind users, neurodiverse users, and individuals with upper body mobility disabilities. Here are some innovative ways to use AI to improve their experiences with collaboration tools:

Blind Users

Here are some proposed AI solutions:

  • Context-Aware Navigation: AI suggests easier ways for users to find features within the product through auditory feedback.
  • Intelligent Summarization: AI-powered summarization tailors information to the preferences of blind users.
  • Emotion Recognition in Video Calls: AI algorithms analyzing facial expressions and tone of voice provide feedback on the emotional state of a participant in a digital meeting.
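
As a rough illustration of context-aware navigation, the sketch below maps an imprecise typed or dictated request onto a feature catalog and answers with a keyboard path that a screen reader could speak. The feature names and shortcuts are invented for this example, not taken from any real product:

```python
import difflib

# Hypothetical feature catalog mapping feature names to keyboard shortcuts;
# a real collaboration tool would supply this from its command registry.
FEATURES = {
    "share screen": "Ctrl+Shift+S",
    "mute microphone": "Ctrl+D",
    "open chat": "Ctrl+Shift+C",
    "raise hand": "Ctrl+Shift+R",
}

def suggest_navigation(query: str) -> str:
    """Map a possibly misspoken or mistyped request to the closest known
    feature and answer with its keyboard path, suitable for being read
    aloud as auditory feedback."""
    match = difflib.get_close_matches(query.lower(), FEATURES, n=1, cutoff=0.5)
    if not match:
        return "Sorry, I could not find that feature."
    name = match[0]
    return f"To {name}, press {FEATURES[name]}."

print(suggest_navigation("shar screen"))  # To share screen, press Ctrl+Shift+S.
```

Fuzzy matching is a deliberately simple stand-in here; a production assistant would use intent recognition rather than string similarity.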

Neurodiverse Users

Potential AI solutions include:

  • Personalized Communication Assistance: AI-driven communication tools help neurodivergent users express themselves more effectively.
  • Social Guidance: AI-powered virtual assistants offer social guidance, helping them navigate social cues.
  • Cognitive Assistance: AI-powered cognitive load management tools monitor a user's cognitive load and provide adaptive support to reduce it.

Upper Body Mobility Disabilities

AI can enhance their experience by:

  • Automatic Detection of User Preferences: AI algorithms analyze users' interaction patterns and preferences.
  • Adaptive Interface Customization: AI can dynamically adjust the interface of the collaboration tool.
  • Voice Controlled Collaboration: AI algorithms can enhance voice control features to reduce reliance on manual input devices.
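
A toy sketch of automatic preference detection: tally interaction events by input modality and report the dominant one, which the tool could then favor. The `modality:action` event format is an assumption made for this example; a production system would also weigh recency and explicit user settings:

```python
from collections import Counter

def detect_preferred_modality(events: list[str]) -> str:
    """Infer the dominant input modality from a log of interaction events.
    Events are assumed to be tagged like 'voice:mute' or 'mouse:click';
    this tagging scheme is invented for the sketch."""
    modality = Counter()
    for event in events:
        source, _, _ = event.partition(":")
        modality[source] += 1
    preferred, _ = modality.most_common(1)[0]
    return preferred

log = ["voice:mute", "voice:share", "mouse:click", "voice:open_chat"]
print(detect_preferred_modality(log))  # voice
```

Once the preferred modality is known, the adaptive-customization idea above amounts to reordering or enlarging the controls that modality reaches most easily.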

Conclusion

When we design and build products with accessibility in mind, we create better experiences for everyone. Prioritizing accessibility, therefore, not only grants equal access to specific user groups, but also enhances the overall quality and usability of products.

I'm grateful for the opportunity to share my insights on the intersection of accessibility and AI, and I look forward to further discussions in this space. Let's keep in touch and continue creating inclusive experiences for all.


Video Transcription

Okay. I think I'm gonna get started, and people can keep joining in. So hello, everybody.

I am Priyanka, and thank you so much for joining my talk today. I'm really excited to speak a little bit on the topic of my interest, which is AI and accessibility. I'll be sharing my take on how the use of AI in digital collaboration empowers people with disabilities. So before we jump into the topic, I'd like to share a bit more about who I am. So I'm Priyanka, my pronouns are she and her. I'm based in beautiful and green Seattle, Washington, and I currently work as a UX researcher with the Webex team at Cisco. I have over 5 years of experience in the field of UX research, and I've worked on tech, human resources, and fintech products.

Fun fact about me is that I'm actually an architect turned UX researcher. So I was studying users and designing spaces for them before I started studying users to learn how we design interfaces. I am very passionate about mentoring and helping junior and aspiring UX professionals break into tech and UX, so I find a lot of fulfillment doing that. And if that's something that interests you, you can find me on ADPList or just hit me up on LinkedIn. And lastly, beyond work and mentoring, I'm a passionate home cook and home decor enthusiast, and I love blogging about it, so you can find my blog on Instagram. So that's a quick sneak peek into who I am. And let's talk about what we are gonna be discussing today. So one of the key initiatives I work on as a UX researcher at Webex is accessibility research.

And I've had the opportunity to observe how people with disabilities interact with digital interfaces. That experience has taught me a lot about accessibility, inclusivity, and why it's important for designs to be accessible. So with that in mind, today I'm gonna be sharing about what accessibility is, what its importance is, the role of AI in accessibility, a couple of thoughts to just spark conversations and ideas on how AI can be used to enhance digital collaboration for people with disabilities, and then a reflection on my experience conducting accessibility research.

So I wanna get us started with a scenario. And for this scenario to be effective, I need us all to close our eyes. So for those who are sighted, please close your eyes and imagine you are blindfolded and using a laptop, like right now for this session. Keep your eyes closed. You shouldn't be able to see the laptop, the keyboard, the screen, or the interface. With your eyes closed, imagine you're wanting to open your browser and check your email. A very simple task that a lot of us do on a daily basis, except you are blindfolded. So remember, you cannot see the screen, the laptop, or your surroundings. So how would you navigate to open your browser? Naturally, since your eyes are closed at the moment, you would rely entirely on non-visual cues for feedback.

So now imagine that to navigate to your browser, you hear this voice in the background that tells you how to do that. Let's all listen with that eyes closed.

[Screen reader audio demo plays.]

So how was that? How did that feel? How many of y'all know what sound that was, and what we just played in the context of the interface? It's a screen reader, for those who don't know. It's a software application that converts digital text and graphical content into synthesized speech or braille output, allowing blind and visually impaired users to access and interact with digital interfaces. In the context of collaboration tools, screen readers play a really crucial role in providing equal access to information, communication, and collaboration opportunities. So it's a very important assistive technology and an accessibility accommodation. And so what exactly is accessibility then? Accessibility is the practice of ensuring that digital content, products, and environments are usable and accessible to all people, including those with disabilities.

It's about removing barriers and providing accommodations that enable everybody to perceive, understand, navigate, and interact with information and services effectively. Accessibility encompasses a wide range of considerations, from physical spaces to digital interfaces, and is guided by principles of inclusivity, usability, and equality. In a sense, accessibility is about creating a world where everyone has equal access and opportunity regardless of their abilities or differences. And specifically, digital accessibility refers to the practice of designing and developing digital content, products, and services in a way that ensures equal access and usability for all individuals regardless of their abilities or disabilities. And there are a couple of principles that digital accessibility depends on, and I'm gonna gloss over them quickly. So starting with perceivable, at the most basic level: users must be able to process information. Information that is not presented in a processable format is not accessible.

So it's super important for it to be perceivable. Second is operable. Products and interfaces should be operable by individuals with a wide range of abilities, including those with motor impairments or limited dexterity. This may involve providing keyboard shortcuts, ensuring adequate space between interactive elements, avoiding reliance on mouse-based interactions, and allowing voice assistance wherever necessary. Third, the interface needs to be understandable. So user interfaces should be designed in a way that is easy to understand and navigate for users with diverse cognitive abilities. This may involve using clear language, providing consistent navigation patterns, and offering tooltips or contextual help within a product. And lastly, robust. Products and digital interfaces should be robust and compatible with a variety of assistive technologies and devices commonly used by individuals with disabilities.

This may involve adhering to web accessibility standards such as the Web Content Accessibility Guidelines and testing compatibility with assistive technologies like screen readers and alternative input devices. And by incorporating all of these principles into our products and designs from the outset, designers and developers can create more inclusive and equitable experiences for all users regardless of their abilities or disabilities. And now let's jump into the role of AI in accessibility, specifically in digital collaboration. In the past couple of years, you may have noticed so many tools and features that enhance your productivity and efficiency. Tools such as Zoom and Microsoft Teams have rolled out AI assistants, like Copilot, that automate tasks such as summarizing a meeting or chat thread, helping draft emails, or finding content across a set of meetings. So it's your own personalized assistant that helps you use a product. And this innovation is especially benefiting people with disabilities and making their lives easier as well.

Recently, I had a conversation with a blind user and asked him about his usage of collaboration apps such as Google Meet, Teams, and Webex. He's very well versed with these tools and has over 15 years of experience working in the tech and finance fields. And one of the pain points that he shared, which I'll be talking about, is that he always has to go through tons and tons of messages to find the ones that are most relevant to him. He has to use a screen reader, as we heard before, and the keyboard to navigate through piles of messages and interact with the interface. But with an AI assistant, in his experience, he said that he's delighted to see how the AI summarization in chat gives him the information that's most relevant to him, or messages where he's being called out and asked for something. So this makes his work and responses faster, more efficient, and more focused.

So AI is truly revolutionizing digital collaboration and accessibility by streamlining communication, enhancing productivity, and fostering inclusivity. Next, there are a couple of ways that AI is being used to improve accessibility already, in use across products, so I'll be glossing over the current AI-powered assistive technologies. Starting with multimedia captioning, where AI algorithms can automatically generate captions for audio and video content in real time, providing users with hearing impairments access to live events, webinars, and multimedia presentations. Then speech recognition: speech recognition systems use AI algorithms to analyze and interpret spoken language, converting it into text or executable commands.

And these systems specifically enable users with mobility impairments, visual impairments, or other disabilities to control digital interfaces. Then image recognition. So image recognition helps identify objects, people, and scenes within images and provide descriptive text or audio descriptions, making the visual content accessible to users with visual impairments. Next, screen readers, or text-to-speech functionality, which converts written text into spoken words, making digital content accessible to users who prefer auditory input. And lastly, summarizing information and AI-powered assistants. So, as I mentioned before, with the AI assistants currently available, it's a helpful tool to summarize meetings, messages, or any digital content in general, making it more accessible and consumable for users. And now that we have learned a little bit about accessibility and how AI is currently helping digital accessibility, I'd like to gloss over a few examples, food for thought, and ideas of how AI collaboration can be enhanced for a couple of disability profiles.

In the interest of time, I will be sharing thoughts on blind users, neurodiverse users, and users with upper body mobility disabilities. And these ideas and thoughts that I'll be sharing are also based on my experience doing research, interacting with said users, and observing some struggles that they experience when using collaboration tools. Let's start with enhancing AI collaboration for blind users. So a couple of challenges that blind users face, specifically while interacting with digital content: difficulty accessing visual content, such as images, charts, and graphs. The screen reader is largely what blind users depend on to understand what content is displayed, and accessing that can often be a struggle. Limited navigation and interaction with complex user interfaces because of the screen reader and keyboard navigation that they use. Reliance on screen readers for accessing textual content, which may not always provide a seamless experience.

And lastly, difficulty understanding non-verbal cues and visual layouts during virtual meetings, such as right now. So for sighted users, it's easy to pick up on non-verbal cues in a meeting, but blind users are only dependent on auditory feedback, making it not the best experience. So what are some ideas for the future? Apart from existing assistive technology using AI, what else can we think about? The first one could be context-aware navigation, using AI in a product to help the user navigate the product. Through auditory feedback, the AI can suggest easier ways for users to find a particular feature or button, or even suggest keys to use in keyboard navigation when the key sequence navigation is not working. The second is intelligent summarization.

So in the previous slides, I've talked about AI assistants summarizing meetings and chats. It's the same concept here: AI-powered summarization algorithms could analyze textual documents and generate concise summaries tailored to the preferences and priorities of blind users. That way, they can just go through key points and relevant information and review a bunch of documents more efficiently. And lastly, emotion recognition in video calls. So this, I believe, would be a real game changer for users who are blind and attending meetings. For sighted users, as I said, it's easy to grasp the non-verbal cues or gauge the mood of the meeting and what people are feeling. So AI algorithms capable of analyzing facial expressions and tone of voice could provide real-time feedback on the emotional state of a participant in a digital meeting, thus making it easier for a blind user to have better communication with them. Next, I'm gonna be talking about enhancing AI digital collaboration for neurodiverse users.

So just for reference, neurodiversity refers to the natural variation in neurological characteristics among individuals, encompassing a wide range of neurological conditions and differences. These differences may or may not be visibly apparent, depending on the specific condition and how it presents. So for example, conditions like autism, ADHD, and dyslexia are not visibly apparent, but users with these conditions would be neurodiverse users. So what are the challenges that they face? Again, I have had conversations with neurodivergent users through the research that I've done. And a couple of struggles that I have noticed they experience while using an app are: difficulty with sensory overload from excessive visual or auditory stimuli; challenges in maintaining focus and attention, particularly in a very fast-paced digital environment; and difficulty understanding non-verbal cues and social nuances during a virtual interaction, where even having any type of interaction during a meeting can be very difficult for such users.

And sensory sensitivities to certain colors, fonts, or interface designs that may cause discomfort or distraction. So what are some ideas for the future? How can AI benefit neurodiverse users using collaboration tools? First, personalized communication assistance. AI-driven communication tools could analyze a neurodivergent user's communication style and preferences and provide personalized assistance in real time. For example, the AI could offer suggestions for alternative phrasing to help neurodivergent users express themselves more effectively during a meeting. Second is social guidance. So AI-powered virtual assistants could offer social guidance to users with neurodiversity, helping them navigate and understand social cues during digital collaboration. The AI could provide feedback on conversation dynamics, suggestions for initiating and maintaining conversations, and support for interpreting non-verbal cues. And lastly, cognitive assistance.

I've observed how busy interfaces and documents, and even the UI design itself, can either comfort or stress a neurodivergent user. So AI-powered cognitive load management tools could monitor a user's cognitive load during digital collaboration and provide adaptive support to reduce that load. And lastly, upper body mobility disabilities: how can AI enhance these users' experience while using collaboration tools? Upper body mobility disability refers to limitations or impairments in the movement or function of the arms, hands, or upper torso. This can result from various conditions such as paralysis, muscular dystrophy, or injuries affecting the upper body. Due to the limited range of motion while using digital tools, these users largely depend on the usage of adaptive keyboards, voice commands, speech recognition software, eye gaze, and head tracking, to name a few.

A couple of challenges that these users face are difficulty using traditional input devices such as keyboards and mice due to limited dexterity and range of motion; challenges in operating touch screens and mobile devices, specifically for tasks requiring fine motor control; and dependence on assistive technologies such as specialized input devices, voice recognition software, or alternative pointing devices. So what are some ideas for how AI could assist these users in the future? First would be automatic detection of user preferences. So AI algorithms can analyze users' interactions with the collaboration tool to detect patterns and preferences. For example, it can identify whether a user with upper body mobility disabilities prefers voice commands over manual input, specific navigation patterns, and preferred color schemes for visibility.

The next one would be adaptive interface customization. So based on detected preferences, the AI can dynamically adjust the interface of the collaboration tool to better suit the user's needs. And lastly, voice-controlled collaboration. So since these users are heavily dependent on voice control, using AI algorithms to enhance voice control features in a collaboration tool will allow users to perform a wide range of actions, such as navigating the interface, editing a document, and managing tasks entirely through voice commands, reducing the reliance on manual input devices.

So those are a couple of ideas of how AI can help these disability profiles in the future. And now, in the interest of time, I'd quickly like to gloss over a couple of my learnings as a researcher conducting accessibility research. I've connected with multiple participants having different disability profiles, and it has taught me a lot about empathy and sensitivity. So even the way I approach my conversations with these users has to be with a lot of empathy and sensitivity, recognizing that each participant's experiences may be unique. So showing respect for their perspectives, experiences, and challenges is super important. Then, establishing trust. That is, building rapport with the participants and creating a comfortable environment where they feel safe to share their experiences openly.

Assuring them of confidentiality and expressing genuine interest in understanding their needs and perspectives. And lastly, flexibility and adaptability. So being flexible and adaptable in my approach to accommodate participants' individual needs and preferences, adjusting the style of my conversation with them or the interview's format, duration, or method of communication, is something that I've learned throughout my journey of conducting accessibility research with participants with disabilities.

And today, as I conclude my session, I'd like to leave us with one thought. When we design and build products with accessibility in mind, we don't just create better experiences for some. We create better experiences for all. By prioritizing accessibility, we consider diverse user needs and preferences, leading to more inclusive and user-friendly experiences for everyone. Ultimately, integrating accessibility features doesn't just widen access for specific user groups. It elevates the overall quality and usability of products for all users. So with that, I conclude my talk. Thank you so much for attending, and I hope to stay in touch with all of y'all. Thank you so much.