Algorithmic Diversity: Zero Exclusion, AI & Ethics (Women in AI)

Automatic Summary

Algorithmic Diversity: Bridging The Exclusion Gap In AI And Ethics

Welcome to our discussion on a rather overlooked but very pertinent topic: Algorithmic Diversity. I am Yona Wilker, an advocate for change in the tech world, with a strong focus on neurodiversity and gender inclusivity. My work spans several levels, with the ultimate goal of democratizing our tech ecosystem and eliminating bias in technological solutions.

Today's Challenge: A Clear Lack of Diversity in Tech Ecosystems

As we delve into algorithmic diversity, it is important to highlight the primary problem that has afflicted our tech ecosystems for years. Many systems were originally designed by individuals representing a single perspective, often male. Minorities, women, disabled people, and neurodiverse people were then expected to fit into this monopoly of design thinking, even though the systems were inherently unsuitable for them.

Embracing a Diverse Ecosystem

Significant strides have been made. Women, though still underrepresented, have become significant drivers of innovation in accessibility. This is where I align with Women in AI initiatives, whose aim is to encourage the creation of tech solutions for women, by women. How? By showcasing their proficiency in coding and in creating AI solutions, apps, platforms, and more.

Accessible Technology: The Story So Far

Highlighting the positive, accessibility has improved over time, with big tech companies such as Microsoft, Amazon, and Google launching accessibility and disability-focused programs. Schools, too, have fostered a more inclusive environment by accommodating learning disabilities such as dyslexia and ADHD, while tech companies are testing the waters with neurodiverse hiring platforms.

Raising the Alarm

Nevertheless, the representation statistics are still dismal. A shocking 90% of people with autism in the US and UK are not employed full-time, and only one in ten people with disabilities has access to inclusive technology. Despite the availability of incredible technology, including smart glasses, emotion recognition, and computer vision, its market penetration remains low.

A further looming problem relates to the ethical framework behind this technology: the prevalence of algorithmic bias, the lack of transparency and accountability, and the omission of a social science perspective from technology.

As we invest in technology and policies, we must ask the right questions, enforce accountability, develop more comprehensive ethical guidelines, and correct the lack of representation in the tech field - the root source of bias.

Bridging the Representation Gap

This calls for a focus shift in the tech world. We must prioritize a broad range of studies beyond traditional tech fields. Gender studies, African studies, Asian studies – all these represent perspectives that are fundamental to solving the bias problem in technology design. As the saying goes, "Nothing about us without us". Therefore, the presence of professionals centred on these studies in technology teams prompts the creation of well-informed, inclusive criteria and definitions. To eradicate bias, representation matters.

Creating a Framework Around Human Rights

As we grapple with bias, it is crucial to remember that this is a social problem that affects everyone. Technology, now more than ever, should reflect our social fabric and respect human rights. For this reason, I encourage schools and universities to teach ethics alongside technology-related subjects. The lack of a strong ethical understanding among technology creators often leaves the most vulnerable among us (children, women, the elderly) at higher risk of harm from negative uses of technology.

In conclusion, we need to steer away from the "one vision, one man, one type of design thinking" approach that permeates most of the ecosystem. Instead, we need to foster dialogue and collaboration that appreciates everyone's unique perspective. From hackathons in Africa, Asia, and the Middle East to teaching coding in schools, the more diversity we embrace, the closer we get to bias-free technology.

Feel free to connect with me on social media, or visit my platform - yona.org, a foundation focused on the tech-driven future of disability and the neurodiversity movement. Let's do more to create a more accessible, inclusive world through technology.


Video Transcription

I would love to welcome all the participants to this session, called Algorithmic Diversity: Zero Exclusion, AI and Ethics. My name is Yona Wilker, and my work is focused on the future of ability, neurodiversity, accessibility, and gender. My work can be divided into several levels.

On one hand, I try to connect the dots across technology companies. I serve as an evaluator and judge for projects funded by the European Commission, and I also curate portfolios of early-stage companies focused on the future of learning, well-being, accessibility, and gender.

At the same time, I try to solve the issues and challenges behind this technology, so I focus on the guidelines, frameworks, ethics, and policies related to human rights, disability, women, and children. My goal is also to democratize our movement and involve more people in this ecosystem, so I work on hackathons driven by MIT, and this year I had the opportunity to collaborate with Women in AI: we launched and presented the Zero Exclusion hackathon, which involved people around the world, across the United States, Europe, Asia, and Africa.

As I mentioned, today's session is called Algorithmic Diversity. This concept covers both solutions to algorithmic bias and the diverse spectrum of technology solutions related to accessibility, disability, gender, and neurodiversity. Unfortunately, until today most ecosystems were designed by one vision, one man, in most cases one male, and afterward we simply tried to include others (women, neurodiverse people, disabled people, ethnic minorities) in this monopoly of design thinking. Even though these systems, these institutions, classrooms, and workplaces did not fit them, we still tried to add them to an ecosystem that was completely not made for them in terms of its nature, logic, mechanisms, and definitions.

Fortunately, in recent years, though women are still underrepresented, women have become a driver of innovation in the accessibility and disability movements, and many innovators are helping us come up with new projects and new ventures. One of the reasons I joined Women in AI is to demonstrate how to create not only for women but also by women. We try to show how girls and women, who can sometimes create much better than their male peers, are able to code, engineer, and build robots, AI solutions, platforms, and apps. So in today's session I would love to share how women are able to join this movement: how we are able not only to create technology and make our world more accessible for everyone, but also to make our technology less biased and more diverse, to come up with more solutions to the problems of representation, and to fix the lack of connection between social science (gender studies, African and Asian studies) and technology. First of all, when we talk about accessibility, we can often share some good news, because Microsoft and Amazon have introduced their accessibility and disability programs.

Google, with Google Glass, provided a platform and technology solution for startups focused on emotion recognition, and tech companies are trying neurodiverse hiring platforms that help bring more people with learning disabilities, dyslexia, or autism into technology companies.

At the same time, schools are experimenting with training solutions for dyslexia, ADHD, and other conditions. Recently I had the opportunity to work with such a company from Denmark, and we tried to roll its solution out not only across the Nordic countries but beyond. Unfortunately, the statistics on representation and employment are still pretty sad: 90% of people with autism in the United States and the UK are not employed full-time, and only one in ten people with disabilities actually has access to inclusive technology. We have so many wonderful robots, smart glasses, emotion recognition, and computer vision, technology that could be used by nurses, doctors, and educators, yet it is still not very accessible and has still not really penetrated the actual market. And unfortunately we have another problem: the ethics and the frameworks behind this technology. That includes algorithmic bias, lack of transparency, lack of accountability, lack of a double-check principle, and the lack of social science behind technology.

This leads to technical fixes and a lack of privacy, and we still have no policy on autonomous agents. So how are we able to solve it? I would love to share our recent lessons, with the European Commission, with MIT, and with Women in AI, across both technology and policy-making efforts. First of all, I believe the key reason we are not able to make technology less biased today is representation. Very often we say it is a problem of algorithms, a problem of technology, but the actual problem is representation. We have only 10% women in data science and AI teams, and just a tiny fraction of a percent, around 0.4 or 0.5%, of people with disabilities and neurodiversity, and we still have stigma toward mental health disorders and many other conditions. There is a phrase: nothing about us without us. How do you plan to create technology for women without women in technology, and how do you plan to create solutions for disabled people without disabled people? As was honestly said at the World Economic Forum, AI can be a solution for any problem in the world.

But at the same time it can be the source of apocalypse, depending on who asks these questions and how. Until we have people from across the spectrum, of every ethnicity, race, and gender, we are not able to ask the correct questions or to build truly diverse micro-teams in research and other departments. In the end we simply fail: we fail to ask the questions needed to build the MVPs of products, and all of our effort is useless. Another problem, and I know it specifically from my personal journey (my first education was in liberal arts and history, I was always passionate about social science, yet I have spent most of my life in the technology world), is the dramatic disconnection between technology and social science. In most cases, when we talk about a technology team, we only think about designers and engineers. But how do you plan to solve the problems of gender studies, African studies, Asian studies?

How do you plan to come up with criteria and definitions for race, ethnicity, and neurodiversity if you have no professionals focused on these studies? In most cases we have created not only gender and ethnicity vacuums and filter bubbles in our teams, but also filter bubbles in terms of fields of study: we have no social science professionals in technology teams, no one who could help come up with the correct criteria. To be sure, the field has produced one innovation here: the ethics professional, a kind of advocate for more ethical technology in residence. But in most cases, first of all, startups have no money to hire such people, and second, it does not build any kind of actual ecosystem; it is just one person trying to push other people, and it does not help. In most cases the actual solution lies in a much bigger spectrum: building ethical ecosystems driven by bioethics and human rights, where you have an accessible moral vocabulary, an understanding of gender studies, African studies, ethnicity studies, and so on, and where everyone has awareness of this problem.

In most cases you are simply not able to incorporate an ethics person into a non-ethical team; you are not able to solve Facebook's problems just by integrating an ethics professional. And you know what happened at Google: they just fired these people, because there was no moral ethics at the core of the company. You are not able to solve a social problem with technical fixes; it can only be solved by completely reshaping the knowledge, the background, and the representation of the people on teams. Another problem I would love to mention: very often when we talk about technology and bias, specifically in fields like nursing, medicine, healthcare, well-being, education, schools, classrooms, and workplaces, we typically treat AI algorithms or technology as the subject of the law.

We say the technology is biased, the robot is biased. But that is technically impossible: there is no biased robot, there are biased people and biased teams behind it, actual people. The actual subjects of the law are citizens and individuals, teams, corporations, and companies. So although films like Coded Bias have done amazing work in terms of awareness, we should go further in terms of accountability: who is actually accountable for facial recognition bias when we have a 30% error rate for women or people of color? Who is actually responsible, and who is unaccountable?
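The accountability question above presumes the disparity can be measured in the first place. As a minimal, hypothetical sketch (the group names and audit records below are invented for illustration, not drawn from any real system), per-group error rates and the gap between the best- and worst-served groups could be computed like this:

```python
from collections import defaultdict

def per_group_error_rates(records):
    """Compute the misclassification rate for each demographic group.

    records: iterable of (group, y_true, y_pred) tuples.
    Returns a dict mapping each group to its error rate.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, y_true, y_pred in records:
        totals[group] += 1
        if y_true != y_pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: (group, true label, model prediction).
audit = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 1), ("group_b", 1, 0),
]

rates = per_group_error_rates(audit)
# The disparity is the spread between the worst- and best-served groups;
# a large value signals the kind of bias the talk describes.
disparity = max(rates.values()) - min(rates.values())
```

In this toy audit, group_a is classified perfectly while group_b sees half of its samples misclassified, exactly the kind of gap that raises the question of who is accountable for it.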

That is why we not only build ethical ecosystems, agents of ethics, and an accessible moral vocabulary, but also try to understand who is accountable both for actions and for omissions. An omission is a non-action: a situation where you need to act or do something but you do absolutely nothing. So we define all of these agents of responsibility for particular levels of technology research and implementation, and for the evaluation of both negative and positive impact. And we must not only define this understanding of accountability but also put it into policies. Just recently the European Commission and the European Union came up with an updated vision of policies on computer vision, facial recognition, security cameras, and other technologies, but it is still a yes-or-no matter, banned or not banned. It offers no deeper understanding of why a system fails, why it does not work properly, why these companies developed it the way they did, and who is responsible: the CEO, the designers, engineers, and researchers, or the whole biased ecosystem? This problem is much bigger.

And finally, something specifically related to my work, because I recently created an educational program focused on human-centric AI and ethics, funded by the European Commission, created in collaboration with ISDI Business School in Spain, and published on the SHOP4CF platform.

It is the lack of access to ethical guidelines. We have many policymakers, experts, and practitioners who develop frameworks for ethical development and robot ethics, and many bodies like the Alan Turing Institute and the Montreal AI Ethics Institute. But unfortunately, most technology is created by startups and founders, during hackathons or on university campuses, and in today's world ethics is not studied in schools or on university campuses. If we organize hackathons at schools, or if we teach engineering or coding in schools, we should teach ethics at the same time. Any attempt to implement ethics through a few isolated professionals, rather than democratizing it, simply does not work. And another problem is the lack of niche ethical guidelines. My key focus is accessibility, neurodiversity, and disability.

In most cases, the people who are the most vulnerable, those most affected both by society and by the negative consequences of technology use, are women, children, young people, the elderly, and so on. And unfortunately, until today we had no frameworks and guidelines for this field. For instance, only in 2020 did UNICEF come up with a framework called AI and Children; before August 2020 we had no frameworks for this field at all. We built smart toys, and teenagers and young people were the main consumers of content on platforms like YouTube and Facebook and many other kinds of solutions, but we had no frameworks or guidelines that actually regulated how we use technology in this field. We have something similar in general for disability, and that is why autistic people have become victims of cyberbullying and other negative uses of technology.

And so on. That is why, for instance, when I worked on human-centric AI and ethics, I ultimately opened a huge discussion about human rights. Because in most cases, when we talk about bias, it is not a technology problem, it is a social problem. It is a huge social problem, because we have simply stopped taking human rights into account. We only take into account corporate interests, profits, and how some CEO can turn a company into a unicorn. All the news, all the magazines are filled with founders, entrepreneurs, successful people, but we do not talk about how many people actually become victims of this technology.

How many women become victims of sexual harassment in the workplace, or are harassed by technology or through technology? Until we have all of this discussion, nothing will actually change. And I have really spent a lot of effort trying to change it for good, because I am a disabled person and a non-binary person, and in recent years women have been my key allies in this movement, trying to fix representation and to reshape and dismantle colonial technology, the colonial vertical of power driven in most cases by white males who use it for profit and for their own greed, not for society.

That is why, as I mentioned before, until today all of these ecosystems were designed by one vision, one man, one type of design thinking, without any kind of competition or discussion. Maybe the whole road is wrong; maybe we need a different kind of agile approach, of design thinking, of approach to research, or of criteria behind technology. But happily, we now have more opportunities to create dialogue and collaboration, which is why I use almost every opportunity during my talks to say: even though we have a digital divide, you currently have a unique opportunity to participate. I organize hackathons across Africa, Asia, and the Middle East, and in most cases the most talented people come, as immigrants, from emerging countries: from India, from China, from Saudi Arabia, and many other countries. They participate in our hackathons completely remotely, and very often they become winners. Just one recent example: our startup AI4NI, focused on autoimmune disorders and Crohn's disease, became a winner of our hackathon in the United States, and before that my mentees became winners of an OpenAI hackathon and an MIT hackathon. So though the digital divide exists as an objective thing, we have more opportunities. I welcome all of you to participate in this accessibility movement.

Now we are able to make a difference. We are able to build AI, robots, apps, and platforms. We are able to bring all of our voices, of women, LGBT people, and people of color, into this discussion to make our wishes heard. So please, let's do it; you are welcome to this movement. Thank you so much for your time. This was Yona Wilker. I am always happy to connect on social media; you can find me at yona.ai or yona.org, a foundation focused on the future of the disability and neurodiversity movement in technology. Thank you so much and have a wonderful day.