The Cost of Inaction: Who is Coding Our Future? by Cristina Mancini
CEOReviews
Who’s Coding Your Future? The Importance of Inclusion in AI Development
In this digital age, the rise of artificial intelligence (AI) is reshaping our world in profound ways. However, it’s crucial to examine who is coding the future—the developers, the data scientists, and the individuals behind the algorithms that drive AI systems. This article explores the necessity of inclusivity, diversity, and responsible practices in AI to ensure that it benefits everyone.
The Reality of AI: Moving Beyond Hype
Today's discussion isn't about the hype surrounding AI but rather its real-world implications. As we transition from basic chatbots to advanced agents, the question arises: Who determines the algorithms? This choice matters, as it shapes the outcomes and impact of technology on society. At organizations like Black Girls Code, we strive to equip learners not only with coding skills but with the confidence to lead and innovate in the tech space.
The Data Dilemma: Bias in and Bias Out
One pressing issue with AI is that it learns from whatever it is fed. As the saying goes, "garbage in, garbage out." If the training data is biased (skewed, incomplete, or stereotype-driven), the AI will perpetuate those biases. Here are some key considerations:
- **Bias exists in every aspect of identity**: Race, gender, nationality, body type, religion, ability, and more can all carry bias when they are misrepresented in, or left out of, the data.
- **Examples from the real world**: The case of the Liv chatbot highlights how biased coding can manifest in technology, leading to harmful stereotypes and misrepresentations.
- **AI and healthcare**: Instances of AI misdiagnosing patients, especially Black patients, have serious real-world consequences. When algorithms fail, lives are at stake.
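The "bias in, bias out" point can be made concrete. Below is a minimal, hypothetical sketch (the record fields and values are invented for illustration) of the kind of representation audit a team might run on training data before ever fitting a model:

```python
from collections import Counter

def audit_representation(records, field):
    """Share of each group in a dataset field. Gaps between these
    shares and the population the model will serve are a first
    warning sign of "bias in, bias out"."""
    counts = Counter(r.get(field, "unspecified") for r in records)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Invented training records, skewed 3-to-1 for illustration.
records = [
    {"role": "ceo", "gender": "male"},
    {"role": "ceo", "gender": "male"},
    {"role": "ceo", "gender": "male"},
    {"role": "ceo", "gender": "female"},
]

shares = audit_representation(records, "gender")  # {'male': 0.75, 'female': 0.25}
```

An audit like this does not fix anything by itself, but it makes skew visible early, while it is still cheap to correct.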
The Human Element in Coding: Why Diversity Matters
Who is behind the datasets and technologies? This is a vital question for anyone involved in AI development. When diversity is lacking in development teams, the outcomes reflect a narrow perspective. Consider these points:
- **Representation is essential**: A recent study revealed that 97% of AI-generated images of CEOs were white and male until the data was manually rebalanced.
- **Curiosity and courage**: Everyone has a superpower that transcends technical skills. Speak up when something feels off; it could make a difference.
- **Break the cycle**: If your team lacks diversity, you likely have a blind spot that could hinder innovation.
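The manual rebalancing mentioned above can be approximated in code. This is a hedged sketch with invented records, not a prescription: simple oversampling equalizes group counts, but it cannot add genuinely new perspectives to the data.

```python
import random
from collections import Counter

def rebalance_by_oversampling(records, field, seed=0):
    """Oversample under-represented groups (with replacement) until
    every group appears as often as the largest one. A blunt fix:
    it equalizes counts, not the variety or quality of examples."""
    rng = random.Random(seed)
    groups = {}
    for r in records:
        groups.setdefault(r[field], []).append(r)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Draw extra samples with replacement to reach the target count.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

# Hypothetical image-caption records, skewed 3-to-1.
records = [
    {"caption": "ceo portrait", "gender": "male"},
    {"caption": "ceo portrait", "gender": "male"},
    {"caption": "ceo portrait", "gender": "male"},
    {"caption": "ceo portrait", "gender": "female"},
]

balanced = rebalance_by_oversampling(records, "gender")
counts = Counter(r["gender"] for r in balanced)  # 3 of each group
```

Techniques like this are a stopgap; the durable fix is collecting data that represents everyone in the first place.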
Building a Better AI Ecosystem: Strategies for Change
At Black Girls Code, we are dedicated to fostering an inclusive environment in tech. Here are some strategies for individuals and organizations to promote diversity and inclusion in AI:
- **Question biases in data**: Audit your hiring pipeline and datasets to understand who is represented and how this affects the algorithms you create.
- **Encourage diverse teams**: Build tech teams that include perspectives from all walks of life to create more robust, less biased systems.
- **Engage in community partnerships**: Consider donating, volunteering, or sponsoring tech cohorts to broaden the reach of initiatives like Black Girls Code.
- **Speak up for ethics**: When encountering bias in AI, do not accept "that's how the model works" as a valid reason. Advocate for fairness and transparency.
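Speaking up for ethics is easier with numbers in hand. As a hypothetical sketch (the field names and data are invented), comparing positive-outcome rates across groups is one simple way to turn "something feels off" into a concrete question for the team:

```python
def outcome_rates(decisions, group_key, outcome_key):
    """Positive-outcome rate per group. A large gap between groups of
    otherwise similar applicants is a signal to investigate further,
    not a verdict on its own."""
    totals, positives = {}, {}
    for d in decisions:
        g = d[group_key]
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + int(bool(d[outcome_key]))
    return {g: positives[g] / totals[g] for g in totals}

# Invented loan decisions for illustration.
decisions = [
    {"group": "a", "approved": True},
    {"group": "a", "approved": True},
    {"group": "b", "approved": True},
    {"group": "b", "approved": False},
]

rates = outcome_rates(decisions, "group", "approved")  # {'a': 1.0, 'b': 0.5}
```

A disparity like the one above does not prove discrimination on its own, but it is exactly the kind of evidence that makes "that's just how the model works" an unacceptable answer.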
Concluding Thoughts: Inclusion as a Strategy, Not an Afterthought
Inclusion isn't just a noble goal; it is essential for creating responsible AI that serves everyone. As the technology accelerates, so must our commitment to ethical practices and equitable representation. Let’s build a tech future we can all trust and benefit from.
Join the movement and support organizations that aim to build an ecosystem where all voices matter. By doing so, we can ensure that AI is genuinely reflective of the diverse society we live in.
Thank you for reading!
Video Transcription
Today, I actually wanna talk about reality. I don't wanna talk about the hype around AI. Today, I wanna talk about who's shaping it and how we can ensure that it includes all of us. Because when I ask who's coding your future, that's not rhetorical. That's real. AI is rewriting the rules, and with the acceleration of moving from chatbots who suggest to agents who act, who determines the algorithm deeply matters. You can play the video, Christy. It's on mute. Oh, unfortunately, it looks like there's no sound. Okay. Well, you can see in the video, at least, that at Black Girls Code, we have a mission. And our mission is to equip our learners with the tools to build careers with staying power.
And with the state of the world as it is today, as you can imagine, it's critical that we go beyond teaching code, that we are focused on building confidence, that we are focused on helping them have careers with staying power. That means that at Black Girls Code, we teach tech, but we also teach agency. We are focused on preparing the next generation of technologists not just to use AI tools, but to build them, to question them, to lead with them. Because I'm not here to say what some are saying, which is, you know, don't use AI. I think everyone on this call knows that genie is out of the bottle. Because the reality is that the future isn't agent versus human. The future is humans who are gonna be elevated by technologies that they can trust and understand. So with that, let's go to slide three. Bias in and bias out.
Let's talk about the elephant in the room, in the server room, which is bias. AI doesn't just make mistakes. It makes predictions based on what it's fed. And with 80% of the model being determined by the data that it is built on, if that data is skewed, incomplete, or built off of stereotypes, unfortunately, so are the outcomes. I think we're gonna try to drop a link in the chat. I wrote an op-ed a while back with Essence on Liv, Liv the chatbot. Did any of you hear about this scandal? You can answer in the chat if you have. No? You guys, maybe it's because I make it my business to know about this. Liv was a chatbot. Liv was a Black, queer, mother of two, truth teller. But it turned out that Liv was none of these things.
Liv was a chatbot that was developed by a team of white men. And so you ask yourself, what could possibly go wrong with that? Well, as you can imagine, all sorts of stereotypes were coded into this chatbot. She not only loved to spill the tea and loved chicken and collard greens, she kept instigating the journalists to go after these developers, because how dare they. It turned out that Liv was in fact what most of these technologies are, which is a code-switching, rage-baiting chatbot whose sole purpose, as with all tech platforms, is to drive engagement. The scary thing about this is that the journalist asked it how it knew that she was Black, and you might enjoy reading the whole article to find out how. But based on your titles, I'm sure that you know that we are well past ticking race and gender boxes. But that was last year.
What does ChatGPT think that we sound like today? With that, we're gonna click into Dr. Avriel Epps' Instagram page. So you can see that this is what ChatGPT thinks a Black person sounds like. Dr. Avriel Epps is amazing. This is what ChatGPT thinks a Latino American sounds like. Next: an Asian American. Next: a white American. Next: a Black woman. And with that, Dr. Avriel Epps, we are really proud to say, is our AI expert in residence at Black Girls Code. We want to make sure that as we train this next generation of technologists, we are not chasing hype cycles; rather, we are building intentionally. Avriel will help us expand our AI curriculum with care and with clarity and ensure that our students understand the power that's behind the tools and the responsibility that comes with them.
And equally important, something that we are doing this summer: we are making sure that they learn to use AI in ways that will expand their thinking, not shrink it. And we will also teach parents how to have conversations with their children about AI. The next slide is, again, why who is coding our future matters. Why does who is inputting the datasets matter? Well, I don't know how many sports fans there are out there, but a couple of years ago, Klay Thompson had his worst, not his best, night ever as a Warriors player. He kept missing the basket. He could not get anything right, and I think he may have actually been traded either the very next day or the day after. But unfortunately, in addition to that, he woke up to some disturbing news.
Grok had put out an article saying that in a weird twist of fate, Klay had gone driving around Sacramento and was throwing bricks through people's windows. Luckily, nobody was injured, but the police were investigating. Obviously, that didn't happen. What obviously happened is that whoever was the developer for Grok maybe is not a huge basketball fan, and this is why context matters. Blondes are dumb. That's not just an outdated joke. It's an example of a stereotype that's quietly being encoded into global AI systems. I think Margaret Mitchell and the BigScience project put out a study that shows how generative AI trained primarily in English is now spreading Western biases across languages and cultures. And the dumb-blonde trope, rooted in American sexism and media, unfortunately, is an example of a stereotype that is also showing up in languages where it never existed.
And so when you think about the article that Bill Gates just put out, where he said that he believes AI will replace doctors and teachers in the next decade, again, I ask: who's building these systems? Who is coding our future? Because in health care, AI chatbots are already repeating racist myths about Black patients. They're misjudging pain levels, kidney function, and even skin tone. And, unfortunately, in this case, when algorithms misdiagnose, people don't just lose trust; they lose lives. And I think, you know, I'm obviously a Black woman. I'm the CEO of Black Girls Code. And so you might think that I came here today just to talk about bias against Black people, but bias affects all of us. Bias does not stop at race.
We all know by now, or we should, that it shows up in gender, nationality, body type, religion, and how differently abled you are. Unfortunately, it's every identity that's been turned into a punch line or left out altogether. And as AI systems accelerate, so does the spread of bias. Again, who is inputting these datasets deeply matters. There was a test recently where researchers asked an AI to generate an image of a CEO, and 97% of the results were white and male. That did not change until they manually rebalanced the data. But now I'm nervous. See if you can play this one. See if you can hear it. This is one of my favorite ads, from SoFi. No? Go to YouTube and check out this advertisement.
You can just let it run in the background. SoFi asked, who's good with money? These are the initial results that came out. It's really great with the music, but this is a great example of how, with intentionality, you can change what the outcomes are. And, of course, no big surprise that our portfolios outperform Mint. So where was all this data when the initial outputs came out? So the good news is that you can, with intentionality, go back in and fix some of these mistakes. However, if you recall the Liv story, the systems actually already know who you are. So when you think about how this is going to affect someone applying for a loan or buying a house, or what their interest rate is, a study showed just how biased these systems were on these topics.
And when they removed race, that was a great outcome: both applicants received the same rate. But what if we built these correctly from the beginning? And, again, the thing that keeps me up at night is, back to Liv, the systems actually already know who we are. So how do we solve for that? How do we make sure that we understand that innovation without guardrails isn't actually innovation? It's acceleration without direction, and inclusion, which means all of us, has to scale just as fast as the tech does. So, you know, this is why people deeply matter. I'm not sure if you're following just how quickly companies are rushing to dismiss whole teams only to have to hire them back, which is kinda funny, because, you know, most organizations are simply not tech companies. And, unfortunately, everyone is caught up in this hype cycle of not being left behind.
And in so doing, they're building technologies that, in the best case, are only going to hurt their businesses. But in the worst case, these technologies can actually harm people. So what's your superpower? Everyone in this room has one, and it's not just a technical skill. It's curiosity. It's courage. It's choosing to speak up when something feels off. That's the advantage that diverse teams provide: teams that include all of us, because we all see through different lenses. And this was my ask of you at the beginning. Not just as an audience, but as leaders: speak up in the room. If your AI team all looks the same, if your development team all looks the same, then your outcomes probably will too.
This means that you don't have an innovation problem so much as you have a blind spot. And when you look back and think about your own team, consider if there's a voice missing. Audit your own hiring pipeline beyond the technologists. Again, this is why our focus at Black Girls Code isn't just teaching code. I believe that regardless of what you do, you should have a baseline understanding of these technologies. Because if you have the money to invest in a new platform, why not ask what the development team looks like? If you are managing a development team, you can ask them what datasets determined the algorithm.
If you're on a board, you can drive all sorts of change. But are you thinking about the right questions to ask, so that you can be mindful, so that you can ensure that the organization whose board you sit on does better? And, of course, if you're in policy, if you're in government: I think we've seen just too often what it looks like when the people in government do not understand these technologies and let people get away with the most basic of answers when asked what these technologies are doing and how we are getting these outcomes.
Audit the data before you build the model. Don't fix it later. Something that is really helpful, obviously, is checkpoints. You can also ask the platform, when it makes decisions, why it made those decisions, so that you can make it easier to fix. And if you see something that is biased, say something. Don't let "well, that's just how the model works" be the end of the conversation. Your voice could be the intervention. So with that, that pretty much wraps up my time with you. Black Girls Code isn't doing this by ourselves. We're building an ecosystem, and we need partners. Here's how you can plug in: you can donate, you can volunteer, you can sponsor a cohort. We also welcome any invitation to build with you, because the work that we're doing isn't charity; it's strategy.
I think we all understand at this point that inclusion isn't a nice-to-have in AI. It's how we can make sure that we future-proof it. So with that, I just wanna say thank you for sharing this time.