International Women’s Day 2026: Balance the Scales

Automatic Summary

Empowering Women in Tech: Balancing the Scales in Leadership

The tech landscape is rapidly evolving, and as we anticipate the future, it's vital to recognize the importance of balanced representation, especially at the leadership level. A recent panel discussion featuring prominent figures in technology explored the roles women play in transforming the industry and the challenges that continue to persist. Let's delve into their insights on balancing the scales in leadership, navigating systemic biases in AI, and empowering the next generation of women in tech.

Insights from Industry Leaders

  • Pooja Varshneya, a senior engineering leader at Lyft, emphasized the importance of navigating the tensions that come with leadership, such as speed versus quality and innovation versus responsibility. She believes that diverse perspectives lead to better outcomes, highlighting the responsibility leaders have to ensure equitable decision-making.
  • Liz Moyle shared her experience in making sure all voices are heard in the decision-making processes to create fairness in leadership roles. Liz pointed out that the emotional burden often falls on women, and balancing the scales involves distributing responsibilities equitably across teams.
  • Sarah Seager, a senior director at EPAM, stressed the significance of intentional sponsorship—amplifying the voices and ideas of those closest to the problems faced in tech. She highlighted the need for leaders to create spaces where diverse ideas are welcomed and valued.
  • Sowjanya Pandruju, a senior staff engineer in architecture, discussed the implications of AI biases, emphasizing that bias often enters the AI lifecycle at the data and problem-framing stage. She believes that diverse perspectives are critical during the design phase to avoid perpetuating existing inequalities.

Navigating Bias in AI

As technology increasingly permeates our daily lives, addressing bias in AI systems is essential. Panelists agreed that bias can enter at multiple points in the AI lifecycle:

  • Data: If the data used for training AI reflects existing inequalities, biased outcomes are likely to follow.
  • Model Design: The way problems are framed and success is measured can unintentionally reinforce bias.
  • Deployment: Even well-designed models can behave differently in real-world situations, leading to unintended consequences.

To combat these issues, leaders must adopt a lifecycle approach that thoughtfully considers how data is utilized and continuously monitors model behavior post-deployment.
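A lifecycle approach implies concrete post-deployment checks. As one illustration (not something the panel presented verbatim), here is a minimal Python sketch that compares selection rates across demographic groups and computes a disparate-impact ratio; the group labels, decisions, and the 0.8 rule-of-thumb threshold are all illustrative assumptions.

```python
# Toy fairness check: compare a model's selection rates across groups.
# Group labels and decisions below are invented for illustration.

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> selection rate per group."""
    totals, positives = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + int(selected)
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate.
    A common rule of thumb flags ratios below 0.8 for human review."""
    return min(rates.values()) / max(rates.values())

decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
rates = selection_rates(decisions)
print(rates)                          # {'group_a': 0.75, 'group_b': 0.25}
print(disparate_impact_ratio(rates))  # 0.333..., well under 0.8: flag it
```

Run on a schedule against live predictions, a check like this turns "monitor model behavior" from a slogan into an alert.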

Leading with Intent and Integrity

Leadership in tech isn't just about integrating new technologies; it's about fostering inclusivity and empowering others. The panel shared a collective belief that:

  • Women must advocate for themselves and support one another, building strong networks and alliances.
  • Fostering an inclusive environment requires tangible actions from organizations, such as setting measurable goals and holding leaders accountable for progress.
  • Leaders should model behavior that prioritizes mental well-being, fostering a sustainable workplace that encourages breaks and emotional connection, ultimately driving productivity.

Call to Action: Building a Foundation for Inclusivity

To construct an inclusive tech ecosystem where diversity and equality are foundational, organizations and individuals must commit to the following:

  1. Prioritize diversity in hiring practices at all levels.
  2. Create and maintain avenues for mentorship and sponsorship, especially for women in mid-level positions aspiring to rise to leadership.
  3. Utilize data-driven insights to inform strategies and decision-making while remaining conscious of bias.
  4. Encourage open discussions on emotional labor and wellness within tech teams and foster an environment where team members feel valued and heard.

The commitment to balancing the scales in tech leadership isn't symbolic; it demands operational choices that directly impact who gets promoted, funded, or heard. As we continue this journey toward a more inclusive future in technology, let us take actionable steps today to make meaningful changes.

Let’s continue the conversation around empowerment and transformation in technology. Share your thoughts and actions toward inclusivity in the comments below!


Video Transcription

Architecture, focusing on building scalable and intelligent enterprise systems. As a senior staff engineer in architecture, she has led complex migrations from on-premise systems to AWS, delivering significant cost optimization and operational efficiency. She is passionate about responsible AI, inclusive innovation, and designing technology that balances technical excellence with long-term societal impact. And we also have Sarah Seager. Hi, Sarah. 

Hey. 

She's a senior director of business development at EPAM. She has over twenty-five years of experience in the healthcare and life sciences industry, specializing in real-world data and evidence generation to improve patient outcomes worldwide. Currently a senior director at EPAM Systems, she leads strategy and partnerships across global RWE initiatives and has built and managed diverse, high-performing data teams throughout her career. A passionate advocate for women in tech and data, Sarah serves as global lead for Women at EPAM and president of the Healthcare Businesswomen's Association, London chapter. And last but not least, we have Pooja Varshneya. She is a senior engineering leader at Lyft. Hi, Pooja. She is a technology leader with over twenty years of experience building large-scale technology platforms for Fortune 500 companies and major Wall Street investment banks. 

She currently serves as a senior engineering manager at Lyft, where she leads global engineering teams responsible for building Lyft Maps, the navigation system that powers the company's rideshare platform and enables millions of rides every day. Throughout her career, Pooja has focused on designing resilient, high-scale systems that sit at the core of real-world products used by millions of people every day. Beyond her technical leadership, she's passionate about increasing representation of women in STEM fields. She actively invests in mentoring and coaching engineers and emerging leaders, helping them navigate career growth and leadership opportunities in the tech ecosystem. So believe it or not, those were the short introductions. If you go over to our event page, there's a lot more to learn about them, and they have their LinkedIn links over there so you can connect with them. 

Okay. Cheryl, can you hear us? Can we hear you? No, I'm sorry, Cheryl. We can't hear you. Well, we're going to get started. We have a backup for Cheryl, but you can still watch her participate. So we've got a quick question for everybody to start: what does balancing the scales mean in your leadership today? Let's start with Pooja. 

Hi. Balancing the scales in leadership right now is about juggling things. Leaders are constantly managing tension between speed and quality, innovation and responsibility, and business outcomes and human impact. The role of leadership is not to eliminate these tensions, but to navigate them thoughtfully. There's a lot going on; it's a big change happening right now. Layoffs are happening. People are doing a lot more with less. So there's a lot on leaders to ensure well-being is being counted, all voices are being heard, and we are being equitable in decision-making. For me, it also means being intentional about whose voices shape these decisions. Diverse perspectives lead to better systems, better products, and ultimately better outcomes for the people we serve. 

So especially in this field, in technology, where our platforms are affecting millions of lives, leaders have a responsibility to ensure those systems are designed with fairness, accountability, and long-term impact in mind. It's about creating an environment where innovation can thrive, people can grow, and the systems we build are both responsible and scalable. 

Agreed. Agreed. Great. Liz, what are your thoughts on what balancing the scales means in your leadership style today? 

You know, it's not some lofty abstract concept. It's really about bringing sanity and fairness to all situations. And similar to what Pooja was saying, it's making sure all voices are heard, not necessarily the loudest one in the corner. It's acknowledging the emotional heavy lifting that, especially on the female front, tends to fall on our shoulders. It's one of those unheard tasks that I always talk about. And it's making sure we're proactive with problem solving so it doesn't always just fall on women's shoulders. These things are rarely in formal job descriptions, but they always end up being essential in our everyday roles, and we rarely get credit for them. 

So balancing the scales is not necessarily just making, you know, an automatic decision to have the mentor be the female in the room or the party planner be the female in the room. It's spreading the chaos across everybody. You know, we are 49.6% of the world's, population. So let's spread the wealth. That is what balancing the scales means for me. 

Oh, Liz. That was great. Sarah, your perspective, please. 

Yeah. So probably very similar; I think we'll all have very similar responses, to be honest. For me, balancing the scales means being intentional and purposeful. It's really about voice in the decisions and conversations we're having, specifically around what's shaping technology, what's shaping data, what's shaping analytics. With my own leadership style today, balancing the scales shows up in numerous ways. It's about who I mentor and whose ideas I amplify in meetings and in conversations. So it's about being an effective sponsor, not just a mentor. But it's also about making sure that the people closest to the problems are part of shaping the solution to whatever the conversation or problem may be. 

So again, it's essentially about voice, and about supporting and sponsoring those voices. 

Agreed, Sarah. Agreed. Sowjanya, your thoughts, please? 

Sure. For me, balancing the scales means being intentional about who gets to shape technology, not just who builds it. As someone working in cloud architecture and AI systems, I can see how decisions made by a very small group of people can influence products that affect millions of users. So in my leadership, balancing the scales means bringing diverse perspectives into technical conversations early, especially when we are designing systems that can automate decisions. Asking who might be unintentionally excluded or impacted by a system is the core. And I believe that when more voices are involved in designing technology, the outcomes tend to be more thoughtful, more responsible, and ultimately more impactful. 

Absolutely agree. Cheryl, we're gonna try to see if we can hear you. Oh, I'm sure she's got something profound to say. We can't hear her. Oh, I'm sorry, Cheryl. We're gonna have to go to the next question. But, you know, I agree with all of you and everything you said about balancing the scales in terms of your leadership style. As one of those groups, shall we say, that is fighting for parity and recognition of all the things that we deliver, I think we take it very seriously to have all voices heard. So thank you for your thoughts on that. All right, we're gonna move to our next question. We're gonna talk about inclusive innovation. So the first question is: where does bias most often enter the AI life cycle? 

Is it data, model design, or deployment? And I think we're gonna have Sowjanya answer this question. 

Sure. Thank you for the question, Lori. In my experience, bias most often enters the AI life cycle much earlier than people expect, usually at the data and problem-framing stage. AI models are essentially pattern learners; they learn from historical data. And if that data reflects existing inequalities, missing populations, or incomplete representation, the model will simply learn and scale those patterns. Imagine a hiring recommendation system trained on historical data, say the hiring data from the past ten years. If that organization historically hired fewer women or underrepresented groups for technical roles, the model may learn patterns that unintentionally reinforce those same outcomes. So the model might be mathematically correct, but the underlying data reflects a biased reality. 

And I believe that bias can also appear in how we define the problem, because sometimes the way we measure success can unintentionally create unequal outcomes. Say we're optimizing purely for accuracy or conversion rates; that may overlook fairness across different demographic groups. And we can see bias emerge even during deployment. I think this is where things become really interesting, because real-world environments are very dynamic. Data distributions shift, user behaviors evolve, and this is when the feedback loops begin to form. So a model that appears fair during testing can behave very differently once it interacts with real users at scale. 

Yeah. 

So we just talked about a recommendation system, right? Let's use that example. When an algorithm begins recommending certain types of content more frequently, users interact with that content more, and that reinforces the system's belief that this is what users prefer. So over time, this feedback loop unintentionally amplifies the bias. 

Yes. So 

Because of this, I often say that bias doesn't show up in just one stage of the AI life cycle. It can begin in the data very early, be reinforced through the way we design the models, and evolve even further through the way the systems interact with the world. That's why addressing bias requires a life-cycle approach, meaning we have to be thoughtful about how we use the data and the objectives we are optimizing for, and continuously monitor how these models behave after deployment. When organizations take this broader view, they move from simply detecting bias to actively designing systems that are more responsible, inclusive, and trustworthy from the start. 
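The feedback loop Sowjanya describes can be made concrete with a toy simulation (our illustration, not the panel's). The mild "rich-get-richer" boost and the starting exposure shares below are invented assumptions; the point is only that a small initial gap compounds once exposure drives engagement and engagement drives exposure.

```python
# Toy model of a recommender feedback loop: each round, an item's next
# exposure share is proportional to its current share raised to a power
# slightly above 1, a mild rich-get-richer dynamic.

def run_feedback_loop(share, rounds=20, boost=0.1):
    """share: dict mapping item -> exposure share (shares sum to 1)."""
    for _ in range(rounds):
        weights = {item: s ** (1 + boost) for item, s in share.items()}
        total = sum(weights.values())
        share = {item: w / total for item, w in weights.items()}
    return share

start = {"item_a": 0.55, "item_b": 0.45}  # a 10-point initial exposure gap
end = run_feedback_loop(start)
print(end)  # item_a's share grows from 0.55 to roughly 0.79
```

Nothing in the loop "intends" to favor item_a; the amplification falls out of the dynamics, which is exactly why monitoring after deployment matters.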

Agreed. Agreed. It can be self-perpetuating in AI, right? As we talk about, it starts at the very beginning, and then it just continues to grow. And as we interact with it, we're training it; we're just perpetuating that bias. Agreed. Cheryl, how are we doing here? 

We can see your beautiful face, but 

we still can't hear you. Oh, no. Alright. Well, Liz, can you jump in on that question, please? 

I most certainly can. Sorry, Cheryl, I'll take some of this off of your plate. So first of all, I've worked in multiple different areas; I've done seven ERP implementations and multiple RPA rollouts, and it always comes down to the bias in the data. Data bias is like glitter. Right? Once you get a little bit in there, it just permeates its way throughout the entire model, and before you know it, it ends up being a hot mess. For me, the data that feeds our AI models is a reflection of our past. And let's face it, our past isn't always pretty. We've seen it in some of our old deployments, and we're seeing it in our new deployments. 

And when we teach AI to think in a certain one-sided way, it's going to produce outcomes the same way. So if we don't bring enough color, enough imagination, and enough voices into the room, similar to what Sowjanya was saying, we are going to end up with a very grayscale design. If we are able to bring those voices and champions into the room, you're going to have a much stronger framework with much better usability downstream, and things won't get lost in an echo chamber of the dominant perspective, like the loudest voice in the room I was mentioning earlier. And then there's deployment. You can have the most perfectly designed AI model, and those unicorns of algorithms are going to be unleashed into the wild thinking of actual humans. 

And so we're not going to be able to build something that is perfectly humanistic without having a humanistic voice in the room. When I'm going through the design process, I always aim for an eighty-twenty model. I say, let's go for the 80% of what's happening every single day, then focus on that 20% that happens in real time and optimize your supply and your design from that given moment. Those are the unsung heroes preventing AI from becoming an expensive paperweight: really devoting and taking that time to invest in understanding the nuances of the everyday. Although the world is very scared that AI is gonna take over their jobs, humans come in with a certain perspective and a certain gumshoe that is needed in the everyday. And AI is not going to be able to pick up on that current that happens on the plant floor when you're running low on materials and you're hoping a truck shows up. 

AI is not gonna see that. You're gonna have your inventory clerk raising their hand like, hey, we've got a problem happening down here. And it's driving those conversations, and getting your engineers and developers onto the plant floor to have those direct conversations with end users, that are going to make your designs much more successful. 

And can I just add to that very quickly? 

Yeah. 

So, yeah, I think, you know, there's the human element. Human intelligence is essentially that tribal knowledge that we bring, and we're all subject matter experts in our own areas. A couple of quick examples: I work in the healthcare and life sciences space and have worked with data for a very long time. What AI is doing is learning from what we've fed it, or from what we haven't fed it. 

So, you know, a lot of the healthcare data sets, for example, reflect the health systems and populations whose information they've been fed. But the problem is that if certain groups of the population or the community have had, say, less access to care, then they'll be underrepresented in that data. Take drug studies: many, many years ago, women were historically excluded from certain drug studies. People in lower socioeconomic groups are likely to be less represented in certain datasets because they have less access to healthcare. But that's what AI is picking up on; it's learning from it. We, as the human intelligence, come along and say, well, look, we actually know those parts of the community and the population are being underrepresented, and there's naturally gonna be that bias, so we're still needed to provide those insights. 

So, you know, there's always gonna be that element of artificial intelligence versus human intelligence. 

Agreed. I think that is so important, and what I really appreciate is the conversation we're having here in this room about it. Because we're not hearing that when you listen to some of the executives of the AI industry; they're just not talking about this. They're not talking about the underserved. They're not talking about the underrepresented. So thank you for that. All right, Cheryl, I know we're continuing to try to hear you. I'm gonna go to the next question: as AI capabilities scale, what are the new risks or ethical considerations that leaders must prepare for? And, Pooja, I think we'll start with you on this one. 

Sure. It's a new world. We are entering a new era where the old systems that we built are constantly being redefined and broken. And we have to think about risk and ethical considerations very, very early on, as Liz and Sarah talked about, and Cheryl works in the security space; she's an expert in the field. It all starts with the data. Your models are only as good as the data you feed in; it's garbage in, garbage out. So the story starts with validating and building good input pipelines. We talked about biases in data. When training your models, you have to validate for that, making sure there is no bias in whatever data you're feeding in. You have to do that before you build the model and before you deploy it. And then after deployment, you also have to think about the guardrails. 

Now we are moving into a new generation where the systems are self-learning. They're learning on the fly. 

Right. 

So you might do some due diligence while building it, but after it's deployed and agents are learning, they can still break the system. So you have to think about liability. A recent example is OpenClaw, where agents were impersonating people, driving personal attacks, taking data, and taking over people's computers. There are strict warnings in enterprise environments: do not install it on your enterprise machines, and things like that. So you have to think about these protocols and risk considerations not after you build the systems; it has to be part of your design. And that is something very, very new to all of us, and we are all learning it together. 

Yeah. Agreed. Absolutely. Cheryl, the second part of this was supposed to be you, and I really wanted to hear what you had to say, because I know you are an expert in this field. Do you think you can connect on your phone? You wanna try that? Oh, no. Alright. Does anybody else wanna jump in with some thoughts on this? Consider the new risks or ethical considerations as we're scaling these applications. 

I wanna jump on the black-box mystery. So Pooja touched on agentic AI. I have personal issues with it, and I've worked with developers over the last two decades, so I know what it feels like. I'm not an engineer, but I've worked with plenty of them, and I've done a lot of my own designs through engineering. One of the biggest issues I've had in the past is I would go to a developer and say, why is it doing this? If we have a black-box design and it is determining whether or not I'm going to get a home loan, I want that black-box developer to tell me why. And if those engineers or developers cannot answer that question, off the cuff, with something actual and factual, I have an issue with agentic AI. 

And we just touched on that earlier. Once we deploy, if we don't have any type of monitoring of what those agentic bots are doing and making sure they are staying within those safe guardrails, that's a problem. I mean, we're starting to see, you know, ChatGPT is now being procured and used by our federal government. These are big things. Big decisions are being made, and they could have a lot of impact on our future generations. So that is one area I really wanna make sure we're highlighting: the black-box perspective of we're going to trust, but not necessarily verify. That is one area where we really wanna drive some true clarity, from an oversight perspective and from an ethical perspective. 

That is one area that I really wanna highlight with this question, because data privacy and our own personal futures are very much driven off of this black-box perspective in this day and age, and it could, and it will, have a dramatic impact on the future of the human race if we don't put some guardrails in place. 

I love that. And you're absolutely right, and we're hearing more about that. You know, it was interesting when you read about Anthropic and why they wouldn't do what they were being asked to do. And now ChatGPT, as you said, is in there having this conversation. So it's really important that we have the ethics and the guardrails on agentic AI. I agree with you. All right, we're gonna move to our next question, and this one is for Sowjanya. Share a defining moment when you influenced a high-stakes tech decision. 

Sure. And I love this question because I get to share my personal story. So one moment that stands out in my career happened very early, during a very large enterprise cloud migration initiative, where we were moving critical workloads from on-premises infrastructure to the cloud. The organization had a very ambitious timeline and, understandably, was focused on speed, getting systems migrated as quickly as possible so that the business could start benefiting from cloud scalability. But as we started reviewing the architecture, I realized that simply replicating the existing system in the cloud would definitely work; we could technically make it work. But it would also carry forward many of the same operational and scalability challenges that we were already facing and struggling with. So instead of doing a direct lift-and-shift migration, I suggested redesigning parts of the architecture to be more modular and more event-driven. 

And 

my argument was that this would allow the system to scale more efficiently, reduce operational complexity, and even support future innovation. The challenge was that the leadership wanted it as soon as possible; speed was the driving focus, but I was talking about additional planning, changes, and coordination across multiple teams. In that fast-moving environment, suggesting structural changes felt like leaning towards slowing things down, especially when the stakeholders were focused on immediate delivery. What helped me was shifting the conversation from short-term effort to long-term impact. I walked stakeholders through scenarios showing how a more modular architecture could improve resilience, how much it could reduce operational overhead, and how it could lower infrastructure cost over time. 

So, after several discussions back and forth, the leadership decided to move forward with the redesigned approach, and the result was a platform that not only supported the initial migration but also became the very foundation for new services the organization built later on top of that cloud infrastructure. 

And one interesting thing for me is that after the product succeeded, many people would jokingly say something along the lines of, well, that worked out; you got so lucky; thank God it worked. I remember thinking for a moment, and my response was that it wasn't really luck. 

Right. 

The decision was based on a clear architectural vision, a lot of lessons learned from dealing with large-scale systems, and careful research into how event-driven patterns would fit and could improve our reliability and our scalability. As I said, when you combine that experience, that data, and that thoughtful design, it might look like luck from the outside, but it's not; it's the result of deliberate preparation. That experience reinforced something important for me about influence and technology: it's not just about having the right idea, it's about communicating the broader impact and grounding decisions in evidence. I was talking to multiple skip levels and trying to influence those decisions, and having evidence really helped; I could help others see the long-term value of thoughtful architecture. 

So I would say that in high-stakes environments, influence often comes from the ability to connect technical insight with strategic thinking and clear communication. 

That's a great observation, and thank you for sharing that experience. I think we can all learn how to be strategic, but also how to communicate thoughtfully to a group of people with different interests. Cheryl, we're gonna try you. I think you came in on your cell phone. Let me see if we can hear you. No, we still can't hear you, darling. I'm sorry. Okay. Anybody else wanna jump in on that? Are there any other experiences you would like to share about being in a room where you influenced a high-stakes decision, and what you did to do that? Alrighty then. So we'll go to the next question. So, what's one strategy, and, Sarah, this is gonna be for you. 

What's one strategy that helped you build credibility and authority in rooms where you may have been underrepresented? 

Yeah, sure. So I think one of the main strategies that really helped me, probably early on in my career progression as I moved up into leadership, and it helped being in the world of analytics and data, so I naturally have that analytical ability, was just leading with evidence and insight. That hit every single time, more so than trying to lead with, say, your job title or your level of seniority. I was always someone that, and I don't wanna use the phrase imposter syndrome, it wasn't really that. 

It was just a case of, I've still got lots more to prove, and I've come so far, but let me prove to you with what I have: the evidence, the insight, the data behind what I'm presenting. And I never used my title or my position to try to force that. Working in the world of real-world evidence and health data, I really have learned that credibility grows when you bring clear data and thoughtful analytics. It's about the outputs; it's about the insights. But I think more importantly, it's the softer skill of translating that complex technical information into almost another language, and giving it to the people driving the strategy, the ones making the decisions. You're almost translating it, mulching and munging it into something that everyone can use and actually act upon. 

And so I've had various scenarios where, you know, I've been the newest person in the room, and I think for me it was always about consistently bringing those results and those insights and being able to do that translation. What I found was that when people start to trust you, they're kind of like, oh, even if you're not saying a huge amount, or you're not the loudest person in the room. You don't have to be the loudest person in the room. 

But if you are constantly bringing those clear and concise insights, that translation and helping to drive those decisions, credibility just naturally comes with you. So for me, I would say it's really about let your results do the talking. It's you know the loudest person in a room usually has the least to say and the most to prove. That's my personal opinion. So it's not always you know the loudest person the loudest voice in the room that carries the most weight. 100%. Certainly not for me. And it's really the one that speaks with maybe grace, clarity, with care, but also very importantly, it's about consideration for those other less heard voices in that room too. So even if you are in a position of leadership, absolutely, you're probably going to have some sort of input and, you know, you'll have a certain time to actually speak up or drive that strategy. 

What about the others? You've got other technical experts in the room and they may even not necessarily be the most confident. So, you know, be that again, that sponsor, that that sort of support, allow them to come forward and let their voice be heard. 

Can I add to that? Breaking that wall is one of the things I've found, as a female, to be a benefit. Most of us, I won't say all of us, because that's generalizing, but let me throw in my own bias, are good at breaking that barrier and getting folks to talk who generally don't want to talk. I've found that operations folks never want to talk to accounting and finance folks, and finance folks never want to talk to IT. Forcing them to sit in a room together and say, hi, let's kumbaya, has gotten me so much success in my career, just by making them talk to each other.

Yeah. 

And it's driving those collaborative conversations to reach that North Star goal that Sarah, and I think Pooja, was mentioning. We're trying to drive toward an end game, an end design. This is what we're trying to deliver. We think we can get there, but I need your support, and I need us to get along to make this successful.

And you need your input and their input as well. Yeah, absolutely. We've had a couple of webinars where we talked about the ability, and the opportunity, to share an idea in a way everybody can understand: coming from one perspective and sharing that idea, with credibility, with someone who may not understand the technicality of it. Sometimes it's just telling a story and using evidence in your outcomes to back it up. Cheryl, I think you joined in another window. Can we hear you here? Oh, no.

No. No. 

I don't think that's helping either. I appreciate that you keep trying, though, I really do. Hopefully we'll get lucky and be able to hear from you at some point. Okay, so we're now going to talk about scaling opportunities. Liz, this is a question I'd love to ask you. What systemic barriers continue to limit women's advancement in AI, engineering, and product?

I'm going to say it. Some might agree, some might not, but the good old boys' club is still kicking. It's alive and well. Yep, I'm seeing some nods and some smiles, so I'll say we have some agreement happening here. And I'll be honest, myself included, I've been passed over by powerful sponsors just because I'm seen as distracted: single mom, too much on my plate, not going to be able to give as much as others. Those opportunities weren't offered to me simply because of bias. Then there's the burden of the unlisted tasks, which I mentioned earlier. It is cumbersome, it is exhausting, and it happens to all of us. We tend to see the unseen things that need to happen and need to be achieved.

But they're very time consuming, they tug at your heartstrings, and they're exhausting. It's a part of leadership and management that might not get discussed. And if you show those emotions, you're suddenly seen as weak, as a female, when to me that is not a weakness. It means you care. It means you're willing to put yourself forward, not as a sacrificial lamb, but as a powerful voice for somebody who might not be that powerful just yet. When I'm showing emotion, that tends to be what I'm doing: speaking out for others or for myself. And then there's just having so much to do and so little time to do it, managing all of those timetables, and working for companies that are willing to work with you, that recognize you might miss that 9 AM meeting but you're going to read the notes and follow up on the tasks, because you had to take your kid to the doctor that day.

You know? The work-life tightrope. I've worked in the Fortune 500 world, I've worked corporate, I've worked non-corporate. The demand is constant. That constant availability, that 5:30 PM phone call when you're trying to get dinner going and you've got sports practice happening. You're dividing and conquering in most cases. But even when you're dividing and conquering, we don't necessarily get credit for the dividing; we're just conquering. And then there's truly breaking those chains. Everybody has good intentions, but the reality is there's a lot more on a female's shoulders than might be acknowledged. I don't want to undercut any of the male counterparts who might be watching, but we have a lot on our shoulders, a lot on our plates, a lot of unspoken things we need to work through to drive change and to drive action. Getting a systematic handle on that is one of those areas where the future of AI is going to help, by making better use of those talents.

We don't need to scale back our ambitions to be successful because we're parents. We don't need to scale back our ambitions because we take on those unspoken, unmarked tasks. We should be credited for them. We should be recognized for them. We should be seen as the person who planned the party or the lunch, who got the team together because you could see they'd been working in the trenches for weeks on end, exhausted, and they just needed a laugh. For me, one of the biggest culprits is that lack of recognition for the extra effort a lot of females have to put in to get recognized. And also the name change thing. I myself have gone through a name change, and name recognition is a big part of our careers.

If you change your last name mid-career, you lose a lot of the reputation you built in your first ten years. You have to go around teaching and reminding people: hey, I'm the Liz Spurgeon you worked with five years ago. You might not remember, but I'm still the same human. And that's the reality of what we're fighting with today.

I've lived that one as well, Liz. When I got married, I hyphenated my name, and then after about six months I said, for business I'm just going to use my maiden name, because this is how everybody knows me, and I'd been doing this for fifteen years. So I absolutely understand that. And you said some other really critical things too. I sometimes hesitate to say this, and maybe Sarah can share her perspective, but here in the US I feel like we're going backwards, not forwards. So it's going to take powerful women, and it's going to take our voices. And going back to what we were talking about before, being influential and having impact means leading with those outcomes.

So they understand the difference. Sarah, what are your thoughts? You're in the UK.

Yeah, absolutely. From some of the research I'm close to, through some of the groups I lead and network with: yes, we still have the gender pay gap and gender inequality, certainly within the technology industry. I can't remember exactly, it was probably a Harvard paper, but apparently there are still not enough women in the tech industry, regardless of continent. And what's actually happened in the last decade or so is that fewer women and girls are even entering the industry. So it's not just a case of women not getting anywhere, or not being given the right promotions at the right time, or not being recognized.

They're now not even considering it. So I think there's an even bigger piece around working with the earlier years. Some of the work I do in the UK around STEM is talking to schools and colleges, and certainly to the girls: look, you might not consider it, you might not think science, technology, engineering, and math are sexy subjects, but trust me, there's so much here.

I came into the world of what I'm doing now, healthcare, data analytics, life sciences, IT, purely by accident. I was going to be an artist; I was an art student, and it happened by accident. But I've been here twenty-seven years since, so I must be doing something right. I think there's a lot we need to do to encourage more girls and more women to consider coming into this industry. And for any organization to be serious about balancing the scales, we need to move beyond good intentions to measurable change. We all do our goals and objectives, and everything has to be measurable.

This is exactly the same thing. Even at the industry level, it still has to be measurable. And it's not just about hiring women into tech roles; it's about ensuring they are represented in leadership, represented in the roles they want to be in, involved in the strategy, involved in AI governance, the whole kit and caboodle. Just getting more women in the door is not a tick-box exercise. It's about supporting them and ensuring they have that level of representation. And I'm going to say it again: sponsorship. You really need those people who are in the rooms where you're not, speaking for you, standing up for you, saying, I know Sarah's not here right now, but she's brilliant at this, she can help you with that, have you considered her?

We need more of those sponsors; it's so important. It's really about ensuring that we get and keep those high-potential women as senior leaders, and that we advocate for them not just in promotion discussions but in other future opportunity discussions. And that's not for one specific country; it needs to be global. I know there's a lot happening politically, certainly within the US, a lot of sunsetting of certain equality programs, but it remains a problem. Organizations can individually still act and still make a change. We don't have to solve world peace; step by step, even just having this discussion is enough to make little changes in the right direction.

DEI is not just meant for your website. DEI is essential for a safe workplace. 

Yes. I agree. 

Yep. And coming from my perspective, I talk to a lot of companies every week. As you said, there are still a lot of individual companies that are very committed to this. One organization said to me, look, we want the best people for the job, and we're going to look for them everywhere, not just in the one place we've been told to look. So, Cheryl, I know this is near and dear to your heart, because it's something you actually do: supporting the underserved in accessing STEM and building careers in it. I am so sorry we can't hear from you, but I'm happy you're still here. So I have an open question for the group now.

If we are to succeed in truly balancing the scales, what measurable shifts should organizations commit to in the next twelve to twenty-four months? Something tangible. Any thoughts on this?

I think it comes oh, 

go ahead. Go ahead, go for it. Sarah and

I are tag teaming. Yes, I think it comes down to measurable successes. I'm a black belt, so for me it always comes down to the percentage points: putting something actively and tangibly in place to say, I want to increase X by Y by a certain date, for this reason, and really believing in it. As I said, don't just put it on your website, believe it. Put people in charge of it and be willing to pay their salaries, because those are the folks who really drive inclusion and really drive change in an organization. It forces conversations in places where most people are afraid to have them, and it brings into the room folks who are sometimes afraid to be in the same room, or don't have the opportunity to be.

In ERGs, I've been able to work with marketing folks, IT folks, operations folks. I've met people I never would have considered meeting, during ERG sessions. So for me, it's really about putting tangible numbers behind what your website is saying.

Absolutely agree. I was going to say pretty much the same thing, Liz. We're reading from the same lines.

Yeah. I think what you're saying is absolutely right, Liz. It's defining what the goal is, what you're trying to accomplish, and then having the steps you're going to take, that you're committed to and intentional about, to actually hit those outcomes. So it's understanding at the very beginning what the things are that actually balance the scales, and how we're going to do them. And another piece you've all talked about is involving everybody in making that happen, because when you have allies and sponsors and mentors, we tend to accomplish more in that direction. Okay. Cheryl, you're saying a host may unmute you at any time. I know, I tried to unmute you. See, now you're muted. Now you're unmuted.

I'm not sure why it's still saying that. Okay. So, Sarah, we had your thoughts on that. We're going to move to another topic we wanted to explore: thriving through tech disruption. Liz, this is for you. In high-growth environments, how do you protect well-being while driving performance?

It's ruthless prioritization. Honestly, that's what it comes down to for me. I always get a ton of requests, a ton of tasks; there's a lot I need to get done during the day. But it's really about figuring out what I need to get done and how I can do it most efficiently, and using the tools we have today, your Grok, your Claude, whatever ChatGPT-style tool you want, whatever Excel spreadsheet you can. It's driving clarity from chaos, and it's that ruthless prioritization that keeps me sane. And it's putting my family first, making sure my daughter always comes first. If she's sick, I stop, drop, and roll.

I know that whatever business I'm working in, the last one was wine, the wine is going to be there. They're going to be making wine tomorrow, and in three weeks. So for me, it's that ruthless prioritization, and it's my own well-being too. If I feel burnout coming, I take a break, a breather. If I need to step away for a couple of hours in the afternoon to go wander among my flowers, to find some calm and think through processes I know I can't work out sitting in front of a monitor, I'm okay taking a step away. For me, that's the mental decompression we sometimes need. And with AI coming, one of my concerns is that we keep saying we're going to take away 30% of people's busy work.

For me, that 30% of busy work I used to do back in the day was my decompression time. Those were the moments I'd key in data just to calm myself before making strategic decisions. That is something we as leaders need to think through and be thoughtful about. If we're taking 30% of people's busy work away, we need to give them some mindful moments to gather themselves, so they don't feel they have to churn like a machine. Humans are not robots, and we need to think through the emotional toll of what we're designing.

Agreed. Pooja, any thoughts on that?

100% agree with everything Liz said. As Liz said, burnout is a thing. I am very highly driven, and when I'm passionate about something I can work twelve or fourteen hours a day. So I've been burnt out many times, and I have learned from it. Now I watch out for it, and I coach my team members: hey, don't run at 100%. Don't burn the candle at both ends.

Right. 

Let's think about sustainable performance. In technology, cycles always have peaks and lows. Sometimes you're driving toward a deliverable and everyone is all hands on deck, but then take a break. Don't treat it as, okay, straight back to business again. You need to think about your sustenance. So I think about sustainable performance, not high performance measured only in impact numbers. You cannot keep churning numbers on top of numbers at high performance all the time. You need those moments Liz was talking about: thinking moments, deep-work moments.

Yeah. 

So I encourage that. I coach my teams, and I say no to work. When I'm building roadmaps, I teach my teams: yes, there's a lot coming our way, but you have to say no so you can prioritize the hardest things.

Agree. Yeah. 

In technology and in large teams, there's often a tendency to say yes to whatever looks low-hanging. So you do that, but you never get to the hardest task, because you're so burnt out doing the low-hanging work. Hey, this is simple, this will take you one hour. But that one hour is exactly what's required for the difficult job that has been sitting for

Yeah. Months. Nothing's ever simple. 

No, nothing's ever simple, right? Totally agree. And I think it's about giving ourselves permission to protect our well-being, recognizing that we're not robots, that we have thresholds and limits, and giving ourselves the opportunity and permission to take a break when we need to. I know for myself, when I do that, I'm far more productive when I'm actually working than when I try to just push through. You've got to give yourself a bit of a break. Okay. So our last

just one thing I will add there, sorry to interrupt.

No. No. 

Lead by example. As leaders, we have the ability to create that space for people. I have a 10-year-old daughter, and I mark my calendar when I have to pick her up or drop her off, when I need time for childcare. And I encourage everyone to do the same, to log off at a sustainable time if they need to. Leaders around me do that too. Take time off, and take longer time off, not just one or two days; take a week, and it's okay to do that. Leading by example sets the right expectations and empowers and encourages your teams, and others around you, to do the same. So role modeling is very, very important.

I couldn't agree more. I absolutely agree. Okay, we're running close on time, but we have a few questions from the audience. One is: I'm wondering if any of the panelists have observed or thought about the social norming of large models, perhaps as a reverse bias that coerces different voices into preexisting or majority behavioral patterns. Any thoughts on that?

This is actually a great idea.

Yeah. 

And it is helpful. As I talked about, think about your training pipelines: in your training pipelines, you have to include both kinds of behaviors, some examples that say, hey, this is affirmative, correct behavior, and some that show, hey, this is negative, this is false. I think this idea is about the same thing, applying a reverse bias. When you're building your training datasets, make sure they're balanced, with good examples of both: this is correct behavior, and these are negative behaviors.

And use them as 

an Mhmm. 

So I encourage everyone, and this is something I encourage people to think deeply about: what are you inputting? How can we change that to change the output behavior of a model?
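[Editor's note: the balanced-dataset idea the panelists describe can be sketched in a few lines of Python. The labels, example texts, and skew threshold below are illustrative assumptions added for this write-up, not details from the panel.]

```python
from collections import Counter

def check_label_balance(examples, max_skew=0.6):
    """Report each label's share of a training set and flag any label
    whose share exceeds max_skew (i.e., the set is imbalanced)."""
    counts = Counter(label for _, label in examples)
    total = sum(counts.values())
    shares = {label: n / total for label, n in counts.items()}
    skewed = [label for label, share in shares.items() if share > max_skew]
    return shares, skewed

# Tiny hypothetical dataset pairing behavior descriptions with labels,
# mixing affirmative (correct) and negative (undesirable) examples.
data = [
    ("responds respectfully to all users", "positive"),
    ("amplifies only the loudest voice", "negative"),
    ("credits ideas to their originator", "positive"),
    ("dismisses input from junior engineers", "negative"),
]

shares, skewed = check_label_balance(data)
print(shares)   # each label's fraction of the dataset
print(skewed)   # empty list when no label dominates
```

A check like this would sit at the data-curation stage of a training pipeline, before any model sees the examples, which is exactly where the panelists locate the opportunity to counteract bias.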

Mhmm, agree. Wonderful, thank you for answering that question. We have one more here, and I think it's a really great one when we talk about leadership: I've seen programs bringing women into tech, and I've seen executive coaching programs, but I see a gap for mid-level women trying to become senior. What are your thoughts and advice on that? How do they break into senior leadership from a mid

level? Be your own advocate.

Yes. And adding to that: be your own advocate, but that's also where your networks come into play. Your networks, your tribe, your army, whatever you want to call them.

Build a solid sisterhood. 

Exactly. I 100% agree, and I've also found that as you go from the junior level into middle management, you sit there and go, oh, it's a bit lonely here. I used to have big teams, friends, and colleagues around me, and now there's only a small handful. Then as you go up into directorship, you go, oh, okay, so it's just me then. Fair enough. But then there's that gap. From a lot of organizations you'll get the mandatory training of, right, okay.

To be an effective manager you must treat people this way, you're going to learn this, these are the processes. And you go, great. But then what? And there isn't any help with the "then what". So from my own experience: use your network. I've always tried to instill this in everyone, even from the first time they ever get a job: this might not make sense now, but trust me, in ten, twenty, thirty years, this network, you'll go, oh my goodness. They will be your core people. They could be completely external, or they could even be members of your family. It can be absolutely anyone.

You'll use them to bounce ideas off: am I doing this right? Do you think it's a good idea? And sometimes they'll tell you something completely different, and you'll go, well, I'm going to do it anyway. That's what you need. That network really, really does help; it's just so crucial.

And they've seen you in the different phases of your career. I've been friends with folks who knew me twenty years ago, when I was just a production assistant. I still talk to them, and they say, you speak differently. And I'm like, well, it's been twenty years. But we go back and reflect on some of the projects we had back then and how we handled them. And I saw someone put a question in here: what if we only have male sponsors? It's fine, male or female. Sometimes it's good to have a male sponsor, because you can teach them to be a solid ally; they don't necessarily recognize what they're doing.

It's ingrained in how they were raised; it's in our society. It's a completely invisible thing they're doing that they don't see, unless they have a female saying, hey, I'm here, I'm struggling, I need your help. And a lot of them will say, oh man, she's been overlooked, I'm going to help her. So don't be afraid, whether they're male or female. Sponsors and mentors are there. If they don't help you in one area, they'll help you in another. There's a reason they were brought across your path.

And diverse thoughts, opinions, angles. I know we're actually slightly over time, but I really want to throw this in. One great example, a fun fact, though I don't know that it's a fact: I was at the House of Commons, the UK parliament, for an event about getting more women to stand for parliament. Again, another very male-dominated area. And this oldish guy came in, one of the UK's top endocrinologists, and he spoke up and said, I just want to tell you all a fun fact: if we had more women in power across the world, we would have far fewer wars. And everyone went, what? And I'm sort of half joking.

And he goes, no, there's actually a scientific background to it. It's to do with our hormones; we tend to have a more nurturing, solutions-driven approach, whereas with the guys and that level of testosterone, it's okay, right, we'll just solve it. And I was like, wow. That stuck with me for a long time, to the point that I've gone off wanting to know more about the scientific research behind it. When he announced it, I thought, that is an amazing statement.

Yeah. 

You know, it's absolutely that. 

I had a male colleague early in my career tell me to diversify my resume, not to pigeonhole myself into one given space, because he knew if I stayed in purchasing and operations, I was going to limit myself. But if I diversified my resume, my voice was going to mean more to the others in the room. And that came from a male colleague. Twenty years ago, I don't think I would have found that in a female mentor. Right. I hope

he's in your network. I love that. Yeah. 

I really love that advice. 

Diversify, diversify your, yes.

Diversify your resume, yeah, and follow those trends. It also speaks to the evolution of personalities: something you want to do in your twenties, you may or may not like in your thirties. You don't want to be out of options. And diverse perspectives bring you more allies, more peers, more colleagues, and more avenues. So it is very, very important.

I think that's a really great way to wrap this up, and I want to thank you all. And, Cheryl, it's so disappointing we didn't get to hear your insights and perspectives, because I know

We met her. She's fabulous.

I know, she's wonderful. Cheryl, we're going to figure this out, and we're going to invite you back for a conversation. Alright, everybody. This was a powerful conversation, and I thank you. We've explored how bias enters systems and how it can be prevented. We've heard what it takes to influence high-stakes decisions. We've discussed the barriers that still exist and the measurable shifts we need to make to dismantle them. And we've imagined a future where leadership truly reflects the world we are building. If there's one thing that stands out from this conversation, it's that balancing the scales is not symbolic; we're all committed to that. It's structural, it's operational, and it's a leadership choice.

And it shows up in who gets funded, who gets promoted, who gets hired, and who gets heard in the conversations. So to our audience: I invite you to take one idea from today's discussion and apply it immediately. Maybe one hiring decision, one design review, one sponsorship or mentorship conversation, one leadership opportunity you might extend to somebody else. Balancing the scales doesn't happen all at once; it happens decision by decision. I think Sarah said that at one point. And to my extraordinary panelists: thank you so much, not just for building technology, but for building it with intention and integrity, and for sharing your thoughts and perspectives with us.

And to everyone joining us around the world: let's continue building a tech ecosystem where inclusion isn't aspirational, it's foundational. So happy International Women's Day, everybody. Please join us on April 2 for International Women in Tech Day, hosted by the WomenTech Network, for another exciting, inspiring, and informative panel discussion celebrating leadership and innovation. Thank you all for being here. I really enjoyed our conversation and look forward to continuing to connect with you outside. Have a great day, everybody. Thank you so much.

Thank you everyone. Bye.