Strategies for driving organizational change; insights from fighting the good fight in an industry that is slow(er) to change

Pratibha Basrao
VP, Applied Technology Office Director
Alexandra Ciobotaru Nørballe
Head of Automation Innovation
Donna Huey
Chief Digital Officer, Sr. Vice President
Nisha Thomas
Director, M&A Infrastructure Integration

Automatic Summary

Unlocking Technology Adoption in Traditional Industries: Insights from Tech Leaders

In today’s rapidly changing world, technology adoption is paramount, even in industries that are often characterized as "traditional." From architecture and engineering to oil and gas, many sectors face unique challenges in integrating modern technology. In a recent panel discussion, technology leaders shared their insights and experiences on overcoming barriers to technology adoption. Here’s a summary of their valuable wisdom.

Meet Our Expert Panel

  • Pratibha Basrao - VP, Applied Technology Office Director, HDR Engineering
  • Donna Huey - Chief Digital Officer, Atkins Realis
  • Nisha Thomas - Technology Leader, Oil and Gas Sector
  • Alexandra Ciobotaru Nørballe - Head of Automation Innovation, Maersk North America

Challenges to Technology Adoption

The panel discussed several significant barriers to technology adoption in traditional industries:

  • Cultural Resistance: Many organizations lack a culture of continuous learning, hindering their ability to embrace rapid technological advancements.
  • Financial Constraints: Financial reluctance can prevent companies from investing in necessary training and improvement initiatives.
  • Need for Proof of Concepts: Leaders emphasized the importance of piloting new technologies on a small scale to mitigate risks and validate assumptions.

Strategies for Overcoming Barriers

The panelists shared effective strategies to foster technology adoption:

  • Continuous Learning: As Donna pointed out, organizations need to institutionalize continuous learning to keep pace with technological change.
  • Start Small: Alexandra suggested focusing on proof of concepts. Conducting small-scale tests helps avoid disruptions in core operations while evaluating new technologies.
  • Engage Stakeholders: Nisha highlighted the importance of involving key stakeholders in the decision-making process to champion new technologies.

Securing Leadership Buy-In

Donna emphasized the need for multi-layered engagement when convincing leadership to invest in new technologies. It is essential to tailor the message to match the audience's values and concerns, whether they be executives focused on increasing shareholder value or managers worried about productivity metrics.

Measuring Success in Innovation

Effective measurement of innovation success is crucial for securing ongoing support:

  • Establish Clear KPIs: Set clear performance indicators to quantify time savings, cost reductions, or improvements in safety and efficiency (a worked example follows this list).
  • Utilize Data-Driven Insights: Leveraging analytics and dashboards allows organizations to track productivity gains from new technologies and make informed decisions.
  • Focus on Tangible Benefits: As Nisha mentioned, understanding how new technology impacts profit and loss statements is vital for demonstrating value to stakeholders.
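To make the measurement point concrete, here is a minimal sketch in Python of how logged tool usage and a per-use time-savings benchmark (such as the six-minutes-per-use figure Nisha describes in the transcript below) might be rolled up into a dollar figure for a KPI dashboard. The labor rate, usage counts, and target are illustrative assumptions, not figures from the panel.

```python
# Minimal sketch (hypothetical numbers): rolling tracked tool usage up into a
# dollar figure that can be reported against a KPI target. The six-minutes-
# per-use benchmark mirrors the figure mentioned in the panel; the labor rate,
# usage counts, and target below are illustrative assumptions, not real data.

MINUTES_SAVED_PER_USE = 6        # assumed productivity gain per AI-assisted task
BLENDED_HOURLY_RATE = 85.0       # assumed fully loaded labor rate, USD per hour
QUARTERLY_TARGET_USD = 50_000    # assumed KPI target agreed with leadership


def quarterly_savings(uses_per_week: int, active_users: int, weeks: int = 13) -> float:
    """Estimate dollar savings from logged tool usage over one quarter."""
    total_minutes = uses_per_week * active_users * weeks * MINUTES_SAVED_PER_USE
    return (total_minutes / 60) * BLENDED_HOURLY_RATE


if __name__ == "__main__":
    savings = quarterly_savings(uses_per_week=25, active_users=40)
    print(f"Estimated quarterly savings: ${savings:,.0f}")
    print(f"KPI target met: {savings >= QUARTERLY_TARGET_USD}")
```

The point is not the specific numbers but agreeing on the formula up front, so the dashboard answers "what does this mean?" before leadership has to ask.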

Lessons Learned and Personal Reflections

The panelists offered essential lessons from their careers, aimed at helping the next generation of tech leaders:

  • Ask Questions: Alexandra emphasized that there is no such thing as a stupid question; asking for clarification is crucial to understanding.
  • Value Failures: As Nisha shared, embrace failures as learning opportunities; they can provide pathways to innovation.
  • Focus on Collaborative Change: Donna advised focusing energy on collaborators who are eager for change rather than trying to appease the naysayers.

Conclusion

Technology adoption in traditional industries poses unique challenges but also offers opportunities for innovation and growth. By fostering a culture of continuous learning, piloting new technologies, and engaging stakeholders, organizations can overcome barriers to change. The insights from the panelists underscore the importance of adaptability, collaboration, and measuring success in paving the way for the technological future of traditional industries. Embrace change, ask questions, and remember: every failure is a step closer to success.


Video Transcription

Alright. Well, let's go ahead and get started. Hello, everyone. Really excited to be here today. We have a great topic and some phenomenal technology leaders from across various industries, ready to engage with you on this topic. So we'll just start with a quick round of introductions, and I'll introduce myself first. My name is Pratibha Basrao. I work for HDR Engineering, and I lead an office within the AEC sector that manages technology and makes technology decisions for all the professional services that we provide to our clients. And with me today, we have Donna Huey. Donna is the chief digital officer for Atkins Realis. She has over thirty four years of experience, and like me, serving clients in various industries.

And among the clients that she has listed, of note are the US Army Corps of Engineers, San Diego Council of Governments, the New York MTA, and what she calls a small, mouse-like entertainment company in Orlando, Florida. We also have Nisha Thomas with us. Nisha has been leading technology integration and innovation for over twenty years. She has experience in the oil and gas industry, airline consulting, and most recently in mergers and acquisitions. She has navigated complex organizational changes and driven technology adoption. So very excited to have you here, Nisha. And finally, we have Alexandra Ciobotaru with us. She is the head of automation innovation for Maersk North America, and she leads efforts to enhance supply chain resilience and safety by integrating cutting-edge technologies into scalable, customer-centric solutions. She spearheads innovation initiatives involving robotics and AI, which have the potential to disrupt the logistics industry. So very excited to have you all join us.

And our topic today is really about, you know, what technology changes, we see in in traditional industries and some of the unique challenges we have, when not being in a tech tech focused industry. We're more kind of old school in many ways, like architecture, engineering, railroads, oil and gas, shipping. We have a different set of challenges. The the people we deal with, the clients we deal with, the industry we're a part of, are sometimes a little slow to embrace, technology. And we as technology leaders within this context, have a lot of shared experiences. And while we were preparing for this, this talk, this panel discussion, I was just amazed at how much more we had in common across, our our industries and how much shared experiences we have. So we hope to bring some of that, into, our discussions today, and hopefully, it will inform, all of you, our audience, and and give you something to take away.

So, Donna, I'm gonna ask you kind of our first question here. And, really, it is it is about our challenges and and barriers. So what do you see are some primary barriers to technology adoption in our industries? And are these are these cultural, financial, something else? Like, what is it that makes it, unique for us?

Thanks, Pratibha, and thanks for having us here today. You know, the first thing that comes to mind for me, from challenges and barriers is just the striking the balance between time and pace of change. I have been in the business for a few decades, and so I've seen that pace of change change pretty dramatically. And there used to be a time where, you know, you you had the time, you know, the luxury of time to learn a new skill. And in today's world, the pace at which technology is changing so rapidly, skill development has to be continuous. There can't be, I'm gonna go to, you know, a training class for a week this year, and I'm gonna learn all the new stuff I need to know, and I'm good till next year.

That's just not the same anymore. And generally speaking, I don't think most organizations have caught up to the fact that training and continuous learning need to be institutionalized in order to maintain a competitive advantage for yourself as a professional and for your organization to stay competitive in the marketplace.

So that is one of the biggest barriers and challenges to adoption today, as I see it: we're not ready to adopt because we haven't institutionalized that continuous learning. We haven't given ourselves the time to develop those new skills. And so we have to get past that, you know, time and pace barrier in order to have change and adoption, you know, more readily in our industry.

Alexandra, what do you think from the perspective of, you know, some effective strategies? Because you've talked about some of these similar challenges in your industry and in your company as well. What are some effective strategies to kind of break down those barriers and get past those?

Thank you so much, and thanks for the invitation. Really excited to be here. I would say that I want to start by kind of setting the tone that though we are not a technology company, if you ask any of my colleagues, they would say that we are tech enabled or we aim to be as much as possible. So a technology road map is something that we always keep in mind when we create new business opportunities or when we create any new products and and technologies for our customers. And the way we do this, especially when it comes to logistics and services, which is where I I focus most of my time now, we look at it through the lens of, can we test this in a smaller container, in a contained environment, maybe even, you know, separate from the from the core operation so that we don't disrupt something that really needs to be continuous.

And then we test the assumption that this new technology or this new process would work in such a big and complex structure. I think for us, the challenge is not necessarily the fact that we do not want to add new tech and new solutions to the table. It's just that whenever we make a small change, it impacts really everything that we do. So we need to be careful, and we also need to, have a plan. I would say, in terms of how we do that, we really, like to do proof of concepts, as in we do have a strategy for it. This is also part of my core business. And, by having proof of concepts, you basically take the assumption, put it in an environment, have some success criteria, and all of these are aligned with a bigger goal.

So we're testing this because our goal is to prove the XYZ business case that we have set up together with our business stakeholder. And it's very important to know that it comes from the top, meaning that we have that sponsorship and that autonomy as well to come up with some of the next technologies that we would like to test. But they're all aligned with the mission that we're trying to achieve. So everything, I think, needs to be, to a certain extent, aligned when we're doing some of these disruptions, because it will, in the end, really potentially change the way that we do business in a few years.

Very cool. Nisha, you're from the oil and gas sector. What what is your take on this, on this topic?

Yeah. Some very common themes. Again, thanks for the question, Alexandra. So first off, we need to overcome the initial doubt of whether a new technology can actually work. So Alexandra mentioned proof of concept. So a very similar theme for us is piloting technologies before it's applied, on a larger scale. It's almost like, you know, trying to convince your grandparents that their old dial up Internet should be replaced with high speed broadband. The best way to do it is by starting small and testing the waters. You really want to create a successful pilot project, something that will showcase the potential benefit clearly. It's crucial to let the results speak for themselves. Get others, especially those within the business, to become advocates for the new technology rather than the IT team speaking about the technology. Let the business be the champions.

Let the business be the evangelists for it. So their success stories will convince others. Think of them as your technology cheerleaders chanting for the new system. And, of course, once you have the momentum, you can scale up from there. It's about building believers one step at a time, starting with a small successful pilot and growing from there.

Yeah. Well, I can see, you know, starting starting small and and doing that proof of concept is is definitely very valuable. But at some point, if you're trying to do major, you know, transformational things, right, bringing really big technology components, like, making it making your company very data centric or or leveraging AI or something, like, transformational from that perspective. That that requires a lot of, you know, high level c suite kind of engagement. So I wanted to ask you, maybe, Donna, maybe you take this one on. Having that kind of high level stakeholder engagement, and buy in, like, really thinking through, you know, something longer term. Maybe it's a one year, three year strategy. How do you go about convincing your leadership that that is something worth undertaking? And it may be, you know, it may have a financial implication there or may, you know, you you you may have to train and you talked about skill development being a major hurdle for us.

It may require some significant time or or money investment.

Well, I think one thing that's really important to remember with any type of change management, particularly in tech adoption, where, as Alexandra said, you're sometimes, you know, taking those risks to try new things out and doing those proofs of concept, is to really know your audience.

And that stakeholder engagement has to be multilevel. You can't just have one story. You have to have a story that matches the executive level person who might be thinking more about, you know, what's gonna help them get to the next level of their career or what's gonna help increase shareholder value for the organization versus the way in which you need to pitch the change or tell your story to a mid level manager who's just worried about, you know, hey.

I'm getting I'm getting pressure for productivity or I'm getting pressure for my financial performance at the end of the month. Any risk I take is gonna have an impact on that bottom line. They're gonna wanna know more about how are you gonna mitigate my risks. And so I think multilevel is really important to understand that you need to have a story that matches and really get to what what that person, values. And I've just one funny, anecdote to, something, you know, Nisha said about finding, you know, what, you know, what motivates people to change. I frequently share a story about my my dad, and this is a while ago, because my my children are adults and grown from the house. But my dad thought having a cell phone was the silliest thing in the world. Why would someone wanna carry a phone in their pocket?

But the second he realized he could engage his grandchildren with a text message and they would communicate with him, he never let that phone go. So I I think it's it's finding that place of value to your to the stakeholder you're trying to engage to create, you know, that change, whether it's your, you know, your your dad trying to use a cell phone or whether it's a CEO you're convincing, you know, for a multimillion dollar investment in new tech.

It's it's finding that sweet spot of what's gonna motivate them to to move the needle with you.

Yeah. Alexandra, I was gonna ask you. You've talked about potentially doing proof of concepts and pilots. What makes a successful pilot, and, okay, kinda, what should we expect? We talk about, you know, failing and failing fast being a good thing whenever you're trying out something new, and that's the value of doing a proof of concept. But, realistically, how does that look, and how does that work to get people either excited or to sometimes prematurely give up on something because the proof of concept failed, and so you won't try it again for the next five years? How do you

Absolutely. Yeah. Thank you for that question, Pratibha. I think that the role of innovation in a company is to test assumptions so that we don't invest in the wrong thing. So I definitely see myself as a function that supports the whole ecosystem within, you know, my stakeholder map, where if you have an idea, come to me, and then together, we can figure out if we have a business case, if we have a return on investment, if we can scale this beyond, you know, the desire to see if it's cool.

And if it doesn't work, are we then avoiding a cost that would not give us the benefit that we want? So I always take into account the fact that I prefer, and I encourage everybody in my stakeholder group, to think strategically and align that once we wanna test something, we have opportunities to scale. Right now, I look at my horizon as 80% pull and 20% push. What does it mean? It means that 80% of the time, I pull ideas from within the organization, from actual opportunities or problems that we're trying to solve for. 20% of my time, I come up with something disruptive, and I say, what if we do this huge, you know, robotic solution for one of our sites and see if it works at scale, but we test it in this particular location? It's something that I wasn't told to do, but it's, like, super interesting.

But if we test it, we would know early on if this is something we should look at in the long run and understand if it has potential to scale as a result of the success criteria. So I start from that kind of strategic, overview. After that, based on the engagement, I personally really think that it's important to know your stakeholders. So I prioritize meeting my colleagues, talking to them a lot, understanding what are your pain points, why are we even looking at this problem, is it truly a problem, or was it that someone mentioned it would be nice if we would know more about the industry? Because then we can do a market research. We don't need to test. But if we're trying to see if something works at scale, then we we should have our own assessment of this, of this use case, which is why I encourage, everybody within the business, but also all of my innovation peers to have, in a way, our own company KPI and our own company process and not necessarily copy someone else's recipe because it might not work in your organization.

So whatever works in other companies, of similar industries or different industries, might not work for us, which is why we created our own innovation process. We engage the stakeholders very early on, so we make sure that we work towards problems or opportunities that can actually solve something for us or for our customers. And after that, we have a proof of concept process where we go from idea to business case to return on investment, actually deploying that technology at a site, or if it's software, deploying it within an environment that's contained. And then from then on, checking within a few weeks to a few months, depending on how long the POC will last, whether the success criteria have been reached. And you did mention, Pratibha, something important. We don't aim to succeed at everything. Because if we succeed at everything, it means we're not tackling the right issues. We should also fail so we don't invest in the wrong technology.

And right now, the current industry trend is that you would have around 70% success rates on your innovation projects and 30 to 35%, or 40, depending on how you look at it, of failed POCs. And we take that as our benchmark as well. Obviously, you know, the more we test, the more we see. But I would think it's a good reminder that your senior stakeholders especially should want you to fail and have some results of trying things out that didn't work, so that you make sure you don't make that investment in maybe something that won't work in the long run.

Very cool. Nisha, I wanted to ask you, because amongst all of us, you've dealt with mergers and acquisitions way more than probably the rest of us. And change management is especially hard in that scenario. Right? And throw on top of that, you know, different technology stacks perhaps, different approaches, a different pace of change that people are used to. What are some of those unique experiences you've had, and how have you managed to navigate it, you know, leading technology innovation or changes through companies that are perhaps merging and are at a different pace of, you know, thinking or aligning on the technology front?

How do you manage that?

Yeah. You know, to start with, I really love Donna's example of getting her dad to use the cell phone. And why I say that is because, see, when it comes to getting everyone on board with new technology, with the merger and acquisition, it's all about the people. So it may sound funny: I'm not gonna speak about the technology first. Let's speak first about the people. First, I think it's very important to understand the different organizational cultures between the seller organization and the buyer organization and how they work. It's very imperative to talk to them and find out their ways, to make sure the transition feels smooth for the people that are coming over from the seller to the buyer organization. So once you handle the human side, merging different technologies or adopting new ones over the M&A becomes easier. So change management gets a lot simpler when everyone is comfortable, as you all know.

So that said, in the merger and acquisition area as well, we are surprisingly using some innovative ways. The recent one, generative AI, is top of mind for everyone. I have a very technical group of folks who do some significant work. So mergers and acquisitions, we do multiple, sometimes in one year, across my organization. The one that we are doing right now is worth over $15,000,000,000, bringing forward multiple different organizations. My team has surprisingly embraced generative AI to accelerate mergers and acquisitions work. So we are managing hundreds of applications, petabytes of data, transitioning again over 300 locations into our organization. So if you think about the size and scale of most mergers and acquisitions, we are dealing with vast amounts of information in the data rooms.

Where we are applying generative AI is in helping us infer from those amounts of information. It's not just becoming a copilot. What I'm also observing is the team experimenting with producing technical documents for the future state. The team's job then involves checking and updating missing info. So it's not perfect yet. It still needs human oversight. Mhmm. But I'm really thrilled to see it evolve from just task delegation to, in future, generative AI becoming a genuine thought partner in accelerating some of our M&As.

Fantastic. Yeah. I did not think about using a new technology just to relieve the M&A process itself. I was thinking more in terms of, you know, trying to create this new family, this merged family, from two different perspectives. But that's amazing insight. Thank you, Nisha. Donna, I wanted to ask you. I think Alexandra talked a little bit about KPIs and sort of metrics to use to test the effectiveness of change. And I've always struggled with, you know, well, you can measure, you know, software usage. That's an easy thing to measure. But how do you measure workflow automation or business process improvement or things like that, that are way more nebulous but probably more important, you know, in many ways?

Do you have some tricks up your sleeve or some best practices you've developed over the years to kind of effectively measure it and be able to communicate that, so you get the green light to do more of that? So have you got some lessons learned there?

Yeah. And I'll pick up a little bit on what Alexandra said, which was, you know, I think that when we think of innovation, a lot of times we think of just entrepreneurial, you know, try things out, you know, experiment, but good innovation requires process. And it does require a lot of forethought around how you're gonna measure, having a benefits framework, and really putting the effort in upfront, you know, looking at how can we measure this and having a plan for how you're going to measure it. So, for example, maybe you are measuring time. It can be as simple as, if you think about it, everybody's experimenting with Copilot. If you save five minutes a day by not having to search folders and just asking Copilot where something is, or having it create a graphic for you on your PowerPoint slide, you can equate that time to money really easily. Or another thing we look at, for example, as we look at digital adoption in my space and in design, as I look at the way in which we're gonna adopt more automated ways to do design, not only are we gonna look at saving time upfront once

Oops. It looks like we lost Donna mid sentence there. Hopefully, she can join us again. So, Alexandra, perhaps you can kinda chime in there because she was talking quite effectively about some metrics, and you were talking about it earlier. Or Nisha, if you wanna weigh in, please feel free to do so as well. I'm just curious. Same question. How do we, you know, develop effective KPI metrics and hold ourselves accountable to them?

Absolutely. I can pick up on on what Donna was saying starting from the fact that if you want to prove value of innovation, you need to measure something. Otherwise, it's, nice to have. And it's it's great that you have this thinking and that mindset of let's see what happens next. But most likely, in a, let's say, corporate setup, you will be asked, and what does it mean? And by that, you need to translate that into some sort of savings or into some sort of benefits for the future that ideally would be measurable. And Donna mentioned time or, you know, efficiency gains. It can also be, we can also we can also look at, process improvement, meaning we were doing this task in x amount of time.

Now we do it faster, or we figure out that we can replicate. We can change our whole process and do this more effectively by actually changing the whole step of, of of our operations, that we've done before. Right? So it it matters that we can we can, have metrics. Metrics, I think, depend on the use case. So I would definitely recommend that depending on, you know, whether you have a software solution or a hardware solution to work with the people that would become the end users or the main, I guess, the the main people that would benefit from that solution and ask them how would you rate this, experience of using the software or this, or this hardware based on its use.

And that means that you will be told, okay, if I use this solution, I am 20% more effective in this metric. There you have one of your kind of ways of measuring. I would definitely think that it's very much dependent on the use case. In robotics and automation, we very much work with very specific metrics in the industry that we operate in. So then that's really clear. We have numbers, and then we compare numbers to numbers. But, of course, you also have the safety element. You also have the efficiency gains. You also have the fact that now you know that the technology works, and that's anyway a win. You will be able to say this can work at scale because we've proven the assumption.

So those are kind of the the ways that you can look at the at the measuring success based on the criteria that fits every use case. And those, of course, can be dependent on the industry that you work in and, they will change. So it will be hard to say a few, but, but I think, I think we all kinda get the the the whole setup here that we can, we can use.

Yeah.

Nisha, I was curious, since you mentioned generative AI as a test case that you recently delved into, like, how is that measured? Like, how effective was it? Were you able to get to some numbers on that one? I'm just fascinated about it.

So, great question, Pratibha. And we were also figuring out how to measure this effectively. Right? So, like Donna and Alexandra mentioned, we had a metric, and working together with some of our external industry partners, we knew that every use of generative AI is equivalent to six minutes of productivity efficiency. So that became one basis of the math to identify the total productivity efficiency. So we have a number of licenses that are available to our organization. A good chunk of my team has those licenses. So when anyone uses, let's say, Copilot, we are automatically tracking, through our generative AI center of excellence, how much productivity efficiency is generated for the organization. So we have a dashboard. So we've heard quite often we have to measure everything that we do.

If you don't measure it, you forget it, and you don't really understand the value. So we have a running dashboard, which we monitor on a weekly and monthly basis, on how much productivity efficiency has been generated using innovative tools such as Copilot or generative AI. Here's an example. Right? Again, because of the volume of information that we sit on, we use generative AI to pull together information from multiple different documents to produce technical documents. If this was done by a person, the typical benchmark that we see within our organization, depending on the size and complexity, is roughly around four to six weeks. And if it's a major complex one, it can take months as well. But we just took a medium-sized program.

Every architecture document would take us around four to six weeks. Now we are able to kind of do that on a much shorter timescale. Yeah. So it's not only the productivity efficiency per use, but we have, let's say, hundreds of technical documents being produced. If you just, again, do the math, eight weeks or six weeks, multiplied across hundreds of different applications that we are bringing over, it would have taken us months, even years, to complete, whereas now we are able to do it in a shorter time. Now, is it perfect? No. It still requires human intervention. Diagrams have to be populated. You have to have controls and checks and balances in place to make sure it is not hallucinating and creating something that it should not. So human intervention is required, but the process, the grunt work, is taken away.

Yeah. Yeah. Fantastic. So, you know, a couple of closing questions that I have for each of you. One is around lessons learned. So we've all been around the block a little bit, twenty plus years. And I wish somebody would have told my younger self things when I was struggling with something. Like, change is hard. You know? And talking to people that are of a different, you know, background, or have a different kind of mindset, that is even harder. So I'm just curious, from your experiences, probably some battle scars along the way, what would you advise your younger self, what would have helped you kinda navigate this technology career path that you've been on?

Sure. I can start if that's okay, Pratibha.

Yes.

There's one thing that I really dislike, and the more time passes, the more I dislike it, and that is when you're in a meeting and someone says, can I ask a question? It might be stupid, but I wanna ask something. I don't think there's such a thing as a stupid question, because there's no way that you would be able to know everything. So ask the stupid question. Get that clarification that you need in order to have your thought process clarified, in order to gain more information, get smarter, know what to do next time. Ask the question, don't be shy, and don't call it stupid. Just say, I would like to follow up and make sure that I understand this, if you need to explain yourself. Otherwise, just say, I have a question.

This is the question. Go ahead. I definitely think that we can all relate; at least in my experience, I kind of felt like I had to explain that maybe I'm missing something. And the more I progress in my career, the less I enjoy that when I hear it from someone else. So I always say, it's okay. You can ask any question you want, because we want you to understand what's going on. And if there's something you don't get, it's important we are all on the same page. It's okay to ask the question. And I have two more things here that I put in my notes that I really wanted to highlight since we're all here at such a great conference. Don't self-reject. So if you feel like you wanna do something, don't be the person that doesn't try.

You know, raise your hand and say, I wanna try, even though I might not know exactly what to do. Be that person that wants to try it out and at least ask. Don't self-reject before you actually try. And then, bridge the gap. Something that I've noticed, in my experience, especially working in larger companies lately, is that everybody wants to stick to their place because they think, okay, this is my team, this is my project. You know, everybody kind of looks at their own set of projects. And what helped me in my career, and I would say now I'm on my fourth iteration of my career, is that I always bridge the gap. And whenever I see connections, I'm like, oh, it seems like we're going in the same direction here.

Shouldn't we connect the dots and try to move forward by maybe moving one of these projects in this area or learning from each other? And that always helped me. So, you know, be open to connecting the dots and and bridging that gap where you see that, potentially you understand something and you can connect other people, to have a better outcome for everyone. I think people really appreciate people that are proactive and want to want to do that.

Yeah. It certainly, you know, puts you in such a strong position of collaboration when you can see where the other people are also struggling with perhaps the same thing, or there is alignment that can be had, and there's benefit to going far together. Right? Fantastic, Alexandra. I love your answer, your response about asking the question. The only stupid question is the one not asked. Right? So very good. Nisha, and, Donna, glad you could make it back. We're talking a little bit about lessons learned. So perhaps after Nisha goes, you can close out our session with your thoughts. Nisha, you go first.

Yeah. I love what Alexandra said. So there are two that I can leave with you. From an organization perspective, we've talked a little bit about business cases. One thing that I would have told my younger self is it's not about how hard you work, but about the tangible benefits that you are producing for the organization. It's almost like you're a chef planning out a recipe before cooking a gourmet meal. Traditionally, what I would have done is I would have relied on intangible benefits, you know, risk reduction, customer satisfaction, which are all very valuable but hard to measure.

But lately, as a team, we've been asking, how can this help our profit and loss statement? So, again, very similar to the chefs, while still valuing the artistry of our dishes, we have to prioritize the cost effectiveness and profitability of the menus. Right? So keeping that in mind, the tangible portion, that's what's going to help not only yourself, but your team and your organization for future growth. And lastly, the one other key lesson: we talked about failures, all of you have spoken about failures, you know, embrace those failures because we need those failures as learning moments. Those are incredible learning opportunities. Every setback is a chance to grow and improve. Take it from a person who has had setbacks. So if I were to give any advice to my younger self, it would be to wear those scars as badges of honor and let them propel you and the organization towards better and more innovative work.

Yeah. Fantastic. Thank you, Nisha. Yeah. I would characterize that as, you know, keeping an eye out for what it means for the organization. It's not always about, you know, profit and loss, but aligning to organizational goals. Like, where does the organization wanna go? If they want to focus on safety or quality or risk reduction and your initiatives are aligned to those organizational goals, like, that is a winning kind of strategy, a winning approach. But, yeah, really, I love all your reflections. I especially love wearing your scars as badges of honor. Like, that is a good one. Donna, close us out, please, with your reflections. What would you tell your younger self about this journey you've been on?

Alright. Just to mic check, you can hear me as well? Yes. I adapted to change and found my phone link. Yeah. I love the food analogies. I'm a coach. That's a good one. You know, I think one of the biggest lessons learned: I used to spend a lot of time concentrating on working on change with the naysayers, the ones that didn't believe. Oh, if we can just get them to believe, we'll be able to make progress. Back to my very opening comment about the pace of change, things are moving too fast. Latch on to the people who are ready to go, and embrace them and partner with them, whether they're part of your organization or part of an outside organization. Build your ecosystem of partners, latch on to the people who wanna embrace change and move at pace, make progress, and get results.

The story will tell itself and success will follow. So I think, you know, if I could take back that time I spent trying to convince naysayers, that would be what I'd tell my younger self.

Yeah. I know. I mean, I have so many moments where I know I've gone, like, if only I tell them this one more thing. Like, if only I, you know, show this one more thing, and it's just... yeah. Sometimes it's okay. It's okay. They'll just be late adopters, and they'll get there, but at their own time, on their own pace. And so those are some really good insights. Well, I wanna thank you all again one more time. Really appreciate the conversation. Really good insights. I hope everyone, all of our listeners, benefited from y'all's experience and the way you've approached this, and sometimes...