DE&I Conscious Technology

Aparna Shah
Account Manager
Automatic Summary

An Introduction to Ethical Tech: Combating Biases with Inclusive and Conscious Technology

Hello, everyone. I'm Aparna, an account manager by profession and an ally by choice. Today, I will be taking you through various aspects of ethical technology.

Unpacking Ethical Tech

Ethical technology is an expansive topic. Essentially, it looks at the impact of tech on society and aims to engage technologists with its ethical dimensions. Ethical tech is a conversation: an ongoing discussion focused on the relationship between technology and human values, and the choices we make about technology's impact on people.

By combating biases, both conscious and unconscious, ethical tech can have a transformative impact on our daily lives. This is especially relevant in workplaces, where individuals of all genders, races, ethnicities, disability statuses, and sexual orientations need to work collaboratively towards achieving an organization's goals without prejudice.

Diversity and Inclusion in Technology Implementation: Use Cases

  • Facial recognition software incorrectly reading the faces of women or ethnically diverse individuals, due to a homogeneous design team and a non-diverse data set used for machine learning.
  • The iPhone 6, initially designed only for right-handed users, due to a lack of diversity in user perspectives on the design team.
  • Recruitment AI algorithms, intended to improve diversity in hiring, that ended up aggravating the problem due to unchecked biases in algorithm development and training.

These examples could have been mitigated by expanding the conversation and decision points to include a more diverse design team or a broader data set.
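To make the recruitment example concrete, here is a minimal, hypothetical sketch of the failure mode: a model that simply learns to prefer the majority profile among past hires will reproduce historical skew regardless of candidate skill. All profile labels and counts are invented for illustration.

```python
# Toy illustration (hypothetical data): a "hiring model" that learns
# which profile past hires had will inherit any skew in that history.
from collections import Counter

def train_lookalike_model(past_hires):
    """Learn the most common profile among past hires and prefer it."""
    profiles = Counter(h["profile"] for h in past_hires)
    majority_profile, _ = profiles.most_common(1)[0]
    return lambda candidate: candidate["profile"] == majority_profile

# Historical hires are 80% profile "A" -- the model inherits that skew.
history = [{"profile": "A"}] * 8 + [{"profile": "B"}] * 2
model = train_lookalike_model(history)

candidates = [{"profile": "A"}, {"profile": "B"}]
shortlist = [c for c in candidates if model(c)]
print(shortlist)  # only profile "A" survives, regardless of skill
```

The fix suggested above maps directly onto this sketch: a broader training history (or a diverse team questioning what "successful profile" means) changes what the model learns.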

Freedom vs Surveillance: Technology's Double-Edged Sword

While the growing usage of AI- and machine learning-powered surveillance technologies in sectors like education and healthcare has yielded tangible benefits, it raises significant concerns around privacy, bias in surveillance targeting, and the normalization of comprehensive surveillance. These challenges call for thoughtful discussion of the potential implications for students, patients, workers, and society at large.

Addressing the Ethical Tech Conundrum: Current Industry Approaches

Many organizations are starting to focus on ethical tech. Approaches include forming tech, data, or AI ethics boards, appointing C-level executives to oversee the area, conducting internal research on ethics, and identifying it as a high priority.

Some sectors, especially higher education, are designing curricula to better address and embed ethics pertaining to technology and AI. Simultaneously, varying degrees of regulation and legislation are being considered at all levels, from global to local.

Building Ethical Tech Practices: A Roadmap

While there's no standard roadmap for building ethical tech practices, organizations are aiming to develop and operate ethical technology solutions. This means giving professionals the skills to identify, debate, and mitigate ethical tech risks, creating an internal panel for addressing high visibility issues, and constantly assessing progress and adapting to create sustainable practices.

Key points in this roadmap include transparency, clear communication, proactive identification of false reporting, simplicity, efficient governance systems, and swift processes.

In conclusion, ethical tech is more than just a philosophy; it's an ongoing quest to make technology serve the best interests of all users. By promoting diverse perspectives and countering biases, we can shape a more inclusive and ethical technological landscape. Thank you for your time and participation in this important discussion.


Video Transcription

Hello, everyone. I'm Aparna, an account manager by profession and an ally by choice. In the next 20 minutes, I will be taking you through various aspects of DE&I conscious technology, a.k.a. ethical tech. Before we dive in, I'll give you a quick intro of what the session is going to look like. I will take you through the concepts of ethical tech, why it is important to have an inclusive and diverse workforce, and some influential common practices that have been making an impact. I will also share a few use cases to get into the practical application of ethical tech, and last but not least, a few frameworks to help organizations effectively build a culture of D&I through technology. And if time permits, we can have a round of Q&A as well. So let's dive in. What is ethical technology? Ethics and technology are each expansive topics, so let's break them down into small chunks and dive deeper into each. Let's start with ethics, which is far more complex than a set of rules or a checklist to be followed. We all make ethical decisions every day, even when we're not aware of it.

And we bring our own experiences and assumptions to those decisions. Then there's technology. Technology has the potential to help address biases and to create a more inclusive workplace. The word may bring to your mind AI, social media, and even self-driving cars, but technology is a far older, broader, and deeper topic than all of those. It is hardware and software; it is design and implementation; it has its intended usage and users, and it has actual effects and impacts. Technology has pervaded and transformed just about every aspect of our personal and professional lives. Most of us use technology every day in some way, shape, or form; we use it, and it impacts others. While as humans we may prefer individuals who think or even look like us, technology, on the other hand, can keep us focused on the true value that professionals bring. It can be used to detect conscious and unconscious biases in the workplace, which brings me to the ethics of technology, or what we refer to as ethical technology. So what is ethical tech? Is it a philosophy that looks at the impact of tech on society and attempts to get technologists engaged with its ethical dimensions, or a list of requirements to follow when designing and implementing new technologies? The answer to all of these questions is yes.

Ethical tech at its core is a conversation: an ongoing discussion focused on the relationship between technology and human values, and the choices we make about technology and its impact on people. Ethical tech combats biases, both conscious and unconscious, that have a significant impact on our daily lives, affecting everything from the way we perceive things to the way we make decisions at the workplace. It becomes even more significant when individuals of all genders, races, ethnicities, disability statuses, and sexual orientations need to collaborate towards achieving an organization's goals without prejudice. A work culture where employees are at center stage welcomes new ideas and perspectives.

If I have to summarize this in one line: ethical tech conversations are the means by which we practice ethical decision-making behavior and ultimately reach ethical decisions. Moving on and diving in deeper, building DE&I conscious technology gives an organization access to different skill sets, points of view, and experiences, which can result in the organization generating more innovative products and services. Think about the early challenges with facial recognition, when the software inaccurately read the faces of women or racially and ethnically diverse people. If you look back at the design team of that software, you will find that the teams themselves were pretty homogeneous, with few Black or brown designers to offer their perspective. And the data used to teach the machine learning necessary for recognition wasn't very diverse either. There are so many examples like this, from the iPhone 6, which was initially designed by and therefore meant for right-handed users, to recruitment AI algorithms intended to improve diversity in hiring that ended up aggravating the problem.

These examples could have been mitigated by expanding the conversation and the key decision points to include a more diverse design team or a broader data set. From there, we have come to a stage where enterprise collaboration tools such as Zoom and Microsoft Teams can help enhance communication, with speech-to-text translation and transcription capabilities for those with auditory disabilities and for professionals who may not natively speak the same business language; or assistive software technologies that support the differently abled, such as the visually impaired, through equal access via screen readers, special keyboards, and handheld navigation software.

So these are a few examples of how technology intersects with diversity and inclusion. I'll take this conversation a notch higher by sharing a few examples where the unintended implications of technology can be harmful. Now, how can technology be harmful, you may ask? Organizations across industries are increasingly using AI- and ML-powered tools to sort through candidates more effectively and gain an edge in the battle for talent. This undoubtedly saves the organization some time in scanning through piles of resumes to identify potential candidates. The thinking was that technology is unbiased and more efficient than humans, and could look past individual biases to see whether candidates really had the skills to be successful in a particular role or organization. However, as you probably know by now, that thinking was flawed, because the technology was created by people who may unconsciously put their own biases into its development, and the technology was trained on a biased data set. If a hiring algorithm is created to identify a successful individual based on the current leaders of the organization, then by and large you will get candidates that look like the current leaders of the company.

If that makeup doesn't reflect society at large, then the algorithm will disadvantage women and people of color. Let's look at another example, maybe from another industry: schools. The market for technologies that monitor the communications of millions of students in the US has grown rapidly.

Many schools have embraced digital surveillance to monitor students in order to manage safety concerns. While physical security measures can prevent threats at the school gate, digital monitoring provides critical insights into behaviors and activities that may otherwise go unnoticed.

Common technology solutions use a combination of in-house AI and human content moderators. How do they do it? They may offer free, automated, 24-hour-a-day surveillance of what students write in school email, shared documents, and chat messages. This often gets plugged into the schools' email servers, and the schools can review links and notifications surfacing from social media accounts that are linked to school email IDs. The technology flags concerning phrases and sends notifications to officials. The schools currently using this monitoring technology claim it has allowed them to reduce student-on-student violence, intercept illicit drugs on school grounds, intervene with suicidal students, and reduce online bullying. What could be bad about this, right?
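As a rough illustration of the flagging step such systems perform, here is a minimal sketch. The phrase list and messages are invented for this example, and real products combine models with human moderators rather than a literal watch list.

```python
# Minimal sketch of the flagging step in a monitoring pipeline.
# The phrase list and messages are hypothetical; flagged items would
# go to a human reviewer, not trigger action automatically.
CONCERNING_PHRASES = ["hurt myself", "bring a weapon"]

def flag_messages(messages):
    """Return (sender, phrase) pairs for any message containing a
    phrase on the watch list."""
    flags = []
    for sender, text in messages:
        lowered = text.lower()
        for phrase in CONCERNING_PHRASES:
            if phrase in lowered:
                flags.append((sender, phrase))
    return flags

inbox = [("student1", "See you at practice"),
         ("student2", "I want to hurt myself")]
print(flag_messages(inbox))  # [('student2', 'hurt myself')]
```

A literal phrase match like this also shows why flag rates can be uneven: the watch list encodes one particular way of phrasing things, so groups whose vernacular differs from it may be over- or under-flagged.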

There could be harms associated with pervasive surveillance of students. Is it a good thing to normalize surveillance of minors? Often, when these technologies are applied in school districts, there is no option to opt out, leaving major privacy considerations not just for students but also for school employees, and even for outside actors who may be communicating digitally with the monitored students and employees.

Finally, depending on how the technology is trained, there's a potential that it would not work equally well on all students. Here's some food for thought: would certain groups' vernacular prompt more flags than others'? Who becomes responsible when the technology doesn't catch the signs of self-harm or violence? Would this just make minors better at hiding their intentions to game the system? While you ponder those, let's cover one more example that might bring this closer to home: the healthcare industry. Concurrent advances in sensing technology, computer vision, and machine learning have enabled the development of ambient intelligence. What is ambient intelligence?

Basically, it is the ability to unassumingly monitor and understand actions in the physical environment. As such, ambient intelligence is increasingly finding space in healthcare settings, because it involves using sensors, capturing video, collecting data, and analyzing that data in real time.

Unlike other monitoring technologies, this technology not only captures video footage, it also interprets that footage. For example, a nurse can make sure that patients are moving around as instructed to speed up their recovery process. It can also help reinforce hand-washing and sanitization protocols. While these technologies have great potential to improve patient care, they also prompt the healthcare industry to engage in thoughtful discussion on the potential implications for patients, visitors, and hospital workers.
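To picture what that "interpretation" layer might do, here is a minimal sketch, assuming some upstream model has already turned video into discrete events. All names, event labels, and the protocol rule are invented for illustration.

```python
# Hypothetical sketch of an ambient-intelligence interpretation layer:
# given a time-ordered stream of events extracted from video by an
# upstream model, check a simple hand-hygiene protocol.
def hygiene_violations(events):
    """events: list of (person, action) pairs in time order.
    Flag anyone who reaches a patient without a fresh hand wash."""
    washed = set()
    violations = []
    for person, action in events:
        if action == "washed_hands":
            washed.add(person)
        elif action == "touched_patient":
            if person not in washed:
                violations.append(person)
            washed.discard(person)  # require a fresh wash per contact
    return violations

stream = [("nurse_a", "washed_hands"),
          ("nurse_a", "touched_patient"),
          ("nurse_b", "touched_patient")]
print(hygiene_violations(stream))  # ['nurse_b']
```

Even this toy version makes the privacy trade-off visible: enforcing the protocol requires continuously tracking identified individuals, which is precisely what the ethical discussion is about.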

So what are the ethical implications that we're talking about here? Hospitals are places where patients willingly surrender some of their privacy because the benefits warrant this trade-off. However, would this be considered too much of an invasion of the privacy of the patients, visitors, and hospital staff who would be captured in the monitoring? What do you think? Let me know in the chat. Now that we have covered a few examples from various industries, and the harmful effects technology might have, let's take a step back and talk a little bit more about what organizations are doing to address ethics in technology. There's a clear case for why business needs to focus on ethical tech, and plenty of examples of how they are doing it. But it's not just the business world. In fact, across all sectors and industries, organizations are responding to ethical tech in varied and evolving ways. To start with, many are making commitments to ethical tech, whether it's forming tech, data, or AI ethics boards, appointing C-level executives to lead the way, doing internal research on ethics, or simply identifying it as a high priority.

In addition, companies are extending the mission of existing functions like compliance and ethics, learning and development, and inclusion to include ethical tech. Some sectors, especially higher education, are responding to ethical tech by design: a number of schools have developed frameworks, courses, and curricula for better addressing and embedding ethics related to general technology, AI, and especially computer science and engineering.

Businesses and corporations are beginning to do something similar by leveraging their internal learning and development capabilities. Coming to the fourth pillar, governance: from the governing angle, varying degrees of regulation and legislation are in place and under consideration at all levels, from global to local, and codes of ethics for the associated trades and industries are being revised and developed. And while the meaning of ethical tech may differ among different places and cultures, there is a strong global movement towards a common set of ethical principles, particularly as they relate to AI and our data. Still, there remains little consensus as to what any of these principles might look like in actual policy. And finally, companies, universities, governments, and nonprofits are all collaborating to share knowledge, develop curricula, and define and redefine careers.

Take groups like the Partnership on AI, founded in 2017 by Google DeepMind, Facebook, Amazon, Microsoft, and Apple, which now accommodates nearly 90 groups. Public and private consortia are also emerging to bring diversity of experience and perspective to bear on tackling these new challenges: how to set up for the future, and how to recognize and deal with the ethical risks we might encounter, since this is a broadly new topic that organizations are only beginning to adopt.

OK. So we should aspire to have all systems designed and developed using trustworthy, ethical technology decision making. However, building awareness of ethical risk by itself is not enough; we need a coordinated effort to build ethical technology practices into our work.

There is no standard roadmap for this, and there aren't any plug-and-play solutions waiting along the way that can simply be lifted and shifted. That is why the focus should be on making sure that the technologies we develop and operate, both internally, like your HR systems, and those developed and deployed for clients, are designed, developed, and operated using ethical technology decision making. So how do we do that?

First and foremost, by helping our people understand their role in promoting ethical tech and giving them the skills to identify, debate, and mitigate ethical tech risks. Some people may already see their role in the conversation, while others might need a concrete example of why and how this applies to them. We can approach this challenge from several directions. We can look outside our doors and seek guidance from leaders in the space, such as thought leaders, research groups, and industry. We can adopt a framework for ethical tech decision making to give our professionals a consistent approach to identifying and debating any ethical tech risk they foresee or have encountered. And we can create learning resources and experiences to help people build a muscle memory of the questions they should ask and consider while making any ethical tech decision.

But we should also realize that people need to understand what's expected of them. That's why we should work on establishing checkpoints, triggers, and standards for ethical tech that everyone knows and is expected to follow, in addition to establishing a review panel to consult on difficult or high-visibility issues and aligning with the risk, compliance, and legal functions.

So I think everything comes together when it comes to ethical decision making: it's not just an individual, it has to be a larger team that is involved and consulted, and a best-practice module has to be created. Finally, take a critical eye to technology, because technology, as we have seen, is an enabler that we use to surface ethical tech challenges and evaluate how they can be addressed. This isn't a simple exercise. It's not a one-time effort; it's iterative. We must constantly assess our progress over time and adapt to create a sustainable practice that mitigates the harms that could be created by technology, perhaps by having a council review every technology that is developed: is it accessible, does it have privacy protocols, is it trustworthy? Everything has to be checked and reviewed with a very critical eye when thinking about ethics and technology. This brings me to the end of my presentation, where I will just share a few mitigation recommendations summarizing what we have spoken about earlier: be transparent, communicate clearly, determine how false reporting can be identified, be clear and simple, have a governance system, and keep your processes swift.
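The council review described above can be thought of as a simple gate on every proposed technology. Here is a hypothetical sketch; the criterion names echo the ones mentioned, and the proposal structure is invented for illustration.

```python
# Hypothetical review gate: a council checks each proposed technology
# against a fixed set of criteria before it ships. The criteria here
# mirror those mentioned in the talk; real reviews are human-led.
REQUIRED_CHECKS = ("accessible", "privacy_protocols", "trustworthy")

def review(tech):
    """Return the list of checks a proposed technology still fails."""
    return [check for check in REQUIRED_CHECKS if not tech.get(check, False)]

proposal = {"name": "chat-monitor", "accessible": True,
            "privacy_protocols": False, "trustworthy": True}
print(review(proposal))  # ['privacy_protocols'] -- must fix before launch
```

The point of framing it this way is that the review is iterative: a non-empty result sends the technology back for rework rather than blocking it permanently.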

Thank you so much for your time; it was great interacting with you and delivering this session. Thank you.