Women in Tech Conference

9-12 May 2023
Virtual & Hybrid

WOMEN IN TECH GLOBAL CONFERENCE 2023

Christine Phan

Technology Policy Fellow at The Greenlining Institute



"Algorithmic Greenlining: The Case for Race Aware Algorithms"

Wednesday, May 10, 2023, 12:30 PM – 1:10 PM EDT (America/New_York) · #WTGC2023

"Algorithmic Greenlining: The Case for Race Aware Algorithms"
#WTGC2023

"Algorithmic Greenlining: The Case for Race Aware Algorithms"
https://www.womentech.net/vfairs
https://www.womentech.net/vfairs
Reserve Your Spot




Session: Algorithmic Greenlining: The Case for Race Aware Algorithms

As algorithms have come to make decisions at all levels of society, definitions and explorations of algorithmic bias have deepened over the past decade. Algorithms and artificial intelligence are now used to automate decisions with major consequences in employment, credit and lending, housing, healthcare, and more.

Contrary to popular belief, however, algorithms can still create and perpetuate long-standing biases and harms against groups of people, including those with protected-class status such as race and gender. The impact of algorithms on protected classes needs to be assessed to prevent the perpetuation of discrimination. Yet when we incorporate race as a factor in analyzing or responding to these biases, we are treating people differently on the basis of a protected class, a civil rights violation known as disparate treatment.

To respond effectively to these challenges, we need to incorporate a race-aware framework into how we analyze and respond to data. This workshop will explore the precedents set in developing a race-aware framework and how algorithmic auditing and review can be applied within it. It will draw on the knowledge of participants from a wide variety of fields that intersect with tech, including education, health, and more. We will map out different perspectives through structured activities with Jamboard, word clouds, and other interactive tools. These activities will include 1) a prompt-and-respond exercise in which participants name key challenges in considering race as part of reviewing algorithms, and 2) an exercise in which participants identify and match new technologies under development with the corresponding industry precedents (for instance, pairing an education algorithm that predicts grades with the ways the education system has historically discriminated against BIPOC students).
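To make the auditing idea above concrete, here is a minimal sketch of one common check, the disparate-impact ratio (the "four-fifths rule" guideline from U.S. employment law), applied to a hypothetical model's decisions. The data, group labels, and function name are illustrative assumptions, not material from the session.

```python
# Minimal sketch of a disparate-impact audit over a model's decisions.
# Hypothetical data and group labels; the 0.8 threshold reflects the
# "four-fifths rule" guideline from U.S. employment discrimination law.
from collections import defaultdict

def disparate_impact(decisions, groups, threshold=0.8):
    """decisions: iterable of 0/1 outcomes; groups: parallel group labels."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for d, g in zip(decisions, groups):
        total[g] += 1
        selected[g] += d
    rates = {g: selected[g] / total[g] for g in total}         # selection rate per group
    reference = max(rates.values())                             # most-favored group's rate
    ratios = {g: r / reference for g, r in rates.items()}      # disparate-impact ratios
    flagged = [g for g, r in ratios.items() if r < threshold]  # groups below the guideline
    return rates, ratios, flagged

# Hypothetical audit: a hiring model's accept/reject decisions by group.
decisions = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
print(disparate_impact(decisions, groups))
# ({'A': 0.6, 'B': 0.2}, {'A': 1.0, 'B': 0.333...}, ['B'])
```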




Key Takeaways

  • Identify how the civil rights precedents of disparate impact and disparate treatment affect algorithmic bias and regulation.
  • Understand the need for algorithms that are race-aware at the evaluation step of development (how well an algorithm performs on certain subsets of data); see the sketch after this list.
  • Synthesize precedents around bias identification and mitigation in other sectors and policy areas, including education, healthcare, housing, and employment, and how they can be applied to algorithms in these areas.
  • Connect individuals to organizations doing ongoing AI work in regulation and documentation.
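Race-aware evaluation, as in the second takeaway, is essentially reporting a model's performance disaggregated by group rather than only in aggregate. A minimal sketch under that reading, with hypothetical names and data:

```python
# Minimal sketch of disaggregated ("race-aware") model evaluation: report
# accuracy and false-negative rate per group instead of only overall.
# All names and numbers are hypothetical.
from collections import defaultdict

def evaluate_by_group(y_true, y_pred, groups):
    stats = defaultdict(lambda: {"n": 0, "correct": 0, "pos": 0, "fn": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        s = stats[g]
        s["n"] += 1
        s["correct"] += int(t == p)
        s["pos"] += int(t == 1)
        s["fn"] += int(t == 1 and p == 0)
    return {
        g: {
            "accuracy": s["correct"] / s["n"],
            "false_negative_rate": (s["fn"] / s["pos"]) if s["pos"] else None,
        }
        for g, s in stats.items()
    }

# A model can look fine overall while failing one subgroup:
y_true = [1, 1, 0, 0, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(evaluate_by_group(y_true, y_pred, groups))
# Group A: accuracy 1.0, FNR 0.0; group B: accuracy 0.5, FNR 1.0
```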


Bio

Christine Phan (she/her) is a Vietnamese American from University Place, Washington. She is a Technology Equity Fellow at the Greenlining Institute, where she addresses how the digital divide and algorithmic bias impact communities of color. She works on building algorithmic accountability and on Greenlining’s Town Link program, which partners with Oakland community organizations to provide digital literacy programs and address gaps in broadband affordability.

Previously, Christine worked on dis/misinformation, Census outreach, coalition building, and civic engagement in Asian American spaces. She is passionate about envisioning what community safety and resilience look like for refugee and immigrant communities.

