Gender bias in AI data collection often arises when datasets predominantly reflect traditional gender roles or stereotypes. For example, images labeled as "nurse" may overwhelmingly feature women, while "engineer" images might mostly show men. Models trained on such skewed data learn these associations and reproduce them in their predictions, reinforcing harmful stereotypes.
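One way to surface this kind of skew before training is to audit how often each gender annotation co-occurs with each occupation label. The sketch below is a minimal illustration of such an audit; the record structure and the `label` / `perceived_gender` field names are assumptions for the example, not part of any specific dataset.

```python
from collections import Counter, defaultdict

# Hypothetical image-metadata records; in practice these would be loaded
# from a dataset's annotation file (field names here are illustrative).
records = [
    {"label": "nurse", "perceived_gender": "female"},
    {"label": "nurse", "perceived_gender": "female"},
    {"label": "nurse", "perceived_gender": "male"},
    {"label": "engineer", "perceived_gender": "male"},
    {"label": "engineer", "perceived_gender": "male"},
    {"label": "engineer", "perceived_gender": "female"},
]

# Count gender annotations per occupation label.
counts = defaultdict(Counter)
for record in records:
    counts[record["label"]][record["perceived_gender"]] += 1

# Report the share of each gender within each label to surface skew.
for label, gender_counts in counts.items():
    total = sum(gender_counts.values())
    shares = {g: round(n / total, 2) for g, n in gender_counts.items()}
    print(label, shares)
```

Labels whose gender shares deviate sharply from the population of interest are candidates for rebalancing or additional data collection.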