Apple Card and five other tech tools accused of gender bias
Apple became the latest tech giant to face criticism when customers of its new credit card service accused it of giving men higher credit limits than women
From recruitment software that favours male applicants to facial recognition technology that fails to recognize transgender people, a growing number of artificial intelligence (AI) programmes have been accused of reflecting human gender bias.
Apple became the latest tech giant to face criticism last week when customers of its new credit card service, including company co-founder Steve Wozniak, said it appeared to give men higher credit limits than women.
Here are five other tech tools that have been accused of gender discrimination:
Facebook Ads
A US study this year found Facebook's algorithms, which match housing and job ads with viewers, leaned on stereotypes. Ads for jobs in the lumber industry went mostly to white men, while secretary positions were mostly directed at black women, according to the study.
Amazon's Recruiting Tool
Amazon scrapped an experimental automated recruiting engine, which used AI to give job candidates scores ranging from one to five stars, after finding it did not like women. Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company. But as most of those resumes came from men, reflecting male dominance across the tech industry, the system taught itself that male candidates were preferable.
Digital Assistants
A United Nations report this year said popular digital assistants such as Apple's Siri, Amazon's Alexa and Microsoft's Cortana reinforced sexist stereotypes and normalized sexist harassment. Styled as female helpers, most voice assistants were programmed to be submissive and servile, including politely responding to insults.
Facial Recognition
Facial recognition technology struggles to recognize transgender people and those who do not define themselves as male or female, according to an October study by the University of Colorado Boulder in the United States. Researchers tested facial recognition systems from IBM, Amazon, Microsoft and Clarifai on photographs of trans men and found they were misidentified as women 38% of the time.
Google Images
A 2015 University of Washington study found women were slightly underrepresented in Google Images search results for most jobs and significantly underrepresented for some of them, including CEO. The researchers said the issue could have a negative impact on people's perceptions, reinforcing bias and preconceptions.