
Not Only Does Facial Recognition Software Often Get Racial ID Wrong, a New Study Finds It Misgenders Trans and Nonbinary Folks Almost All the Time

Nonbinary and transgender identities are increasingly being accepted and normalized throughout society, but not when it comes to high-tech facial recognition software developed by some of the largest tech firms in the world.
According to Forbes, a recent study from the University of Colorado Boulder found that facial recognition software misidentifies transgender people about a third of the time and gets it wrong every time for nonbinary people, those who identify as neither male nor female.
The issue is that while society’s vocabulary for identifying and labeling gender keeps expanding, these computer programs remain binary by design.
“Training a system to recognize gender beyond the binary breaks the purpose of the system,” Morgan Klaus Scheuerman, lead author of the study, told the Daily Camera. “If someone wants to include nonbinary identities in their algorithm, the problem becomes that nonbinary people look like any other people, so the system won’t know how to classify anyone.”
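Scheuerman’s point is structural, not just cultural, and a minimal sketch makes it concrete. The toy Python below is a hypothetical illustration, not any vendor’s actual system; the label set, weights, and function names are all assumptions. It shows a classifier whose final layer scores exactly two labels, so by construction every face it sees must come out as “male” or “female,” and there is no output the model could produce that means “neither.”

```python
# A minimal sketch (not any real vendor's system) of why a binary
# gender classifier structurally cannot output a nonbinary result.
# The label set, weights, and function names are hypothetical.

import numpy as np

LABELS = ["male", "female"]  # the classifier's entire output space

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw scores into probabilities that sum to 1."""
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def classify_gender(face_embedding: np.ndarray, weights: np.ndarray) -> str:
    """Project a face embedding onto exactly two class scores.

    Because the weight matrix has two rows, every input is forced
    into one of the two labels; the model has no way to answer
    "neither" without redesigning the label space itself.
    """
    logits = weights @ face_embedding        # shape: (2,)
    probs = softmax(logits)
    return LABELS[int(np.argmax(probs))]     # always "male" or "female"

# Toy usage: a random embedding and random weights still yield a
# confident-looking binary answer, illustrating the structural problem.
rng = np.random.default_rng(0)
embedding = rng.normal(size=128)             # stand-in for a face embedding
weights = rng.normal(size=(2, 128))          # one row of weights per label
print(classify_gender(embedding, weights))   # prints "male" or "female"
```

And as Scheuerman’s quote suggests, simply appending a third “nonbinary” label would not fix this, because there is no shared visual signature for such a class to learn.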
Scheuerman noted that the systems typically rely on “contextual labels,” such as whether someone is wearing a dress or has long hair, to determine gender, an indication that “traditional concepts of gender are ingrained in facial recognition algorithms,” as the Daily Camera explains.
And so far, according to Scheuerman, the most common response from tech companies, when they respond at all, has been to remove gender classification from their facial recognition algorithms entirely rather than try to teach the systems nonbinary concepts of gender.
Said Scheuerman’s co-author Jed Brubaker in a statement, per Forbes:
“We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender. Bottom line: What we found is that the computer vision systems that run all of our facial detection and facial analysis do not handle the level of gender diversity that we live with every day.”
Of course, as the Daily Camera notes, removing all binary gender classification from such systems is not necessarily prudent, given its usefulness in certain urgent tasks like locating a missing child.
But the slow pace at which such crucial technology is adapting to gender-nonconforming identities is frustrating for the LGBTQIA+ community, Mardi Moore, executive director of Colorado’s Out Boulder County organization, told the Daily Camera.
“This kind of stereotyping flies in the face of what we know to be true about humans,” Moore said. “It doesn’t reflect the real world.”
