The Accuracy of Facial Recognition is Questionable
The New York Times published an interesting article about how race and gender can cause facial recognition software to be inaccurate.
Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph.
When the person in the photo is a white man, the software is right 99 percent of the time.
But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology performs on people of different races and genders.
These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition.
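The disparity the study reports is, at bottom, an error rate computed separately for each demographic group. As a minimal sketch — using made-up sample records, not the study's actual data — such a per-group audit might look like this:

```python
# Illustrative only: hypothetical records of a gender classifier's output,
# grouped by demographic. The sample values below are invented.
from collections import defaultdict

# Each record: (demographic_group, true_label, predicted_label).
records = [
    ("lighter-skinned man", "male", "male"),
    ("lighter-skinned man", "male", "male"),
    ("darker-skinned woman", "female", "male"),
    ("darker-skinned woman", "female", "female"),
]

def error_rates(records):
    """Return the fraction of misclassified samples per demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if prediction != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

print(error_rates(records))
```

Comparing the resulting per-group rates, rather than a single overall accuracy figure, is what exposes the kind of skew the study describes.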
Read more here.