According to the Electronic Frontier Foundation, a nonprofit organization defending civil liberties in the digital world, facial recognition software is prone to error, and it is especially unreliable when identifying people of color, women, and younger people. A “false negative” occurs when the software fails to match a person’s face to an image that is actually in the database; a “false positive” occurs when it matches the face to the wrong person. And for law enforcement, this can pose quite a problem. Since 2013, San Diego police officers have been able to stop and photograph a person of interest and run their face through the Tactical Identification System (TACIDS). That photograph could then be compared against more than a million booking photos.
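
To make the two error types concrete, here is a minimal, hypothetical sketch of how a threshold-based face-matching system can fail in each direction. The similarity scores, the 0.80 threshold, and the gallery names are all invented for illustration; real systems like TACIDS are far more complex.

```python
# Hypothetical sketch: a matcher compares a probe face against a gallery
# and reports the best match only if its score clears a threshold.

MATCH_THRESHOLD = 0.80  # assumed cutoff: scores below this count as "no match"

def identify(probe_scores):
    """Return the best-scoring gallery ID above the threshold, or None."""
    best_id, best_score = max(probe_scores.items(), key=lambda kv: kv[1])
    return best_id if best_score >= MATCH_THRESHOLD else None

# False negative: the person really is in the gallery, but every score
# falls below the threshold, so the system reports no match at all.
print(identify({"person_a": 0.62, "person_b": 0.55}))  # -> None

# False positive: the highest score belongs to the wrong person, so the
# system confidently identifies someone else.
print(identify({"person_a": 0.91, "person_b": 0.40}))  # -> "person_a"
```

Lowering the threshold reduces false negatives but produces more false positives, and vice versa, which is why error rates that differ across demographic groups matter so much in a policing context.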

Source: Zora on Medium