Can you predict who is a murderer just by looking at their face? What about a pedophile? A software company now says it can, claiming it is able to identify terrorists purely by their facial features. Turning the old idiom that "you can't judge a book by its cover" on its head, the two-year-old company claims its artificial intelligence algorithms can look at a face and tell whether that person is likely to be a terrorist, a pedophile, and, wait for it… a professional poker player.
The bold claims are made by Israeli start-up Faception, which boasts that its computer-vision and machine-learning technology can analyze a person's facial image and automatically generate a personality profile. Claiming the technology will enable security companies to detect and apprehend terrorists and criminals before they have the opportunity to do harm, the company has already signed a contract with the Department of Homeland Security, according to the Washington Post. The Mirror reports that Faception's technology correctly identified 9 of the 11 jihadists involved in the Paris massacre with no prior information about their involvement.
“We understand the human much better than other humans understand each other,” says Faception chief executive Shai Gilboa. “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.”
Faception uses 15 classifiers based on facial cues it says are undetectable to the human eye, including extrovert, genius, professional poker player, pedophile, and terrorist. Each classifier allegedly represents a persona with a distinct personality type or collection of traits and behaviors. Algorithms then score an individual according to how well their face fits each classifier.
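Faception has not disclosed how its scoring works, but the description above — a facial feature vector compared against a set of classifier "personas" — can be sketched roughly like this. Every name, feature, and number below is an invented illustration, not the company's actual method:

```python
# Hypothetical sketch of scoring a face against classifier "personas".
# Feature vectors, classifier names, and values are all made up for
# illustration; Faception's real system is proprietary and undisclosed.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

# Invented prototype vectors over three hypothetical facial features.
CLASSIFIERS = {
    "extrovert": [0.9, 0.1, 0.3],
    "professional poker player": [0.2, 0.8, 0.5],
}

def score_face(features):
    """Return a fit score for each classifier for one face."""
    return {name: cosine_similarity(features, proto)
            for name, proto in CLASSIFIERS.items()}

scores = score_face([0.85, 0.15, 0.35])
for name, score in scores.items():
    print(f"{name}: {score:.3f}")
```

The point of the sketch is only that "scoring against a classifier" reduces to measuring how close a face's features sit to a prototype — which says nothing about whether those prototypes mean anything.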
The startup's claims were put to the test at a poker tournament, where the technology correctly picked the four strongest players from a field of 50 amateurs.
That said, Gilboa has admitted a worrying gap in accuracy: the system is right only 80% of the time. Put plainly, one judgment in five is wrong, meaning innocent people could be branded pedophiles or terrorists.
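An 80% accuracy figure is even worse than it sounds once base rates enter the picture. The numbers below are illustrative assumptions, not Faception data: suppose the system is right 80% of the time for both classes, and 100 actual terrorists are hidden in a screened population of one million:

```python
# Base-rate arithmetic behind an "80% accurate" screening tool.
# All figures are illustrative assumptions, not Faception data.
population = 1_000_000
actual_positives = 100   # assumed terrorists in the crowd
accuracy = 0.80          # assumed for both detection and rejection

true_positives = actual_positives * accuracy
false_positives = (population - actual_positives) * (1 - accuracy)

# Of everyone the system flags, what fraction are actual terrorists?
precision = true_positives / (true_positives + false_positives)

print(f"innocent people flagged: {false_positives:,.0f}")
print(f"chance a flagged person is a terrorist: {precision:.4%}")
```

Under these assumptions the system flags nearly 200,000 innocent people to catch 80 terrorists, so a flagged person has roughly a 0.04% chance of being guilty. Rare targets make false positives swamp true ones.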
Unsurprisingly, experts have raised ethical concerns about the technology. Pedro Domingos, a professor of computer science at the University of Washington, told the Washington Post that the evidence for the accuracy of such judgments is extremely weak, while Princeton psychology professor Alexander Todorov observed that Faception arrives "[j]ust when we thought that physiognomy ended 100 years ago."
The new technology raises concerns that relying on it will take us down a dark route that promotes dubious preconceptions of who and what constitutes a terrorist. If the creepy profiling really is accurate, however, perhaps the first places it should be rolled out are in the corridors of power and the film industry.