The company has already signed a contract with a homeland security agency.
A tech startup in Israel, Faception, claims that its artificial intelligence (AI) algorithms can look at a person’s face and pinpoint personality traits invisible to the naked eye — in fact, it claims it can tell which people are most likely to be terrorists, pedophiles, poker players, or white-collar criminals.
The company was founded in 2014, and the research team includes experts in computer vision, face analysis, machine learning, psychology, and technology.
“We understand the human much better than other humans understand each other,” Faception chief executive and chief ethics officer, Shai Gilboa, told Matt McFarland at the Washington Post. “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.”
Faception has built 15 different classifiers, which “represent a certain persona, with a unique personality type, a collection of personality traits or behaviors,” the creators write on the Faception website. “Our algorithms can score an individual according to their fit to these classifiers.”
Gilboa says the algorithm works with 80 percent accuracy. While that number sounds high, it still means roughly 1 in 5 people could be incorrectly flagged as a terrorist or pedophile.
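The problem is actually worse than "1 in 5" suggests when the trait being screened for is rare. A quick back-of-the-envelope calculation shows why — the population size and prevalence below are purely illustrative assumptions, not figures from Faception:

```python
# Base-rate arithmetic: why "80 percent accuracy" misleads for rare traits.
# All numbers here are illustrative assumptions, not Faception's figures.

population = 1_000_000   # people screened (assumed)
prevalence = 100         # actual positives in that population (assumed)
accuracy = 0.80          # assume 80% true-positive AND true-negative rate

true_positives = prevalence * accuracy                         # correctly flagged
false_positives = (population - prevalence) * (1 - accuracy)   # wrongly flagged

flagged = true_positives + false_positives
precision = true_positives / flagged   # chance a flagged person is a real positive

print(f"People flagged: {flagged:,.0f}")
print(f"Chance a flagged person is a true positive: {precision:.3%}")
```

Under these assumptions the system flags over 200,000 people, of whom only about 80 are genuine positives — far fewer than 1 in 1,000 flags would be correct.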
“Can I predict that you’re an ax murderer by looking at your face and therefore should I arrest you?” Pedro Domingos, a professor of computer science at the University of Washington, told the Washington Post. “You can see how this would be controversial.”
The theory behind the technology is broken down into two points on the site. The first is that, according to research in the social and life sciences, our personalities are shaped by our genes; the second is that our face is a reflection of our DNA.
However, the science supporting these claims is shaky at best — while DNA certainly plays a role in physical appearance and personality, it’s a stretch to say personality can be determined from appearance.
Plus, there’s a fundamental problem with the entire concept behind Faception. Since the algorithm is based on image analysis, it can only do its job based on the examples it has been trained on. Therefore, if the data sample is narrow or outdated, the results will be skewed.
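This kind of sampling bias is easy to demonstrate with a toy model. The sketch below (hypothetical data and a deliberately simple 1-nearest-neighbour rule — nothing like Faception's actual system) shows how a classifier trained on a narrow sample learns a superficial feature and then flags everything that merely resembles its training examples:

```python
# Toy illustration of sampling bias: a 1-nearest-neighbour "classifier"
# trained on a narrow slice of data flags by superficial resemblance.
# Hypothetical features and labels; not Faception's model or data.

def nearest_label(x, training):
    """Return the label of the training example whose feature is closest to x."""
    return min(training, key=lambda ex: abs(ex[0] - x))[1]

# Narrow training set: every "risky" example happens to share one
# superficial feature (values near 10), so the model learns that
# feature rather than anything causally meaningful.
narrow_training = [(9.0, "risky"), (10.0, "risky"), (11.0, "risky"),
                   (1.0, "safe"), (2.0, "safe")]

# Unseen people whose feature is near 10 for unrelated reasons
# all get flagged, regardless of what they have actually done.
flags = [nearest_label(x, narrow_training) for x in (9.5, 10.2, 10.8)]
print(flags)
```

Every new individual near the spurious feature value is labelled "risky" — the model can only echo whatever pattern, relevant or not, its narrow training set contained.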
“The evidence that there is accuracy in these judgments is extremely weak,” Alexander Todorov, a Princeton psychology professor whose research includes facial perception, told McFarland. “Just when we thought that physiognomy ended 100 years ago. Oh, well.”
Faception recently showcased its technology at a poker tournament, using its facial analysis to predict which four of the 50 players would perform best. Faception analyzed photos of the 50 players by comparing them against a database of professional poker players, and by the end of the competition, two of the four players it picked were among the event’s three finalists.
Despite the glaring concerns over the technology’s reliability, Faception says it has already secured a $750,000 contract with a “leading” homeland security agency to help identify terrorists.
However, the company’s chief ethics officer says that the classifiers that predict negative traits will never be made available to the general public.