Artificial intelligence can predict political beliefs from expressionless faces


Scientists have demonstrated that facial recognition technology can predict a person’s political orientation with a surprising level of accuracy. Their research, published in the journal American Psychologist, shows that even neutral facial expressions can hold clues to someone’s political beliefs. The finding raises significant privacy concerns, especially since facial recognition can operate without an individual’s consent or knowledge.

Facial recognition technology is a form of artificial intelligence that identifies and verifies individuals by analyzing patterns based on their facial features. At its core, the technology uses algorithms to detect faces in images or video feeds, and then measures various aspects of the face — such as the distance between the eyes, the shape of the jawline, and the contour of the cheekbones.

These measurements are transformed into a numerical representation known as a facial signature. This signature can be compared against a database of known faces to find a match, or used in applications ranging from security systems and unlocking mobile devices to tagging friends on social media platforms.
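To make the matching step concrete, here is a minimal Python sketch of how a system might compare a probe signature against a gallery of stored signatures. The 256-dimensional vectors, the cosine-similarity measure, and the random data are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Hypothetical gallery of known faces: each row is one person's facial
# signature, assumed here to be a 256-dimensional descriptor vector.
# Real signatures come from a neural network; random data stands in.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 256))

# A probe photo of person 42, simulated as their stored signature plus
# a little noise (a new photo never matches the stored one exactly).
probe = gallery[42] + 0.05 * rng.normal(size=256)

# Cosine similarity between the probe and every stored signature;
# the closest stored signature is declared the match.
scores = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
best = int(np.argmax(scores))
print(f"best match: identity {best}, similarity {scores[best]:.3f}")
```

In a real deployment the signatures would come from a deep neural network rather than a random number generator, but the nearest-match logic is the same.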

With the growing use of facial recognition technologies in both public and private sectors, there’s an increased possibility that these tools could be used for purposes beyond simple identification, such as predicting personal attributes like political orientation.

“Growing up behind the Iron Curtain made me acutely aware of the risks of surveillance and the elites choosing to overlook inconvenient facts for financial or ideological reasons,” explained lead author Michal Kosinski, an associate professor of organizational behavior at Stanford University’s Graduate School of Business.

“Thus, in my work, I am focused on auditing new technologies and exposing their privacy risks. In the past, we showed that data that Facebook sold (or exchanged for content) exposed users’ political views, sexual orientation, personality, and other intimate traits. We showed the worrying potential of the personality targeting approach used by Facebook, Cambridge Analytica, and others.

“We exposed how Facebook used a trick to continue selling their users’ intimate data. We showed that facial recognition technologies, widely used by companies and governments, can detect political views and sexual orientation from social-media profile pictures.”

But previous studies often didn’t control for variables that could affect the accuracy of their conclusions, such as facial expressions, orientation of the head, and the presence of makeup or jewelry. In their new study, the researchers aimed to isolate the influence of facial features alone in predicting political orientation, thus providing a clearer picture of the capabilities and risks of facial recognition technology.

To achieve this, they recruited 591 participants from a major private university and carefully controlled the environment and conditions under which each participant’s face was photographed. The participants were dressed uniformly in black T-shirts, used face wipes to remove any makeup, and had their hair neatly tied back. They were seated in a fixed posture, and their faces were photographed in a well-lit room against a neutral background to ensure consistency across all images.

Once the photographs were taken, they were processed with a facial recognition algorithm, specifically a ResNet-50 network trained on the VGGFace2 dataset. The algorithm extracted 256-dimensional numerical vectors, called face descriptors, from the images. These descriptors encode facial features in a form that computers can analyze, and they were used to predict the participants’ political orientation through a model that mapped the descriptors onto a political orientation scale.

The researchers found that the facial recognition algorithm could predict political orientation with a correlation coefficient of .22. This correlation, while modest, was statistically significant and suggested that certain stable facial features are linked to political orientation, independent of demographic factors such as age, gender, and ethnicity.
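The paper’s own modeling code is not reproduced here, but the analysis it describes, mapping face descriptors onto an orientation scale and scoring accuracy as a correlation, can be sketched with standard tools. The following Python example uses cross-validated ridge regression on synthetic stand-in data; the sample size and descriptor dimensionality match the article, while everything else is an illustrative assumption.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

# Hypothetical stand-ins for the study's data: one 256-dimensional face
# descriptor per participant and a self-reported political orientation
# score. Values are random, so the resulting r will be near zero.
rng = np.random.default_rng(0)
n = 591
descriptors = rng.normal(size=(n, 256))
orientation = rng.normal(size=n)

# Cross-validated regression: each participant's orientation is predicted
# by a model fit only on the other participants' descriptors.
predicted = cross_val_predict(Ridge(alpha=1.0), descriptors, orientation, cv=10)

# Accuracy summarized as the Pearson correlation between predicted and
# self-reported orientation (the study reports r = .22).
r, p = pearsonr(predicted, orientation)
print(f"r = {r:.2f} (p = {p:.3g})")
```

Cross-validation is the important design choice here: predicting each person with a model that never saw their face is what makes such a correlation meaningful, rather than an artifact of fitting 256 predictors to 591 people.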

Next, Kosinski and his colleagues conducted a second study in which they replaced the algorithm with 1,026 human raters to assess whether people could similarly predict political orientation from neutral facial images. The human raters were recruited through Amazon’s Mechanical Turk and were presented with the standardized facial images collected in the first study. Each rater was asked to assess the political orientation of the individuals in the photographs.

The raters completed over 5,000 assessments, and the results were analyzed to determine the correlation between their perceived ratings of political orientation and the actual orientations reported by the participants. Human raters predicted political orientation with a correlation coefficient of .21, comparable to the algorithm’s performance.

“We knew that both humans and algorithms can judge intimate traits, ranging from personality to sexual orientation, and political views from social media profile pictures. Much of the signal likely comes from self-presentation, facial expression, head orientation, and other choices made by the person in the photo,” Kosinski told PsyPost.

“I was surprised that both algorithms and humans could predict political orientation also from carefully standardized images of expressionless faces. That suggests the existence of links between stable facial features and political orientation.”

In a third study, the researchers extended their examination of facial recognition’s predictive power to a different context by applying the model to a set of naturalistic images — those of politicians. The study aimed to validate the findings from the controlled laboratory settings in a more real-world scenario where the images were not standardized. The sample consisted of 3,401 profile images of politicians from the lower and upper chambers of legislatures across three countries: the United States, the United Kingdom, and Canada.

The results demonstrated that the facial recognition model could indeed predict political orientation from the naturalistic images of politicians, with a median correlation coefficient of .13. This level of accuracy, while not high, was statistically significant and indicated that some of the stable facial features predictive of political orientation in the controlled laboratory images could also be identified in more varied, real-life images.
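Because this third study spans three countries, the headline number is a median of per-country correlations rather than a single pooled figure. A short Python sketch of that summary step follows; the column names and the party coding are hypothetical, and the random data is only a placeholder.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical frame: one row per politician, with the model's predicted
# orientation score and the actual orientation (here party coded -1/+1).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "country": rng.choice(["US", "UK", "Canada"], size=3401),
    "predicted": rng.normal(size=3401),
    "actual": rng.choice([-1.0, 1.0], size=3401),
})

# Correlate predicted and actual orientation within each country, then
# take the median across countries, mirroring the article's "median
# accuracy" of r = .13.
per_country = df.groupby("country").apply(
    lambda g: pearsonr(g["predicted"], g["actual"])[0]
)
print(per_country)
print(f"median r = {per_country.median():.2f}")
```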

The findings have worrying implications for privacy.

“While many other digital footprints are revealing of political orientation and other intimate traits, facial recognition can be used without subjects’ consent or knowledge,” Kosinski explained. “Facial images can be easily (and covertly) taken by law enforcement or obtained from digital or traditional archives, including social networks, dating platforms, photo-sharing websites, and government databases.

“They are often easily accessible; Facebook and LinkedIn profile pictures, for instance, can be accessed by anyone without a person’s consent or knowledge. Thus, the privacy threats posed by facial recognition technology are, in many ways, unprecedented.”

“All these findings are inconvenient. For ideological reasons, scientists prefer to avoid discussing links between appearance and traits,” Kosinski added. However, “companies and governments are keen to use facial recognition to identify intimate traits.”

As with any study, the research has limitations to consider. The participant pool lacked diversity: a significant majority of participants were Caucasian, and all came from a single private university, which may not represent national, let alone global, demographics. And while the study controlled for many variables, the influence of inherent biases in human perception or in the algorithm’s design cannot be entirely ruled out.

Future research could expand on these findings by including a more diverse participant pool and employing more advanced imaging technologies, such as three-dimensional facial scans. Additionally, exploring these predictions across different cultures and political systems could provide deeper insights into the universality of the findings.

“We should be careful when interpreting the results of any single study,” Kosinski noted. “While our findings are in line with previous work, the results should be treated as tentative until they are replicated by independent researchers.”

Nevertheless, the research raises important questions about the potential uses and abuses of facial recognition technology.

“I hope that our findings will inform the policymaking and regulation of facial recognition technology,” Kosinski said. “Our previous papers often resulted in tightening regulation and tech companies adjusting their privacy protections. I also hope that this research will help us to boost our understanding of the links between appearance and psychological traits.”

The study, “Facial Recognition Technology and Human Raters Can Predict Political Orientation From Images of Expressionless Faces Even When Controlling for Demographics and Self-Presentation,” was authored by Michal Kosinski, Poruz Khambatta, and Yilun Wang.


