Individuals who believe in conspiracy theories and those who spend more time on social media tend to be better at detecting deepfake videos, according to new research published in Telematics and Informatics. The findings shed new light on the factors related to human recognition of highly realistic manipulated videos.
Deepfake technology, which involves creating hyper-realistic video content through artificial intelligence, has emerged as a significant concern. It allows the swapping of faces and voices in videos, making it possible to fabricate scenarios that never actually happened. This technology’s potential misuse includes creating fake evidence, spreading misinformation, or manipulating public opinion. Previous research has primarily focused on developing algorithms for detecting these fakes, but the Leiden University study shifts focus to human detection capabilities.
Understanding how well humans can identify deepfakes is crucial, given the technology’s widespread availability and realistic outputs. Past studies have offered varying conclusions on human accuracy in detecting these fakes, ranging from moderate to relatively high levels. However, there was limited understanding of how individual characteristics like age, gender, social media usage, and personal beliefs might influence this ability.
“Deepfake videos are a rapidly developing technology that can have a lot of impact on society,” said study author Ewout Nas, who is now a researcher at the Amsterdam University of Applied Sciences. “A fair amount of research had already been done on algorithmic recognition of deepfakes, but little was known about human recognition of deepfake videos. I was very interested in human performance in recognising deepfake videos and its predictors.”
The study enrolled 130 participants through Leiden University’s research participant platform and social networks. The participants, mostly young adults, were tasked with a deepfake detection challenge. This task utilized the Celeb-DF dataset, a collection of videos featuring 59 celebrities in both authentic and deepfake formats. Participants were shown 174 videos in random order and asked to judge whether each was real or fake. They were also asked if they recognized the celebrities in the videos.
The researchers assessed participants’ conspiracy mentality using the Conspiracy Mentality Questionnaire, a tool designed to measure the tendency to believe in conspiracy theories (e.g. “events which superficially seem to lack a connection are often the result of secret activities”). They also gathered data on participants’ social media habits, including the time spent on these platforms and the number of posts they made.
Participants demonstrated a reasonable level of accuracy in distinguishing deepfakes from real videos, with an average accuracy of 80%. The more intriguing findings, however, concerned which individual characteristics predicted that performance.
Firstly, a surprising positive correlation was found between conspiracy mentality and deepfake detection performance. Contrary to the initial hypothesis, individuals who scored higher on the Conspiracy Mentality Questionnaire were better at identifying deepfake videos. This suggests that a conspiratorial mindset might enhance one’s ability to scrutinize and detect fabricated content.
“We found a positive correlation between conspiracy beliefs and deepfake detection performance. We expected to find the opposite,” Nas told PsyPost.
In addition, the researchers found a strong link between time spent on social media and deepfake detection capability. Participants who spent more hours per week on social media platforms were better at recognizing deepfakes. This finding suggests that regular exposure to a wide range of content, including potentially manipulated media, might hone one’s ability to discern authenticity.
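For readers curious about what this kind of analysis involves, the sketch below shows how per-participant detection accuracy could be computed and correlated with questionnaire scores. It is illustrative only, not the authors’ actual analysis code; the file name and column labels are hypothetical, and the study’s real data and methods may differ.

```python
# Illustrative sketch only -- not the authors' analysis code.
# Assumes a hypothetical CSV with one row per participant containing:
#   n_correct    - number of videos judged correctly (out of 174 shown)
#   cmq_score    - Conspiracy Mentality Questionnaire score
#   social_hours - self-reported hours per week on social media
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("participants.csv")    # hypothetical file name
df["accuracy"] = df["n_correct"] / 174  # proportion of videos judged correctly

# Correlate each predictor with detection accuracy
for predictor in ["cmq_score", "social_hours"]:
    r, p = pearsonr(df[predictor], df["accuracy"])
    print(f"{predictor}: r = {r:.2f}, p = {p:.3f}")
```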
“Time spent on social media and belief in conspiracy theories are positively correlated with deepfake detection performance,” Nas said. “In general terms, deepfake videos are becoming more realistic and harder to recognize. One can no longer automatically believe everything you see on video footage.”
As expected, participants were more adept at identifying deepfakes when they were familiar with the celebrities featured in the videos. This familiarity likely aids in spotting discrepancies and inconsistencies in the fake representations. Interestingly, age and gender did not significantly impact the ability to detect deepfakes.
While the study offers significant insights, it also acknowledges certain limitations. For instance, the research only included visual content without audio, which might affect the generalizability of the findings. Participants were also aware that they were being tested for deepfake detection, which is not typically the case in real-life scenarios where deepfakes might be encountered unsuspectingly.
Moreover, the rapidly evolving nature of deepfake technology means that the study’s findings, based on the current state of this technology, might need updating as newer, more sophisticated methods emerge.
“Since deepfake technology is evolving rapidly, it is important to keep assessing the human performance at detecting state of the art deepfake videos,” Nas said. “It would also be interesting to study the relationship between cognitive performance/ability and deepfake detection performance.”
Moving forward, researchers suggest expanding the demographic range of participants to include a broader age spectrum. This could offer more definitive insights into whether age influences deepfake detection. Additionally, investigating the role of specific digital skills, cognitive abilities, and the impact of deepfakes on people’s perceptions and attitudes would provide a more holistic understanding.
The study, “Conspiracy thinking and social media use are associated with ability to detect deepfakes,” was authored by Ewout Nas and Roy de Kleijn.