Learning Objectives
At the end of this activity, students will be able to:
- Apply a variety of verification techniques to debunk deepfake audio and video sources online.
- Develop the critical thinking skills and practices that support the evaluation of various forms of media.
Context
“Fake news isn’t new. But large language models create false and divisive narratives convincingly, and plentifully, to the considerable delight of troll farms at home and hostile disinformation operations from Moscow and Beijing” (Marcus & Crovitz, 2023).
The Case of Deepfakes
“[D]eepfake detection technologies are not accessible to the public. Thus, humans are often left to their own to decide whether videos are fake or authentic” (Somoray & Miller, 2023).
According to Groh et al. (2022), despite the biases individuals may hold, some researchers place strong faith in the collective intelligence of groups when it comes to detecting deepfakes, arguing that many people working together can uncover the truth more effectively. In their study, Groh et al. also found that combining human and model predictions yields higher accuracy than relying on either humans or the model alone. The research highlights that humans possess specialized face-processing skills and the ability to consider context, which uniquely equip them to detect deepfakes.
How to Detect Deepfakes
Detecting deepfakes can be challenging, but researchers at the MIT Media Lab offer strategies that can help. They suggest paying attention to the:
- Face. High-end DeepFake manipulations are almost always facial transformations.
- Cheeks and forehead. Does the skin appear too smooth or too wrinkly? Is the agedness of the skin similar to the agedness of the hair and eyes? DeepFakes may be incongruent on some dimensions.
- Eyes and eyebrows. Do shadows appear in places that you would expect? DeepFakes may fail to fully represent the natural physics of a scene.
- Glasses. Is there any glare? Is there too much glare? Does the angle of the glare change when the person moves? Once again, DeepFakes may fail to fully represent the natural physics of lighting.
- Facial hair or lack thereof. Does this facial hair look real? DeepFakes might add or remove a mustache, sideburns, or beard. But, DeepFakes may fail to make facial hair transformations fully natural.
- Facial moles. Does the mole look real?
- Blinking. Does the person blink enough or too much?
- Lip movements. Some deepfakes are based on lip syncing. Do the lip movements look natural?
Take the Challenge
Detect Fakes: An MIT Media Lab research project
Reflection
Reflecting on your experience with the challenge, what lessons have you learned about the importance of critical thinking and media literacy when evaluating the authenticity of various forms of media?
References
Groh, M., Epstein, Z., Firestone, C., & Picard, R. (2022). Deepfake detection by human crowds, machines, and machine-informed crowds. Proceedings of the National Academy of Sciences, 119(1), e2110013119.
Marcus, G., & Crovitz, G. (2023, June 29). Would you be able to recognize AI misinformation? Washington Examiner.
MIT Media Lab. (n.d.). Detect DeepFakes: How to counteract misinformation created by AI [Web page].
Somoray, K., & Miller, D. (2023). Human detection of deepfakes: An investigation of confidence and accuracy. Available at SSRN 4412638 (preprint; not peer reviewed).
Stay Connected!
Discover more ways to enhance your teaching and learning experience:
Attend our Workshops & Events
Book a 1-on-1 Consultation