
How Difficult Is It to Spot a Deepfake?

Updated May 17, 2024

Camera apps have become so sophisticated, and so widely available, that the moving images you’re seeing may not be real at all. Known as “deepfakes,” these manipulated videos and other digital representations are produced by artificial intelligence and machine learning, which can fabricate images and sounds that appear strikingly real.

To illustrate how a false narrative can be created with technological wizardry, the Museum of the Moving Image in Queens, New York, is exhibiting “Deepfake: Unstable Evidence on Screen,” which shows how trickery on a grand scale can be used to manipulate viewers.

The centerpiece of the exhibition is a six-minute film called “In Event of Moon Disaster,” which features manipulated images that tell a false story about how the Apollo 11 astronauts supposedly died when they were unable to return home from the Moon.

Technological trickery:

  • The film was produced by the MIT Center for Advanced Virtuality, and it won an Emmy Award for Outstanding Interactive Media: Documentary in 2022. Images of Walter Cronkite and Richard Nixon are used with dialogue that appears to show them discussing a tragedy that never happened.

  • Given these capabilities, it’s not difficult to imagine someone creating deepfakes to influence an election, or to assassinate a person’s character by depicting them in a compromising position.

  • The museum exhibition comments on how deepfakes are just the newest example of creative editing, citing other contested depictions throughout history, such as Spanish-American War reenactments, Frank Capra’s Why We Fight propaganda films during WWII, and the Zapruder 8mm film of the JFK assassination.


