Deep Fakes and the Nixon Moon Disaster Speech That Never Happened
Since 2015, face-swapping apps have been making the rounds on the Internet, becoming increasingly popular and producing ever more convincing results that were easy to create even with minimal technical knowledge. Many people enjoyed swapping their face with that of a celebrity starring in a popular music video and posted the deep fake videos to social media. Some deep fakes were more convincing than others, but the reality behind these apps was grimmer than the playful videos made it out to be.
The software used for these videos is called Deep Fake, and, like much new technology, Deep Fakes gained serious attention when they started being used in porn. In 2017, a Reddit user calling himself “deepfakes” transposed the heads of celebrities onto the bodies of porn actresses and claimed to have used an algorithm to do so. Modern deepfakes were born (Amer, 2019). Deepfake porn is a kind of image-based sexual abuse (van der Nagel, 2020).
The term “deepfake” stems from the words “deep learning” and “fake”. The underlying technology has been steadily improving since the 1990s, and convincing tools became available to the general public in 2018 with the launch of the FakeApp application. Soon afterwards, a popular deepfake video went viral on YouTube, showing Barack Obama cursing and calling Donald Trump names.
An early example of deep fake use in art was the video artwork “Un’emozione per sempre 2.0” (titled “The Italian Game” in English) by Joseph Ayerle. An AI version of Ornella Muti, an Italian 1980s movie star, was shown traveling in time from 1978 to 2018. Ayerle used photos of Kendall Jenner to create the synthetic AI body, replacing Jenner’s face with that of Ornella Muti.
As time goes by, it is becoming increasingly difficult to distinguish between a doctored deep fake video and an authentic one. What is worrying is that deep fakes can be used to influence political elections, generate fake news, and create public mistrust and misinformation.
A 2019 report by Deeptrace Labs states that there were nearly 15,000 deep fake videos online, of which more than 96% were porn. Although the situation seems worrying, other scholars argue that even when deepfakes threaten our narratives of the truth, there is still room for pushback against the problem of deepfakes thwarting control over images of women (van der Nagel, 2020) and against fake news.
Audio deep fakes also exist, with applications that can convincingly clone a human voice after listening to only a few seconds of speech. Deep fakes have also been used to “resurrect” deceased people and actors.
“Deepfakes are not going to travel fast and far without media amplification,” one researcher quoted by Amer (2019) says. For deepfakes to be lifted from the landfill of social media content, reach an audience, and gain massive traction, they will have to be backed by traditional media, she adds. “Journalists are going to play a role in pointing people’s attention in this direction.” (Amer, 2019)
“In Event of Moon Disaster”, by Francesca Panetta and Halsey Burgund, is an art installation and film produced by the MIT Center for Advanced Virtuality that reimagines the 1969 moon landing. The project is meant to inform the public about deepfakes and to show how easy they are to create (Amer, 2019).
The two artists created a deep fake of President Nixon “presenting” on TV the contingency speech that had been prepared for him in case the Apollo 11 astronauts were unable to return to Earth after the first lunar landing in 1969.
The installation recreated a traditional 1960s living room and invited the viewer to sit down and watch President Nixon’s deep fake on a black-and-white TV.
Although the installation also raises playful questions about what life would be like if history had been different, the real intention was to teach the general public how convincing deep fakes can be and to make viewers aware that vision, in this age of mechanical creation, has been compromised.
Three days before the Apollo 11 spacecraft was set to launch, President Richard Nixon’s speechwriter got a call from Apollo 8 astronaut Frank Borman, the NASA liaison to the White House. Borman told the speechwriter, William Safire, that he would need to write a backup address for the president, in case of “mishaps.”
The most dangerous part of the mission wasn’t landing on the moon, Borman explained. The true danger would come when the astronauts left the moon’s surface, launching the lunar module back into space to rejoin the orbiting command ship. If they were unable to make contact with the main ship, NASA would be forced to “close down communication,” leaving Neil Armstrong and Edwin “Buzz” Aldrin stranded with no hope of rescue. In a 1999 interview, Safire recalled being told that the men would either “starve to death or commit suicide.”
Bibliography
Amer, P. (2019) ‘When you can’t believe your own eyes: As “deepfake” technology proliferates, seeing is not always believing. But the battle against AI videos that blur the lines between reality and fiction has only just begun.’, Boston Globe, 15 December, p. K.1.
van der Nagel, E. (2020) ‘Verifying images: deepfakes, control, and consent’, Porn Studies, 7(4), pp. 424–429. doi:10.1080/23268743.2020.1741434.