Deepfakes also target politicians. In April 2018, BuzzFeed released a deepfake of Obama calling Trump a “total and complete dipsh💩t.”
“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson told the Post.
“The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause... The Internet is a vast wormhole of darkness that eats itself.”
Deepfakes, ultra-realistic fake videos that place someone’s face onto another person’s body using machine learning tools, were first reported on by Motherboard last December. A Reddit user going by the name “deepfakes” shared fake porn videos of a number of female actors, including Johansson, on the site. In the months since, the tech behind these videos has only become more powerful and easier to use, with researchers developing ways both to make deepfakes more convincing and to detect them more reliably. (The hope for many is that the latter developments will outpace the former.)
She was among the dozens of female celebrities whose nude photos and videos were stolen in 2011. Her photos were later posted around Los Angeles, without her consent, as part of an art project.
And in 2016, a product designer in Hong Kong made a humanoid robot in Johansson’s likeness.