Earlier this week the YouTube channel Shamook showed off the latest deepfake technology, greatly enhancing scenes involving two notable characters in Rogue One: A Star Wars Story. It was the latest such use of deepfakes in pop culture.
What was impressive about this most recent use of the technology is that it made the character of Governor Tarkin look more lifelike and more closely resemble Peter Cushing, the late actor who originally played the role in the 1977 film Star Wars. Earlier this year the same channel employed the technology to “de-age” Robert De Niro in the Netflix original film The Irishman.
Shamook has also used the technology to put Tom Selleck in the Indiana Jones films, allowing fans to see whether the mustached actor could in fact have pulled off the role he declined due to his TV commitments. While the YouTube videos run just a few minutes rather than a full film, the fact that the Tarkin and Princess Leia scenes look far more convincing suggests this technology has serious potential for video manipulation.
Deepfake technology relies on autoencoders and other machine-learning techniques, and the results are often so convincing that it is hard to tell the content was manipulated. It is the latest form of manipulated media, in which a user takes an existing image or video and replaces a person or object with the likeness of another using artificial neural networks.
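The autoencoder approach behind classic face-swap deepfakes can be sketched in a few lines: a single shared encoder learns a common facial representation, while a separate decoder is trained per identity, and swapping a face means encoding person A's frame and decoding it with person B's decoder. The sketch below is a toy illustration under stated assumptions: the data are random stand-ins for aligned face crops, the layer sizes are arbitrary, and the shared encoder is left fixed for brevity (real systems train it jointly with both decoders).

```python
import numpy as np

rng = np.random.default_rng(0)

DIM, LATENT = 64, 16  # toy sizes: a flattened 8x8 "image" and a 16-dim latent code

def init_layer(n_in, n_out):
    return rng.normal(0, 0.1, (n_in, n_out))

encoder = init_layer(DIM, LATENT)    # shared across identities
decoder_a = init_layer(LATENT, DIM)  # reconstructs person A
decoder_b = init_layer(LATENT, DIM)  # reconstructs person B

def reconstruct(x, decoder):
    """Encode with the shared encoder, decode with the chosen decoder."""
    latent = np.tanh(x @ encoder)
    return latent @ decoder

def train_step(x, decoder, lr=0.01):
    """One gradient step on squared reconstruction error for the decoder only
    (the shared encoder is frozen here to keep the sketch short)."""
    latent = np.tanh(x @ encoder)
    err = latent @ decoder - x
    decoder -= lr * latent.T @ err / len(x)
    return float((err ** 2).mean())

# Toy "faces": random vectors standing in for aligned face crops of A and B.
faces_a = rng.normal(size=(32, DIM))
faces_b = rng.normal(size=(32, DIM))

for _ in range(200):
    loss_a = train_step(faces_a, decoder_a)
    loss_b = train_step(faces_b, decoder_b)

# The swap itself: person A's frames pushed through person B's decoder.
swapped = reconstruct(faces_a, decoder_b)
print(swapped.shape)  # prints (32, 64)
```

Because both decoders read from the same latent space, the swapped output keeps A's pose and expression while rendering B's appearance; production tools apply the same idea with deep convolutional networks and real video frames.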
The dangers of deepfakes being used for nefarious purposes are so great that social media giant Facebook banned such content earlier this year.
“The technology to create real-time and photorealistic images can create fantastic low-cost videos and do incredible amounts of damage,” said technology industry analyst Rob Enderle of the Enderle Group.
“Imagine, for instance, intercepting a widely watched political event and changing it, so the politician appears to be calling for Civil War,” added Enderle. “On the other hand, you only have to spend a little time on YouTube looking at the Dust videos to realize creators can now do amazing things on a shoestring budget.”
One of the dangers is that the technology for manipulating video is already impressive and getting better every day, as the Shamook videos demonstrate.
“However, the real innovation is in packaging such tools to be widely accessible,” explained Jim Purtilo, associate professor in the computer science department at the University of Maryland. “Great research might tell us how to bend video in deceptive ways, but it takes hard core engineering to figure out how to do this at scale. I guarantee you this engineering is under way now.”
This isn’t the first time we have had to question whether “seeing” is in fact “believing,” however.
“It was like this with audio and photographs,” added Purtilo. “Today’s consumers routinely touch up recordings and images in ways that once could have only been done in a research lab. As those technologies became packaged for consumers, so did we come to accept that not everything we heard or saw was necessarily how it was. We’ll need to embrace that reality as video manipulation tools become packaged for widespread use too.”
For now, this isn’t quite within the reach of everyone with just a desktop or laptop computer.
“These tools are more computationally demanding, but to some that’s not a bug but a feature,” said Purtilo. “Hardware manufacturers that are eager to create demand for their products are some of the most active players in the world of multi-media research. Deepfakes will sell a lot of computers.”
The technology isn’t unstoppable either. There are now AI-based tools that can identify and flag deepfake videos, but unfortunately these aren’t widely known or used.
“They need to be if people are again going to depend on video evidence to convey facts, particularly in a world where more and more questionable sites present themselves as authentic news sources when they are anything but, and too often are funded by hostile states,” warned Enderle. “Microsoft has developed an AI tool to identify deepfakes, but they admit it won’t be enough to prevent serious problems, at least not alone. Other tools exist, but they are likely too difficult for most people to use, and we need advancement in this area if we are to prevent deepfakes being used against us effectively.”
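One detection idea the research community has explored (a simplified illustration, not how Microsoft's or any other named tool works) is that generative models often leave unnatural traces in a frame's frequency spectrum. The toy sketch below flags frames whose high-frequency energy share looks suspicious; the cutoff, the synthetic "frames," and the threshold idea are all assumptions for illustration.

```python
import numpy as np

def high_freq_ratio(frame, cutoff=0.25):
    """Fraction of a frame's spectral energy beyond `cutoff` of the
    half-spectrum radius; unusually high values can hint at synthesis
    artifacts (a crude heuristic, not a production detector)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame))) ** 2
    h, w = frame.shape
    yy, xx = np.mgrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
    return float(spectrum[radius > cutoff].sum() / spectrum.sum())

# A band-limited pattern standing in for a natural, smooth frame...
x = np.linspace(0, 2 * np.pi, 64)
smooth = np.outer(np.sin(x), np.sin(x))
# ...versus pure noise standing in for an artifact-heavy synthetic frame.
noise = np.random.default_rng(1).normal(size=(64, 64))

r_smooth = high_freq_ratio(smooth)
r_noise = high_freq_ratio(noise)
print(r_smooth < r_noise)  # prints True
```

Real detectors combine many such cues with learned classifiers trained on labeled real and fake video, which is partly why, as Enderle notes, they remain hard for non-experts to apply.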