OsugiSakae.com

TESOL, Technology, and EdTech

Deepfakes

It's about to get real(ly bad)


Pictures can be faked. We know this. We are used to this. Even so, people still get fooled by fake pictures. (Or maybe they don't want to know and prefer to believe a fake is real.) Considering how easy it is to modify images these days, we haven't really seen that much trickery involving faked photos. (Oddly, when we do see it, it is often poorly done and obviously fake.) But what happens when a video can be faked? What happens when we can watch a politician, for example, say something horrible at some public event? How many people would believe what they saw, no matter how strongly the politician denies saying it or even being at the event?

That sort of video is coming to the public, probably very soon: not in 5 or 10 years, but in 2 or 3 years, I think. Maybe even in time to be used against candidates in the 2020 US elections. The technology already exists; the videos are known as deepfakes. Currently, they require a lot of video and audio footage of the target to train the software. Once trained, the software can make a video of another person look like a video of your target.
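To make that concrete, here is a rough sketch of the idea behind the first wave of deepfake tools: a shared encoder that learns what a face is doing, plus one decoder per person that learns to draw that person's face. This is a minimal toy in PyTorch, not any real tool; all the names are mine, the "footage" is random stand-in tensors, and real systems add face detection, alignment, and far bigger networks.

    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        """Compress a 64x64 face crop into a small vector (pose, expression, lighting)."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
                nn.Flatten(),
                nn.Linear(128 * 8 * 8, latent_dim),
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        """Redraw a face crop from that vector, in one specific person's likeness."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
            )

        def forward(self, z):
            return self.net(self.fc(z).view(-1, 128, 8, 8))

    encoder = Encoder()         # shared between both people
    decoder_target = Decoder()  # learns to draw the target's face
    decoder_actor = Decoder()   # learns to draw the stand-in actor's face

    opt = torch.optim.Adam(
        list(encoder.parameters())
        + list(decoder_target.parameters())
        + list(decoder_actor.parameters()),
        lr=1e-4,
    )
    loss_fn = nn.L1Loss()

    # Stand-in "footage": a real run needs thousands of aligned face crops per person.
    faces_target = torch.rand(8, 3, 64, 64)
    faces_actor = torch.rand(8, 3, 64, 64)

    for step in range(1000):
        opt.zero_grad()
        # Each decoder only ever learns to reconstruct its own person...
        loss = (loss_fn(decoder_target(encoder(faces_target)), faces_target)
                + loss_fn(decoder_actor(encoder(faces_actor)), faces_actor))
        loss.backward()
        opt.step()

    # ...so the swap is just mixing the parts: encode the actor's frame
    # (the expression), then decode it with the target's decoder (the face).
    with torch.no_grad():
        swapped = decoder_target(encoder(faces_actor))

That last line is the whole trick: the shared encoder captures what the actor is doing, and the target's decoder renders it in the target's face. It is also why the training step needs so much footage of both people.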

I said "coming to the public" above because the videos already exist online in porn circles. Some people have used the technology to put a non-porn celebrity's face onto the body of a porn actor in a porn movie. Mainstream actors have lots of video out there that can be used to train the software. Just add porn movie. Regular people, not so much. So we probably don't have to worry about deepfaked non-consensual sexual videos (revenge porn videos) of ordinary people just yet, but deepfaked non-consensual pictures, maybe?

Note the second requirement: a base video, the one you want to modify. If you already have one of your target in a useful situation (a politician making a speech, for example), you only have to train the voice and the face and mouth elements of the software, and then you can make the politician say anything you want him or her to say.

If you don't have an appropriate video of the person you are trying to deepfake, you can use or record footage of someone with a similar build as the base for the fake. You'll probably need more video of your target to train the software, because the software will have to do a lot more work to create a convincing fake. For celebrities and politicians, getting enough video may not be a problem.

Researchers are working on ways to use software to detect deepfakes. It is going to be an arms race, and because both sides are software, each can train against the other's output and get better. I'm not sure anyone can predict how it will end. Will we be able to detect deepfake videos as fakes? Will it matter? Political partisans will not care what any experts say about the truthiness status of a possible deepfake. The mere fact that the video is out there on social media will be enough for many people. We've seen that plenty already with traditionally doctored photos and videos on social media over the last few years. Better fakery will only make less partisan people more unsure about what to trust.
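For a feel of what the detection side looks like, here is an equally rough sketch: treat detection as a binary classifier over face crops labeled real or fake. Again, everything is a stand-in of mine (names, architecture, random data); real research detectors hunt for specific artifacts such as blending seams or unnatural blinking, and the arms race comes from generators then being trained until those very artifacts disappear.

    import torch
    import torch.nn as nn

    # One logit out: greater than zero means "I think this crop is fake".
    detector = nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
        nn.Flatten(),
        nn.Linear(128 * 8 * 8, 1),
    )

    opt = torch.optim.Adam(detector.parameters(), lr=1e-4)
    loss_fn = nn.BCEWithLogitsLoss()

    # Stand-in data: real face crops get label 0, generated ones get label 1.
    crops = torch.rand(16, 3, 64, 64)
    labels = torch.randint(0, 2, (16, 1)).float()

    for step in range(100):
        opt.zero_grad()
        loss = loss_fn(detector(crops), labels)
        loss.backward()
        opt.step()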

Honestly, much as with climate change, it is really hard not to be pessimistic about our future. People will make fake stuff to hurt others, for political or economic gain, or just for the lulz. Many people won't care about truth or accuracy. They will just believe whatever reinforces their existing beliefs.

I don't want to say that we are in a lot of trouble, but I think that we may be in a lot of trouble. I think that going forward, it will not be enough for people who care about honesty, integrity, and generally just humanity to simply ignore the ignorant and the dishonest. Perhaps in the same way that we counter racist and hurtful language in classrooms, we will have to actively counter dishonesty every time it comes up. No exceptions. And by we, I mean everyone.

This page is Copyright ©2019 Chris Spackman.
This web site developed entirely on GNU/Linux with Free / Open Source Software.

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
