Just what we needed: something ELSE to be scared of in 2018. The technology to create realistic fake video and audio is moving fast, as I discuss in my latest Campaigns & Elections column. What happens when someone can “make” a public official, celebrity or anyone else appear to say something they didn’t? We’re about to find out — and it’s likely to be far more dangerous than the selectively edited video hit jobs we’ve seen from the likes of Project Veritas:
The new forms of fake video and fake audio won’t share this drawback, since their creators can dream up whatever they want and make someone say it — or at least, make them sound as though they did. This technology has the potential to be truly dangerous. Putting the wrong words in someone’s mouth, believably, could start a war.
Speaking of Project Veritas, my column must have hit a nerve. Sorry, guys: selective editing = a hatchet job, and it undercuts whatever legitimate points you might be trying to make.
Back to “real” fake video, though: be sure to check out the PSA embedded below, in which Jordan Peele shows how AI-driven video manipulation can put his words in Barack Obama’s mouth:
What should campaigns do? In the C&E column, I talk about preparing for fake video BEFORE it happens. Once you’re a victim, follow Rule Number One: don’t reinforce the message the video’s creators are trying to get across. Instead, try to pivot to the fact that you’re under an illegitimate attack, and enlist your supporters and high-profile activists to vouch for you (something that’s much easier to do if you’ve built those relationships in advance). See the piece for more.
Thinking beyond 2018, though, we are ALL going to have to be a lot smarter about what we see, hear and read, particularly if it reinforces something we WANT to believe. And that’s what scares me the most.