Here come the Deepfakes

At some point, the onward trudge of technology has taken on more of a steamroller quality, and much of the next-gen whatever-it-is brings far less progress to society and much more churn. It’s like the rote Silicon Valley diktat, “disrupt things,” has become the marching orders for a doomsday cult. They don’t even need to pretend they’re building consumer ware anymore. They’re hacking the culture for no better reason than to see what might happen, and they’re perfectly sanguine about the fact that we know that’s what they’re doing.

How else can you explain the emergence of deepfakes — the application of AI and CGI to create ersatz video that’s hard (but not yet impossible) to tell from the real thing? So far we’ve seen scarily convincing videos of Zuckerberg, Obama, and others. Match the images with the work of a decently imitative voice actor, tweak the peaks and valleys with Pro Tools, and you’ve coded yourself carte blanche to make public figures and politicians say and do whatever you please.

Well, almost. The tricksters haven’t quite overcome the uncanny valley problem; there’s still something indefinably, inherently repellent about deepfakes, something that can and should trigger a healthy dose of doubt. Unless, of course, they’re saying something you want to believe they’d say.

(And this is completely beside the point, but let’s take a sec to recognize what an embarrassingly stupid descriptor “deepfake” is. When did the tech sector get so bad at naming its output? Right about the time it gave us the ‘Internet of Things’? Dreck. Blech.)

We won’t stay in the uncanny valley forever, though. The tech will get better, and inside a year we’ll see ginned-up e-kompromat that’ll be next to impossible to debunk. That’s when the politics of scandal gets interesting.

It’s also when American society divides further along its natural fault lines: some of us will be all too ready to believe that Hillary finally owned up to her awful antics in pizza-parlor basements, and the rest of us will be permanently scarred by repeated, vicious face-palming.

The law of unintended consequences comes into play, too. (Or maybe ‘unexpected’ consequences, because who the hell knows what anyone intends anymore?) There’s a certain class of politician and public figure who’s going to benefit from the proliferation of deepfakes. It’s an ever-ready alibi that frees your inner Nazi. So go ahead and tell a smoky back room full of donors that Hitler actually had some pretty good ideas—one of the waiters might get it on their phone, but that’s okay. Just go on Hannity and swear it was a deepfake, and all will be well.

In fact, can’t we start doing that retroactively? How long until Trump starts retconning the grab-’em-by-the-pussy tape? He already floated a couple of trial balloons suggesting it’s not actually him on that tape (after initially admitting it, of course)—and that was at least 18 months before deepfakes entered the cultural consciousness. When will he circle back to that idea? Oh, right about the time it becomes a reelection issue….

So, yeah. Deepfakes are here. They’re not yet pulling at the threads of society, but they soon will be.

As far as I can see, we really have one defense, and it’s not coincidentally the same defense we have against the entirety of the muck that big tech and the media and every stuffed shirt and talking head throw at us these days:

Believe nothing. Doubt everything.
