
It takes a lot to creep me out these days. Actually . . . delete delete delete.

It only takes a little to creep me out. Like this cool and also quite terrifying new update to Adobe After Effects that I wish I could turn back time and erase from ever happening. It’s called Content-Aware Fill, and yes, I realize that doesn’t sound very scary.

“Content-Aware Fill for video quickly and easily removes distracting or unwanted objects from your compositions,” says a somehow-not-quaking voice in Adobe’s promotional video for the new feature. Meanwhile, an unseen hand selects a klatch of unsightly tourists from a sweeping panorama around what’s supposed to be a medieval castle scene and turns them to turf in a matter of clicks.

You could also use it to nix an obtrusive boom mic, it says. Or anything else that’s disrupting your fantasy, really. It’s kind of amazing. Cue the creepy feeling.

The Internet has made us all a lot more culturally comfy with various degrees of doctoring.

We routinely tweak and tint photos without too much agony over authenticity (and when we do resist the impulse to alter them, it’s a feat worth boasting over with a #nofilter certification). The bold brazenly employ the virtual makeup of FaceTune to airbrush their selfies into featureless perfection, while the ostensibly more modest Instagrammers among us might opt for just a whisper of the Gingham filter, like a spritz of No. 5 on the way out the door.

And sure, much of this entry-level Internet fakery is just continuing the long flirtation between beauty and artifice. But this comfort we’ve attained with the unreal could be dangerous given the leaps technology is taking.

I’ve talked a bit in these pages before about the rising threat (and increasing sophistication) of so-called “deepfake videos,” which use facial-mapping technology and machine learning to enable anyone with the proper software and some spare time to play ventriloquist with just about anyone else. (And more quietly concerning is the growing online crowd of computer-generated faces.)

Deepfakery is moving into the realm of audio as well. The Cambridge-based firm Modulate creates “voice skins” that alter the shape and sound of your voice, allowing you to “become your in-game character. Join a conversation without worry of discrimination or harassment. Or even show off to your friends by pairing your celebrity skin with a voice to match.” Montreal-based voice synthesis firm Lyrebird foregrounds the freakout potential a little more with its SoundCloud playlists of robo-Trumps, mantrically repeating “I am not a robot, my intonation is always different” in intonations that are always different — and perhaps more human than the source material.

And Adobe’s experimental (and seemingly dormant) technology VoCo can quite effectively synthesize new words out of voices sampled and analyzed from existing recordings. Hearing it in action may leave you at a loss for words (not to worry, it will come up with some) and never wanting to leave a voice mail again.

Which brings us back to Content-Aware Fill — a feature that might inspire more appropriate general panic were it branded more accurately as, say, a Reality Shout Stick or a History Eraser. It’s nearly impossible to imagine a tool that can literally scrub all traces of any object (or person, or act) from video not being put to abusive ends.

For the millions of Adobe-heads out there, its addition to After Effects merely represents a long-awaited promotion of the feature from the still-image confines of Photoshop to the far more useful realm of video — where manual deletion of unwanted objects can be time-consuming and expensive. It is, after all, a product within a product, a convenience within a convenience.

But for the rest of us, who must navigate a world tensely drawn between the poles of the real and the fraudulent, and where the medium of video is increasingly treated like the arbiter of truth, a feature like Content-Aware Fill represents just another slight against the sanctity of perception, and yet one more reason why seeing is no longer believing.


Michael Andor Brodeur can be reached at mbrodeur@globe.com. Follow him on Twitter @MBrodeur.