Deepfakes can be an asset for enterprises — according to Gartner
Deepfakes have earned a bad reputation from their association with political scams and manipulated adult videos. They’ve even been called a potential “threat to democracy.”
But the technology itself — generative adversarial networks (GANs) — is not inherently malevolent.
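The "adversarial" in GANs refers to two models trained against each other: a generator that produces synthetic samples and a discriminator that tries to distinguish them from real ones. As a purely illustrative sketch (the function names and numbers below are assumptions, not from any specific framework), the competing objectives can be expressed with the standard binary cross-entropy losses:

```python
import math

def bce(prediction, target):
    # Binary cross-entropy for a single predicted probability.
    eps = 1e-12  # avoids log(0)
    return -(target * math.log(prediction + eps)
             + (1 - target) * math.log(1 - prediction + eps))

def discriminator_loss(d_real, d_fake):
    # The discriminator wants real samples scored near 1
    # and generated (fake) samples scored near 0.
    return (sum(bce(p, 1.0) for p in d_real) / len(d_real)
            + sum(bce(p, 0.0) for p in d_fake) / len(d_fake))

def generator_loss(d_fake):
    # The generator wants its fakes scored as real (the common
    # "non-saturating" formulation of the generator objective).
    return sum(bce(p, 1.0) for p in d_fake) / len(d_fake)

# A confident discriminator (real scored 0.9, fake scored 0.1)
# keeps its own loss low while punishing the generator heavily.
print(discriminator_loss([0.9], [0.1]))  # low
print(generator_loss([0.1]))             # high
```

Training alternates updates to the two models, so the generator gradually learns to produce samples the discriminator can no longer tell apart from real data — which is what makes the resulting faces and voices so convincing.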
Apps such as DeepFaceLab and MachineTube made this technology available to the masses; some use it in good humor, others with ill intent. But this shouldn’t completely cloud the business potential of deepfakes to produce positive content for enterprises.
“This technology has the potential to be an asset to enterprises in content production, particularly when it comes to personalized content,” said Andrew Frank, Research VP and Analyst at Gartner. “Businesses that utilize mass personalization need to up their game on the volume and variety of content that they can produce, and GANs’ simulated data can help.”
For example, Disney used deepfake technology to insert a young Harrison Ford as Han Solo into recent Star Wars franchise films, but other companies are experimenting with ways to make the publishing of targeted content more efficient.
Last year, a video featuring David Beckham speaking in nine different languages was released for a ‘Malaria No More’ campaign. The content was produced with video manipulation algorithms and demonstrated how the technology can be used for a positive outcome — quickly reaching a multitude of different audiences with accessible, localized content in an engaging medium.
Many brands would be eager to take a similar approach: running a targeted social media campaign across various international markets, for example, or emphasizing different product features to different age groups. Deepfake technology could deliver messaging variants tailored to the demographics of each customer.
For retailers, meanwhile, such technology can be leveraged to further enhance consumers’ shopping experiences.
Presently, immersive technologies such as virtual reality (VR) and augmented reality (AR) are used to help consumers envision products in customized settings or on themselves — it’s not a stretch to imagine user-generated deepfakes being used in a similar, more realistic way.
Of course, any advances in deepfake technology for positive applications must be countered by an assessment of the threat it poses when used maliciously, and the damage it’s already done.
While many of the fears around the technology stem from its potential use in spreading ‘fake news’ for political purposes, Deeptrace reported that 96 percent of deepfake videos online were pornographic in nature.
Demand for video authentication tools to help identify and eliminate malicious deepfakes is on the rise, but the solutions are far from mature or cost-effective. Before enterprises move ahead with deepfakes in commercials, marketing content, or experiential retail, disclosure guidelines (e.g. “this video uses AI manipulation”) must be established.
Tech giants like Facebook and Google are piling resources into deepfake detection as part of a global Deepfake Detection Challenge — aware that content that has been “altered in order to mislead the viewer” could massively undermine the integrity of their platforms and the delicate trust of their users.
Negative associations will continue to present a barrier to companies aiming to leverage deepfake technology. Of course, some brands may relish, and even benefit from, the controversy of adopting such a technique early on — although the videos themselves could then be manipulated further, to the detriment of a brand’s reputation.
While Frank observed that deepfake technology could prove a practical format for brands to produce manipulated content at volume, he admitted that using such a technique may do little to build up the customer trust that is so lacking in the advertising industry.
“Brands are becoming too dependent on artifice when they do things like deploy synthetic customer service representatives, which initially appear to be real people and then you later discover that they are not,” he commented.
“All of that contributes to a general loss of trust, so maybe rediscovering the human element is the key to fighting all of this.”