Fake News on Steroids: Deepfakes Are Coming – Are World Leaders Prepared?
This is part of a series of op-eds previewing the World Economic Forum in Davos, Switzerland. CoinDesk will be on the ground in Davos from Jan. 20–24 chronicling all things crypto at the annual gathering of the world’s economic and political elite. Follow along by subscribing to our pop-up newsletter, CoinDesk Confidential: Davos.
Arif Khan is the CEO of Alethea AI. The opinions here are his own.
Despite its shortcomings, the World Economic Forum has laudably established itself as an important venue for highlighting controversial technology trends that will have broad impacts on our world.
What 2018 and 2019 were to blockchain and cryptocurrencies on the WEF stage (met with a healthy mix of intrigue and skepticism), 2020 will be to synthetic media, also known by the ominous-sounding euphemism “deepfakes.”
While this concept may seem like sheer science fiction to most, the WEF's 2020 program includes an entire session devoted to deepfakes and the risks they pose to societies and political systems. These technologies will also be on full display at several of the side events populating the Davos Promenade.
For a quick primer on why deepfakes are no joking matter, see this video in which Jet Li’s face has been replaced with that of Binance CEO Changpeng “CZ” Zhao.
While CZ was good-natured enough to retweet the light-hearted parody, he sounded a cautionary note that KYC and video recognition would be challenging to implement if this trend continues.
We’re not in Kansas anymore…
Open-source artificial intelligence algorithms are now enabling the creation of deepfakes at a staggering rate, meaning that the “fake news” controversies witnessed during the 2016 U.S. presidential election will seem like child’s play in 2020. Deepfakes fall into two categories: malicious and non-malicious, the latter better known as synthetic media.
Given that most businesses are grossly under-equipped to protect themselves against basic scams like phishing emails, it’s difficult even to fathom the impact that deepfake media could have if deployed maliciously against a naive population.
So why does this matter to crypto? After all, we’re a savvy community at the bleeding edge of technology, right? True, but scams still proliferate in the crypto industry – just look at all the “Not giving away ETH” warnings attached to the names of frequently impersonated Crypto Twitter celebrities.
With synthetic media, scammers now have a new weapon in their arsenal with which to alter the trajectory of a decentralized organization and create unprecedented chaos. We will almost assuredly see deepfakes of crypto-industry titans in 2020. How we respond to this emergent media landscape will depend on how well inoculated our societies and communities are.
Synthetic media will emerge as an asset class in 2020
But there’s no need to run for the hills. Like any technology, the underlying algorithms that generate deepfakes can be put to benevolent uses, unlocking economic value and productivity gains while enabling new modes of artistic expression. While commercial use cases are still in their infancy, 2020 will see concerted attempts at making this technology accessible to the masses in positive ways.
For example, the now-famous Peloton Girl’s face was instantly recognizable in the gin advertisement developed and shot by the prescient Ryan Reynolds. Her likeness and face were “liquid” in the sense that they could be transposed into a diametrically opposed narrative context within a short span of time. In the future we will witness similar examples, and if her “likeness” is digitally replicable, she will have the option of participating in a multitude of differing contexts at once, enhancing her virality and influence further. This will unlock significant commercial value for actors, talent agencies and their brands.
Synthetic digital copies that are lifelike, emotive and expressive will be instantly swappable and tradeable like game skins and voice skins in online role-playing games. Actors and their talent agencies will be clamoring for the whitespace that synthetically created digital storytelling will enable. Our faces and voices will be rented, modified, reskinned and repurposed.
Other benevolent use cases include preserving a relative’s voice for future generations to hear and learn from, or even preserving the likeness of a savant mathematician or musical genius to serve as a constant source of inspiration for a society or community. Imagine Gandhi teaching you the principles of non-violence, or a recently deceased relative reading a children’s audiobook to your yet-to-be-born children. The benevolent applications are endless. Yet the digital rights infrastructure is nascent.
Should we encourage the financialization of our faces and voices?
But, as is the case with other novel technologies, it doesn’t require much imagination to envision how this could quickly turn dystopian. To fight against such a scenario, we must empower individuals to have access to the data that underpins their likeness, faces and voices. No government or corporation should own any human’s identity. Individuals must be empowered to preserve, transact and liquidate their likeness within the correct and evolving legal and intellectual-property landscape. For this to happen, synthetic media will need a new digital rights substrate to manage permissions, transactions and governance.
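What might such a substrate look like in practice? As a minimal, purely hypothetical sketch (the LikenessLicense record, its field names and the DID-style identifiers below are invented for illustration, not any existing standard), a rights layer could start with a signable record stating who may use a person’s likeness, in what context, and on what terms:

```python
# Purely illustrative sketch: a hypothetical likeness-license record.
# The LikenessLicense class, its fields and the identifiers are invented for this
# example; a real substrate would add signatures, revocation and on-chain anchoring.
from dataclasses import dataclass, asdict
import hashlib
import json

@dataclass
class LikenessLicense:
    owner_id: str        # the person whose face or voice is being licensed
    licensee_id: str     # the studio, agency or app receiving permission
    media_types: tuple   # e.g. ("face", "voice")
    permitted_use: str   # human-readable description of the allowed context
    expires_at: int      # Unix timestamp after which the license is void
    royalty_bps: int     # royalty owed per use, in basis points

    def license_id(self) -> str:
        """Deterministic ID: the SHA-256 digest of the canonical JSON form."""
        canonical = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

grant = LikenessLicense(
    owner_id="did:example:actor",
    licensee_id="did:example:studio",
    media_types=("face", "voice"),
    permitted_use="single advertising campaign, non-political",
    expires_at=1609459200,
    royalty_bps=250,
)
print(grant.license_id())  # the digest a registry might later anchor on a blockchain
```

The design point is modest: whoever holds the record can prove what was agreed, and the individual, not a platform, remains the source of truth for the grant.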
Rumors have set the creative and tech industries abuzz that both TikTok and Snapchat are developing face-swapping capabilities for the masses. While these centralized data companies will aim to capture as much value as possible in the form of engagement and time spent on their apps, it is critical that a decentralized alternative and ethos emerges to give individuals agency over their likeness and their data.
As faces and voices become fully synthetic in 2020, we risk harming our social and economic fabric if individuals are not able to understand the value of their voices and faces. This will be especially so if all of the value capture occurs at the application layer of centralized companies, as has all too often been the case in the Web 2.0 context.
Will the internet be reduced to a hall of mirrors?
Even prior to deepfakes, the modern world was experiencing truth decay.
The trust our society places in the digital information we come across is falling. The constant barrage of false information has left us exhausted and, at times, apathetic. The democratization of AI tools to generate synthetic content will make the current standards of misinformation seem almost benign. With the rise of synthetic media and at-will deepfake creation tools, how will we learn to discern the truth?
Some blockchain-based timestamping solutions offer interesting pathways to a digital registry of all synthetic content, yet the risk of centralized control over the registries remains. Other solutions being explored enable fingerprinting of digital content and hashing at the point of capture, both relevant but unfortunately narrow use cases that do not fully address the complexity of the problem.
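To make the hashing-at-capture idea concrete, here is a minimal sketch under stated assumptions: the provenance record layout, its field names and the capture-device identifier are hypothetical, not a description of any particular registry. The idea is simply that the raw bytes are fingerprinted the moment they are captured, and the fingerprint, rather than the content itself, is what gets anchored on a blockchain:

```python
# Minimal sketch of hashing at the point of capture, for illustration only.
# The provenance record layout is hypothetical; a real scheme would also sign
# the record with the capture device's key before anchoring its digest on-chain.
import hashlib
import json
import time

def fingerprint_capture(media_bytes: bytes, device_id: str) -> dict:
    """Fingerprint raw captured bytes and wrap the digest in a provenance record."""
    return {
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),  # any later edit changes this
        "device_id": device_id,                                   # which camera or app captured it
        "captured_at": int(time.time()),                          # claimed capture time
    }

record = fingerprint_capture(b"\x00\x01 raw video frames ...", device_id="camera-001")
# A timestamping service would anchor the digest of the whole record on a blockchain,
# so later synthetic re-renders of the same scene no longer match the original hash.
record_digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
print(record_digest)
```

Even this toy version shows why hashing at capture is narrow on its own: it can prove that a given file has not changed, but it says nothing about content that was synthetic from the start.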
Just as bitcoin and ethereum inadvertently led individuals to learn more about markets, game theory, money production and economics, we will need decentralized alternatives and incentives to help people truly understand the value of their digital likeness. Upcoming discussions at the WEF and elsewhere will hopefully establish a baseline grasp of the issues at play and a framework for understanding what solutions are required.
In particular, the role that blockchain technology can play in creating an underlying substrate must be highlighted. Yes, we all remember 2018, when blockchain was the solution to every problem on Planet Earth and then some, but it will have to play a role here, both in timestamping deepfakes and in facilitating the creation of a digital rights management layer that empowers individuals to eventually transact with their face or voice data.
As the stakes continue to rise and content becomes increasingly liquid, emergent and viral, we will need a solid grasp on our reality and the shared truths that undergird it. If we thought fake news damaged trust in our society, deepfakes, if improperly managed, will do far worse.
However, we cannot throw the baby out with the bathwater. The synthetic media phenomenon is here to stay; what is required is a solid understanding of how it can be used to benefit individuals and society, as well as a strategy for mitigating the damages potentially caused by malicious actors.