How Blockchains Can Help Solve AI’s Deepfake Problem

As AI continues to work itself into our daily lives, it’s hard not to see the impact it’s already having on nearly every sector. Within the finance industry, for example, AI is facilitating smarter investments, analyzing market trends and predicting stock performance, ultimately helping individuals and institutions make more informed business decisions.

While most advancements in AI are exciting and continue to push industries forward, some are abusing the technology for more nefarious purposes. With generative AI, one of the biggest risks that individuals and organizations need to be aware of is the “deepfake.”

You’re reading Crypto Long & Short, our weekly newsletter featuring insights, news and analysis for the professional investor. Sign up here to get it in your inbox every Wednesday.

Deepfakes are highly realistic digital forgeries produced with AI to manipulate or generate visual and/or audio content. For example, a deepfake might involve an AI-generated video showing a celebrity engaging in actions or making statements that never actually occurred, such as when comedian Jordan Peele created a deepfake of Barack Obama to showcase the threat AI-generated technology could present.

While we may default to believing what we see, this type of forged or deceptive AI-generated content is becoming increasingly common. Between 2022 and the first half of 2023, deepfakes as a proportion of online content in the U.S. increased almost 13-fold, from 0.2% to 2.6%, according to a recent report from Sumsub Research.

Experts are already concerned deepfakes could be used to sway public opinion or influence important events like elections, with bad actors using AI to impersonate elected officials. Another recent report noted that experts are “completely terrified” the upcoming presidential race will involve a “tsunami of misinformation,” driven heavily by deepfakes and misleading AI-generated content. Many view deepfakes’ ability to blur the line between truth and fiction as a fundamental threat to democracies and fair elections around the globe.

So how do we – as a society – mitigate the prevalence and risks of deepfakes, along with the similar risks that may emerge as generative AI grows ever more sophisticated?

Blockchains could be the crucial technology we need to help tackle this issue. At their core, public blockchains such as Ethereum have several key features that make them uniquely positioned to establish the authenticity of content and information: inherent transparency, a decentralized structure, and a focus on network security and immutability.

For those unfamiliar, a public blockchain transparently records information in a time-bound manner, accessible to all, globally, and without gatekeeping. This allows anyone to verify the validity of information, such as its creator or a timestamp, making it a source of truth. Public blockchains are also decentralized, eliminating the need for a central decision-maker, and reducing the risk of manipulation. This decentralized structure also offers high network security by eliminating single points of failure, and ensuring an immutable and tamper-resistant record.
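To make that verification idea concrete, here is a minimal sketch in Python (standard library only) of how a piece of content could be fingerprinted and later checked against a hash, creator and timestamp that were anchored on a public chain. The `anchored_records` store and the creator identifier are hypothetical stand-ins for data that would, in practice, live in a transaction or smart-contract event visible to anyone querying a node; this is not any specific protocol’s API.

```python
import hashlib
from datetime import datetime, timezone

def fingerprint(content: bytes) -> str:
    """Return a hex digest that uniquely identifies this exact content."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical stand-in for records anchored on a public blockchain.
# On a real chain, anyone could read these entries without permission.
anchored_records = {
    fingerprint(b"original interview footage"): {
        "creator": "0xPublisherAddress",  # placeholder identifier
        "timestamp": datetime(2023, 11, 1, tzinfo=timezone.utc),
    }
}

def verify(content: bytes) -> bool:
    """Check whether this exact content was anchored, and by whom and when."""
    record = anchored_records.get(fingerprint(content))
    if record is None:
        print("No on-chain record: content may be altered or synthetic.")
        return False
    print(f"Anchored by {record['creator']} at {record['timestamp'].isoformat()}")
    return True

verify(b"original interview footage")   # matches the anchored hash
verify(b"deepfaked interview footage")  # any edit changes the hash, so no match
```

Because even a one-byte change produces a completely different hash, a forged or manipulated file simply fails to match any anchored record.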

Furthermore, blockchains have already demonstrated their ability to authenticate content. With digital art sold as non-fungible tokens (NFTs), for instance, blockchain technology lets anyone verify the creator and owner of a piece of art, making it possible to distinguish the original from its potential replicas. This transparency and authentication potential extends to videos, images and text, providing important foundations for developers to build tools aimed at combating deepfakes, such as Worldcoin (the identity project co-founded by OpenAI CEO Sam Altman), Irys and Numbers Protocol.
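As an illustration of what that verification looks like in practice, the sketch below (in Python, assuming the web3.py library is installed) queries an ERC-721 contract’s standard `ownerOf` function to confirm who currently holds a given token. The RPC endpoint, contract address and token ID are placeholders a reader would replace with real values; they do not refer to any specific project.

```python
from web3 import Web3

# Minimal ABI fragment for the standard ERC-721 ownerOf(uint256) view function.
ERC721_OWNER_OF_ABI = [{
    "name": "ownerOf",
    "type": "function",
    "stateMutability": "view",
    "inputs": [{"name": "tokenId", "type": "uint256"}],
    "outputs": [{"name": "owner", "type": "address"}],
}]

rpc_url = "https://YOUR-ETHEREUM-RPC-ENDPOINT"      # placeholder
nft_contract_address = "0xYourNftContractAddress"   # placeholder
token_id = 1                                        # placeholder

w3 = Web3(Web3.HTTPProvider(rpc_url))
contract = w3.eth.contract(
    address=Web3.to_checksum_address(nft_contract_address),
    abi=ERC721_OWNER_OF_ABI,
)

# Because the ledger is public, this read requires no account or permission.
owner = contract.functions.ownerOf(token_id).call()
print(f"Token {token_id} is currently owned by {owner}")
```

The same open, read-only access applies to any record a content-provenance tool anchors on a public chain, which is what makes independent verification possible.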

As AI’s impact on society grows, AI-generated content and deepfakes will only become more prominent. Harvard experts already predict that more than 90% of online content will be AI-generated in the future. To protect against threats such as deepfakes, it’s crucial we get ahead of the issue and implement innovative solutions. Public blockchains, collectively owned and operated by their users, offer promising features such as network security, transparency and decentralization that can help address the problems deepfakes present.

However, much of this work is still in its early stages, and challenges remain in the technical development and widespread adoption of blockchain-based protocols. While there is no quick fix, we must remain committed to shaping a future that upholds truth, integrity and transparency as our society navigates these emerging technologies, and the risks they present, together.

Edited by Benjamin Schiller.

Note: The views expressed in this column are those of the author and do not necessarily reflect those of CoinDesk, Inc. or its owners and affiliates.
