The Web Wasn’t Built For Privacy – But It Could Be

Privacy means different things to different people. To some, it means secrecy. To others, anonymity. To still others, it’s associated with criminality.

But privacy is really about power. 

When the web was invented, its openness was key. “The dream behind the Web is of a common information space in which we communicate by sharing information,” Tim Berners-Lee, inventor of the World Wide Web, wrote in 1997. “Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished.”

That openness encouraged people around the world to move their lives, in part, online, and with them their data, identities, financial information and other key components of their lives. The global pandemic has only accelerated that migration. Now, that information has slipped from our grasp and is under the purview of nation-states, bad actors, advertisers, social media giants and others.

This essay is part of CoinDesk’s “Internet 2030” series.

The old saying goes that on the internet, no one knows you’re a dog. But at this point, centralized authorities not only know you’re a dog, but also what breed you are, what your favorite kibble is and whether you’ve been microchipped. More often than not, it’s because you told them.

Our notions of privacy used to begin with the boundary of the physical body, but that boundary no longer makes sense. The internet is everywhere, and the lines between our bodies and the internet are getting blurrier, notes Amy Zalman, a part-time professor at Georgetown University and CEO of the foresight consultancy Prescient. The boundary is blurred by how we consent to data being shared and by the data we give up to connected devices like video doorbells or smart locks.

“Our devices are not just connected to the internet, but each other, and the institutions we want privacy from,” said Zalman. “We want privacy from those institutions penetrating us and slicing and dicing us up and giving that information out in various ways.”

And what isn’t being shared is being leaked. Zooko Wilcox, cypherpunk and CEO of the Electric Coin Company, likens the internet to a bucket full of holes, spilling water (our data) all over the floor. Using discrete privacy tools like VPNs just plugs one of those many leaks.

“If you have pervasive leakages, then whoever’s the most powerful gains from that,” said Wilcox. “If we have an internet in 10 years where almost everyone uses Facebook for almost everything then that’s a privacy problem that immediately leads to a power problem.”

Wilcox said the people who argue you don’t need privacy if you have nothing to hide are comfortable within the status quo. They aren’t being persecuted for dissent. They aren’t attending social justice protests in the U.S. while being digitally tracked and having dossiers assembled on them, and they aren’t members of China’s Uighur Muslim minority, who are digitally monitored and locked up in camps.

“Privacy is just a means to an end,” said Wilcox. And that end is some level of reclaiming power from those who disproportionately possess it. 

How do we reclaim privacy?

But the question of what we mean by privacy rises again when it comes time to ensure it. Do we do so through policy and law? Through tech? Can the internet of today, the way it’s constructed, even preserve our privacy?

Jon Callas, a senior technology associate at the ACLU, said that as an engineer, the first thing he thinks of for a privacy-focused web is the requirements statement: the discrete goals and workflow of any project. Such specifications might work when applied to a single project, but they are ill-suited for tackling something as broad and multifaceted as a private web.

“Give me a use case and a scenario. That would be a touchstone that I could use to put things together,” said Callas. 

Recent polling shows that 2020 could well be an inflection point for privacy, a time in which the U.S. population might be open to scrutinizing what we mean by privacy, and willing to value it in ways we haven’t previously. 

Eighty-one percent of Americans say they have little control over the data companies collect about them, Pew Research finds, and a similar share say the same about data collected by the government. A majority thinks the risks of companies and the government collecting their data outweigh the benefits.

Between China’s Great Firewall, the U.S. considering anti-encryption bills and the general fracturing of the internet under the guise of cyber sovereignty, a private web is more important now than ever. 

A day in 2030

You wake up in 2030. Many things look the same. You still have your computer. Your phone. Slack probably still exists. 

When you log on, navigating the web is up to you, but it’s bolstered by your own AI. The AI starts up as soon as you log on, and while you’re working, so is it. It’s trawling the web, parrying spam and searching information indices without feeding you the top sites Google would surface to keep you in its walled garden as long as possible. Unlike AIs that work on behalf of a company, this one has a single fiduciary responsibility: you.

Wilcox said we already rely on algorithms and AI for many parts of our lives. Facebook’s news feed decides which friends we see most often. Google decides what information we get. And while, sure, there is an element of convenience to that, it’s not serving you. Such technology is ultimately designed to serve the company.

“Maybe you have the same thing with AI that helps you manage your text messages for your friends, or maybe you even have one that’s loyal to your family,” said Wilcox. 

Callas echoed this idea, imagining a privacy-oriented web where an AI monitors your security, looks for data leaks or filters spam. Gmail already does something along these lines, flagging spam and sorting emails into categories such as Primary or Promotions.

But imagine that AI writ large, existing alongside you on the internet. In ten years, the frequency of attacks and attempted data breaches is unlikely to decline, and such attacks happen at a speed and scale that make them difficult for a person to counter in real time.

(Lianhao Qu/Unsplash)

Alongside this, Callas said we also might need to rethink the open access nature of the internet. We have many open access systems. For example, you can call any phone number you want. You can text any number you want. But computers enabled us to send hundreds of texts in seconds to people who don’t want them. Giving people more agency and consent, through an AI like this, might mean you have to close access to some of these systems, or at least make them dependent on permissions. 

In such a scenario, someone may try to call you, only to be paused by your AI. Callas lays out a scenario in which such an AI might see that this person has written you an email before, asking you to speak. It’d then go over to LinkedIn and see there is one person you have in common and might suggest you take this call.

“There are abusive things in the internet structure right now,” said Callas. “So we need to have explicit relationships when it comes to information sharing, because some of it we may well be okay with.”

The challenge there is making those relationships explicit, when so much of the data we share is determined by opaque terms of service, third parties, and other data sharing agreements. 

Callas compares our current data rights to the time before food labeling, when companies didn’t have to disclose their ingredients. He can see new rules like that coming down the pike.

Apple, which has sought to distinguish itself among big tech companies for its privacy stance, is introducing a nutrition label of sorts for data in its new operating system, disclosing at a glance what an app collects.

There are also tools like VPNs and encryption. But often, to return to Wilcox’s bucket metaphor, you’re just plugging holes that are a fundamental part of the underlying structure of the internet, at least as it stands.

“Privacy is the elimination of all the holes that are exposing you to someone who would exploit or take advantage of you,” said Wilcox. “That’s not a feature. That’s like an emergent property of the whole system, the whole internet.”

For Michelle Dennedy, a privacy lawyer who has worked at Cisco, Intel and elsewhere, it comes down to functionalizing consent. Breaking processes down into forms of authorization, and having a way to give that authorization on multiple levels, will be key.

“How long can you look at something? How long is it authorized? These things have to be explicit, and based on informed consent. When I go to the doctor and take my clothes off, I don’t expect there to be cameras in there broadcasting that business for the world to see. But that’s what we have online.”

She sees a future where we use a universal modeling language to give software explicit guidelines for how to manage privacy. What data it’s getting, why it’s getting it, where it’s being stored, who it’s being shared with: once those decisions are made, they can be enforced not just by laws or policies, but by the technology itself.
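As a rough, hypothetical sketch of what such functionalized consent could look like in software (the field names and the is_authorized check below are illustrative assumptions, not any existing standard), an authorization can be represented as data that a program checks before every access:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsentGrant:
    purpose: str           # why the data is being collected
    data_categories: list  # what data is covered
    storage_location: str  # where it may be stored
    shared_with: list      # who it may be shared with
    expires: date          # how long the authorization lasts

def is_authorized(grant: ConsentGrant, purpose: str,
                  category: str, recipient: str, today: date) -> bool:
    """Deny anything that falls outside the explicit, time-limited grant."""
    return (today <= grant.expires
            and purpose == grant.purpose
            and category in grant.data_categories
            and recipient in grant.shared_with)

# Hypothetical example: a telehealth grant that does not cover advertisers.
grant = ConsentGrant(purpose="diagnosis", data_categories=["heart_rate"],
                     storage_location="clinic-server", shared_with=["my_doctor"],
                     expires=date(2030, 1, 1))
print(is_authorized(grant, "ad_targeting", "heart_rate", "ad_network", date(2029, 6, 1)))  # False
```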

Intractable problems

Today’s privacy controls are features bolted onto the frame of the internet itself.

“Historically, the internet did not include privacy protections, so people tried to bolt privacy onto the internet,” says Harry Halpin, a radical open-internet advocate and CEO of Nym, a privacy-tech startup. “The way they do that is create a virtual network on top of internet protocols, called an overlay network.”

From there, said Halpin, it’s a matter of disrupting the packets of data that flow through the web, carrying everything from search queries to instant messages. Those packets generate metadata, which is essentially information about the data being sent.

This, for example, is how the NSA tracked and mapped terrorist suspects’ calls: by seeing what numbers they called, how long they talked and how often. The data about the data can tell you a lot about the data itself.
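To see how revealing metadata can be on its own, consider a toy example (the records below are invented): even without the content of a single call, counting who talks to whom, when and for how long exposes the shape of someone’s relationships.

```python
from collections import Counter

# Call-detail records contain no conversation content, only metadata.
call_log = [
    {"caller": "A", "callee": "B", "minutes": 42, "hour": 23},
    {"caller": "A", "callee": "B", "minutes": 37, "hour": 23},
    {"caller": "A", "callee": "C", "minutes": 2,  "hour": 9},
]

# Frequency and timing alone reveal who matters to whom.
pairs = Counter((c["caller"], c["callee"]) for c in call_log)
late_night = sum(1 for c in call_log if c["hour"] >= 22)

print(pairs.most_common(1))        # [(('A', 'B'), 2)] -- the strongest link
print(late_night, "late-night calls")
```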

Nym scrambles that metadata through a structure known as a mixnet, which mixes packets of data together, repackages them and thereby renders the metadata unintelligible compared with what it was before.
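A minimal sketch of the batching-and-reordering idea behind a mixnet follows. It illustrates only the concept, not Nym’s actual design, which also layers encryption so each hop changes what a message looks like on the wire.

```python
import random

class ToyMixNode:
    """Hold messages until a batch fills, then flush them in shuffled order,
    so an observer can't link inputs to outputs by arrival order or timing."""

    def __init__(self, batch_size: int = 4):
        self.batch_size = batch_size
        self.pool: list[bytes] = []

    def receive(self, message: bytes) -> list[bytes]:
        # A real mix node would also strip a layer of encryption here, so the
        # bytes leaving the node never match the bytes that entered it.
        self.pool.append(message)
        if len(self.pool) < self.batch_size:
            return []                      # keep holding until the batch fills
        batch, self.pool = self.pool, []
        random.shuffle(batch)              # break the input/output ordering
        return batch
```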

Halpin recognizes, though, that Nym operates fundamentally on an overlay network, not on the underlying protocol of the internet itself. To really get at the problem, you’d have to go one layer down, to the server-level infrastructure controlled by internet service providers.

“We don’t have access to rebuild the fundamental protocols, and even if you rebuild the protocols, you’d then need to build some of the protections into the routing network and the fundamental hardware,” said Halpin. “Which I think is possible in the future. In that way, you could imagine a completely private internet, with data that’s resistant to mass surveillance and also using very few identifiers.”

From the person-oriented AI, to legal consent enshrined in tech, to privacy built in at the server level, there are a number of tools that can be used to plug holes in privacy. But to make a truly privacy-focused web, you’d need to get rid of the honeycombed bucket we have and build one that doesn’t leak at all.

The slow march of development

A privacy-oriented web would be a challenge even if there weren’t large companies and governments interested in preventing it. But some experts also say the internet simply doesn’t develop quickly: building a new browser or rethinking email are projects that, in and of themselves, take a long time.

Callas began our conversation by discussing how we could improve email, which is one discrete part of the internet. But he said such a project would take ten years. 

And while some developers and companies have lived by sayings such as “Move fast and break things,” the stakes are much higher when you’re trying to build something for privacy. If it’s broken, it negates its entire reason for existing.

Dan Guido, CEO of cybersecurity firm Trail of Bits, said that while we are likely to see mild improvements in tools like encryption, a more privacy-protecting browser would be a huge lift. He is surprised that projects like Mozilla Firefox still exist, given that most browsers are developed by huge companies with an incentive to direct users to their own products. Weeks after we spoke, Mozilla laid off 250 people.

“I think that the internet in 10 years is going to look a lot like the internet now with a few minor modifications,” said Guido. “But there will be this divergence of haves and have nots in terms of security and privacy that’s really clear and easy to see, and that grows wider every day.”

In his work as a security professional, he sees that gap most clearly in consumer-facing products versus enterprise ones. Consumer-facing browsers like Chrome or Safari are doing a better and faster job of updating their privacy protections than, for example, enterprise networks that prize ease of use, stability and interoperability. Just think about how hard it is to get everyone in a workplace to use two-factor authentication.

He said that some of the major privacy protections might, a little ironically, come through big players like Apple and Google, which are working on these issues, and already have their devices in the hands of millions. 

Callas also expressed openness to vestiges of today’s internet living on in 2030. He doesn’t mind ad targeting, for example, in part because he recognizes it supports so much free content on the internet. But he wishes it were more accurate.

This is where the idea of reconceptualizing how we think of privacy is crucial. Because, again, the concept means different things for different people. Callas might be okay with good ads. I might not. I may be okay with a messaging service devoid of frills and run on an independent server. Some people will kill for their emojis. 

A private web in 2030 will likely not have everything any one person wants. But whether it’s a person-focused AI or just better encryption, it would offer greater control. Giving people more agency than they have today, when so much of what happens online is opaque to the end user, seems like a good, logical step.

“In the future internet, I’ll have all the things I need, and everyone has all the things they need to give them real autonomy and real human dignity,” said Wilcox. 

“They’ll have the ability to socialize and form connections with friends and family and whoever without any third party being able to intermediate either to supply or to censor or to influence their relationships.” 
