
Ruben Verborgh

Trust takes time

Only people can trust, but only machines scale well.

Today’s websites and apps are built to compensate for an absence of trust, rather than to support its growth. Customers and companies both understand that human handshakes no longer scale in the digital age, and surrender to their replacement by a tap of the finger on a button labeled I accept. While every handshake comes with an expectation of honesty, this button never did: we know we are lying the moment we touch it, as does the author of the legalese no one expects anyone to read. Navigating a trustless world is a heavy price to pay for the boundless freedoms of the World Wide Web—except it doesn’t have to be like this. Although trust remains an inherently human relationship, machines can help us build and evolve our relationships by handling the fiddly bits on our behalf. This liberates people to refocus on gaining mutual value from our relationships, without the huge burden of their setup and maintenance. Everything starts with the realization that data technology is fundamentally trust technology, and thus a transversal layer within any ecosystem rather than just the icing on the cake.

Trust is a journey

Weally, daddy? Weally?
He’s 6 and looks at me, as if I know anything about this world that he does not.

Trust is everything. When they’re small. And when you wish they still were.
It’s instinctive, until it’s not. It’s earned. It’s built.
It breaks. It gets lost. It vanishes for good.
It gets restored.

Nothing makes us more human than our ability to trust another person. When they deserve it, when they’re worthy of it. And unfortunately when they’re not.

I tell him about the tooth fairy, to instill within him the importance of dental hygiene. With the same certainty that a baby tooth underneath his pillow will be rewarded with a shiny gemstone, he recites that morning and evening brushing sessions are supposed to last 2 minutes each and touch upon every single tooth (yes, also those in the back).

I tell him about the ginormous banana peel on the road that caused the car in front of me to spin uncontrollably, so then we all drove very slowly just in case more supersized fruits appeared. And they did—so that’s why daddy arrived a little late today, my boy.

Weally, daddy?

Well, not really. We both laugh as he remembers that giant bananas probably ought to not exist, or at least, that the likelihood of his father pulling his leg—again—far outweighs the practical hassle of landscaping the trees on which they’d have to grow. We like to keep each other sharp.

[photograph of a person holding the hands of a child]

Trust is the essence of every human relationship. For every person, every parent, and every child. We’ve been building this relationship between us from the day he was born. Connecting from one crafty mind to another, we more than most desire to understand the certainties in life within a world that appears so deeply uncertain. And that’s okay. That’s why I’m here.

It doesn’t mean I’ll never lie or disappoint. Or that he wouldn’t have those options.
It means that ultimately—even when this inevitably happens—I’ll be there.

There will be ups and downs. We’d both be lying if we pretended there wouldn’t be.
What matters is that I’ll show up.
Perhaps not for every single up. Most definitely for every single down.

They say trust is earned and not given, although that’s generally untrue. Oftentimes trust is thrown, vulnerably, into our unsuspecting hands, begging us to become the one who judges whether we’re worthy of it or not. Can we trust ourselves enough to make that important decision? In any case, as people, we must.

It’s always a risk. Sometimes for a reward, other times for a lesson.
I wish there was a way for him to learn without the lessons that teach us.
Yet I refuse to reduce trust from a leap of faith to merely fate.

Trust is a relationship. It evolves over time.
It’s an inherently human relationship that anchors our presence into a changing world.

Relationships exist for mutual value

Distrust reduces value for all parties

People don’t trust companies with their data. Neither do the companies themselves, because breach after breach exposes how terrible most of them are at keeping personal data safe. That shouldn’t surprise anyone: safeguarding data is an extremely complex matter, and not even the core business of most of the companies involved. As the Big Data craze conned every business into believing they’re now a data company, it’s not hard to imagine that many of them begrudgingly continue to implement personal data storage, for lack of a more readily available universal alternative.

Considering trust as a relationship leads to a slightly more surprising consequence: if people don’t trust companies anymore, it necessarily also means that companies don’t trust people anymore.

We could ridicule those poor companies crying in a corner about how no one trusts them, or we could recognize that they or anyone might mock us in the same way. That’s what it means to be in a relationship: ultimately, we’re both in the same boat. A lack of trust, and hasty attempts to bridge that gap, will eventually hurt us both—only the timelines might slightly differ.

There are no winners when there is no trust, and when establishing sustainable trust relationships is cumbersome or difficult. No potential for trust means no potential for growth or evolution: the relationship starts at a ridiculously high threshold, only to immediately hit an unsatisfactory plateau. Nothing symbolizes the failure of trust more than the dreaded consent dialog. In case you’re reading this from a peaceful future where these no longer exist, this is what they usually look like:

[screenshot of Spotify titled “We Care About Your Privacy”, detailing that they and their 702 partners aim to store and/or access information]
Companies do not trust people to trust them, and their websites ask us to accept that fact.

We could lament endlessly over the regrettable state of privacy, yet that won’t change that we’re all losing here. Specifically, all 704 of us, because Spotify invited some of its close friends into what clearly nobody expects to turn into a trust relationship anymore. Here I am, wanting to buy a service. And there they are, wanting something else.

Many companies out there want a relationship with us, more precisely with our data—or so it seems. The current construction of those relationships more closely resembles parasitism rather than commensalism. Such a one-way approach is unsustainable; it’s a short-term game that hurts both parties in the long term. The more frustrating realization is that it doesn’t have to be this way. So much unexplored mutual benefit lies dormant in most of our relationships, because panicked companies grab what they consider low-hanging fruits, even when slightly rotten or outright toxic.

The main result is that companies are nowhere near the most valuable data points that could create and sustain growth in our relationship, and people are getting almost no value out of their own data. As long as we continue to perpetuate the ludicrous hunter–gatherer mindset over imagined data scarcity, we limit the value of our own data to what we can do with it ourselves, with our own hands. Which is, let’s be honest, not very much: other parties could probably help us substantially. Yet we’re told that privacy is data’s holy grail, the ultimate compass for which value we should and shouldn’t have.

Responsible access is more than privacy

The truth is that achieving 100% privacy is simple: throw all of your paperwork and devices into a box, fill it with concrete, and drop it in the nearest ocean. Nobody will be able to access your data—not even you. And that’s precisely the problem. The majority of discussions misrepresent privacy as a single slider, a simple Yes, please! knob everyone obviously wants to crank all the way up all the time.

But it’s unproductive and utterly meaningless to look at privacy as one-dimensional. Nobody protects their home from burglars by cementing all the doors shut and throwing away all keys. The real challenge is that we still need access! We need to enter our houses, so we use keys. We need to log into our accounts, so we use passwords. Unfortunately, anything that makes it easier for us to get in, also makes it easier for others. That’s the fundamental tension: we cannot infinitely make the unwanted harder without ultimately making the wanted impossible. So let’s talk about trade-offs instead.

People don’t only want privacy. We want utility from our data. We deserve utility from our data. We need our data to work for us. How can we get that utility without unnecessary compromise? The information security community disentangles the mythical maximal privacy button into three nuanced sliders in the so-called CIA triad:

[a triangular tension field between Confidentiality, Availability, and Integrity]
The CIA triad more precisely refers to the vague notion of privacy with Confidentiality, which it balances against the desired Availability of the data, as well as the Integrity with which that data is stored and exchanged.

The moral of the story is that no one can simultaneously maximize all three sliders:

  1. Perfect confidentiality means that our data won’t be there when we need it.
Aisha may or may not be deathly allergic to the food you’re about to serve them.

  2. Perfect availability means that everybody knows.
    Everyone, Aisha left the keys under the doormat when leaving for the restaurant!

  3. Imperfect integrity compromises with a mix of confidentiality and availability.
    Aisha has some kind of food allergy, and eats out on Wednesdays or Fridays.

That’s why I insist that, contrary to popular belief, our data isn’t flowing well enough. Our availability problems remain so much bigger than our confidentiality issues. We’re missing out on enormous value from potential usage of our data. Despite the scale of personal data misuse, the majority of our current confidentiality is nothing but an accidental side effect of our data not being used to our maximal advantage. And make no mistake: while the triad prohibits simultaneous maximization, minimizing all three sliders is quite possible. That’s why we often end up with the worst of all worlds: even in cases where we forcibly give up confidentiality, we aren’t subsequently receiving the expected value from the increased availability potential.
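As a toy illustration (my own sketch, not from the security literature), we can model the three sliders as values competing for a shared, invented “budget”: such a constraint makes simultaneous maximization infeasible while leaving simultaneous minimization perfectly possible.

```python
from dataclasses import dataclass

@dataclass
class DataPolicy:
    """Hypothetical scores in [0, 1] for each dimension of the CIA triad."""
    confidentiality: float
    availability: float
    integrity: float

    def is_feasible(self, budget: float = 2.0) -> bool:
        # Invented linear constraint: the sliders compete for a shared budget,
        # so no policy can max out all three dimensions at once.
        return self.confidentiality + self.availability + self.integrity <= budget

# The concrete-filled box in the ocean: perfectly confidential, never available.
locked_box = DataPolicy(confidentiality=1.0, availability=0.0, integrity=1.0)
# The mythical "maximal privacy" button that also keeps all the utility.
have_it_all = DataPolicy(confidentiality=1.0, availability=1.0, integrity=1.0)
# The worst of all worlds: giving up confidentiality without gaining availability.
worst_case = DataPolicy(confidentiality=0.0, availability=0.0, integrity=0.0)

assert locked_box.is_feasible()
assert not have_it_all.is_feasible()
assert worst_case.is_feasible()  # minimizing everything is, sadly, allowed
```

The exact constraint is made up; the point is only that the feasible region excludes the all-maximal corner, yet happily includes the all-minimal one.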

Let’s halt the narrow gaze on privacy and broaden our focus to the confidentiality/availability trade-off. We’re so preoccupied with preventing the other party in the relationship from gaining value from our data that we are neglecting to receive crucial value from our data ourselves. When we assume that people’s only role in business-to-consumer relationships is shielding off their data, and companies’ only role is to seize it, neither of us will ever get the utility from our data that would make us both better off.

Consent cannot substitute trust

Brief Explosion Of Trust

The most interesting question isn’t what companies want—because, frankly, we’ll never know unless we scrutinize all 703 privacy policies. The key to understanding today’s lack of trust is investigating why they want data, and more specifically why they need it now.

It’s because every relationship between a person and a company nowadays roughly follows a timeline like this:

[an interaction timeline, starting with a consent dialog, and then loads of data exchanges]
Consent technology sells the illusion that all legal aspects of a complex customer–company relationship can be settled with a single click, which precedes that relationship entirely. Such a Brief Explosion Of Trust eliminates any potential for growth.

Every relationship is started by what I call a Brief Explosion Of Trust, or BEOT for short. A BEOT is the 5- to 10-second span in which a company pretends to settle all of the trust required for a potentially multi-year relationship, deceiving themselves into believing such magical hand-waving to be legally meaningful or somehow technologically fixable. After this intense moment, loads of exchanges take place, in which data, services, goods, and payments change hands. We no longer have to build trust, because consent dialogs are supposedly so powerful that they perform all of the heavy lifting beforehand.

I’m by no means a lawyer, but I am a technologist, and a human living in the real world. As a parent, witnessing the daily investment into the trust relationship with my sons, as a friend, as a colleague, as a partner, and as a former 90s kid in a small town where local bakeries and butchers would gain my loyalty with post-purchase candy, I can assure you: nothing can reduce trust to a 5-second job. That’s not a technological or legal limitation, but a proper recognition of the nature of trust as a human relationship.

Our BEOT concept has little to do with trust, but everything to do with the Old English bēot, which is a sort of ritualized promise. Before going to war, an Anglo-Saxon warrior would proclaim a bēot to signal their acceptance of a virtually impossible challenge, such that tremendous glory would be theirs when they actually accomplished it. I suppose it’s a bit like when I visit a website, and before even being allowed to inspect the battleground, I accept an impossible vow to the GDPR gods, based on a piece of text that leaves me functionally more illiterate than the average Anglo-Saxon. Yet I patiently await my glory, increasing my chances through multiple BEOT rituals each day.

Such war metaphors vividly characterize the news media’s undying love for portraying individuals in supposedly endless David versus Goliath battles, in which we tirelessly fight to protect our data from the giant corporations whose sole aim is to steal it all. Such total nonsense fails to capture reality for at least two reasons:

  1. People need more than privacy, as we reviewed earlier.
  2. Companies don’t want to steal data.

In all fairness, most people think they want privacy, and a lot of companies are stealing personal data, so I understand the confusion. But remember that we’re looking into relationships, so whatever hurts one party eventually hurts the other in the long run.

Therefore, we can only come to a mutual understanding by taking both parties’ concerns seriously—which is where today’s trust technology hopelessly fails us all. The state of the art consists of Consent Management Platforms, whose job is to generate the hideous dialogs nobody understands or reads, just so we can all pretend we did, and move on like nothing happened, because nothing actually did.

Everyone knows people don’t want this. Why would anyone think companies want this? Make no mistake: Spotify doesn’t want this either. They do it because they assume they have to. Because it’s what the state of the art in techno-legal solutions has to offer. Companies have been fed the BEOT fable for much longer than we have. Every click on I accept is one of five daily lies for us, and one of thousands for them. They’re being sold the deception that one moment can establish lasting relationships, and that the law—as a result of initiatives like GDPR and the Data Act—leaves no better alternative.

Trust needs a timeline

What sustainable relationships look like

And when we remember that trust is a relationship, a much more profound loss appears: why is it so hard to enthusiastically participate? What if I want to share my data for a specific purpose, because it gives me a benefit I need?

Interestingly, there are troves of data that people would love to share, which many companies are actually afraid to process. Because the more sensitive a piece of data is, the more specific it is to a certain person, and hence the more potential it holds for offering them unique insights or support.

I do want my supermarket to know that my microwave oven is broken, and that I have 7 people over for dinner tonight, and also I came by bike because the van is in the shop, and I have a cat that only likes a specific brand of tuna, which obviously I forgot about now since I clearly have much bigger worries. And on a more serious note, I do want my medical information to be immediately available to any doctor or passerby who finds me in an absolute state of panic.

None of those and millions of other examples will ever happen through a single upfront BEOT of consent, because the whole point of a relationship is that you don’t know where it is going. So that’s exactly what both people and companies need: not systems that pretend to know where we’re going, but rather systems that recognize we don’t yet know where we’re going. We need to allow for evolution and open-endedness.

Because a real-world trust timeline doesn’t resemble a BEOT, but looks like this:

[an interaction timeline in which every exchange of data comes with a little bit of trust]
An evolving trust relationship has no need for overreaching consent dialogs. The exchange of each piece of data is accompanied by a dedicated piece of trust, protecting both parties.

Comparing this to the current timeline of Consent Management Platforms, we notice:

  • On the first website visit, nothing happens. That makes sense because, just like in a physical shop, we are just browsing. It’s literally called a Web browser. We weren’t planning any commitments yet; we’re investigating whether we have any interest in doing so. This is the exploration phase.

  • Nobody needs to make any promises they don’t understand or can’t keep. No single point in the relationship needs to establish the exhaustive set of ground rules for all of the possibly forthcoming interactions.

  • Trust is entirely incremental. Each exchange of data only requires the trust pertaining to that specific exchange. This includes assertions about the history of the data, indicating its origin and reliability. It also includes policies on the destiny of the data, describing the allowed usage.

    • The encapsulating trust envelope protects the person, because it facilitates correct processing of the correct data.

    • The trust envelope equally protects the company, because it provides them with specific, legally valid proof as evidence in automated legal audits.
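As a sketch of what such a trust envelope could look like in code (all names and fields are hypothetical, not an existing API), each exchanged piece of data carries its own history and destiny:

```python
from dataclasses import dataclass

@dataclass
class Provenance:
    """History of the data: where it came from and who vouches for it."""
    source: str      # e.g. "user-entered"
    issued_by: str   # party asserting the data's origin and reliability

@dataclass
class UsagePolicy:
    """Destiny of the data: what the recipient is allowed to do with it."""
    purpose: str        # the single purpose this exchange covers
    expires_days: int   # how long the permission lasts

@dataclass
class TrustEnvelope:
    """Wraps one piece of data exchanged at one point in the timeline."""
    data: dict
    provenance: Provenance
    policy: UsagePolicy

    def permits(self, purpose: str) -> bool:
        # Protects the person: processing is limited to the stated purpose.
        # Protects the company: the envelope doubles as auditable evidence.
        return purpose == self.policy.purpose

envelope = TrustEnvelope(
    data={"allergy": "peanuts"},
    provenance=Provenance(source="user-entered", issued_by="Aisha"),
    policy=UsagePolicy(purpose="meal-preparation", expires_days=1),
)

assert envelope.permits("meal-preparation")
assert not envelope.permits("advertising")
```

The design choice to attach the policy to each individual exchange, rather than to the relationship as a whole, is precisely what keeps the relationship open-ended.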

The key aspect is the technical support for the evolvability of the trust relationship. Companies’ data harvesting behavior is largely driven by the fear that it will be too expensive to obtain the data they need at a specific point in the future. Within a sustainable relationship, all you have to do is ask: the right data and trust can be made available the moment they add value for both parties.

And if it doesn’t, well, try asking for more and better data. Since we’re no longer limited by the BEOT at the beginning, any question can now be posed, and some will result in new mutually beneficial data points that most company lawyers had long given up on.

The GDPR itself already acknowledges this breadth: consent is merely one of six legal bases it offers for processing personal data.

  1. Consent is the most (ab)used but in many ways the least flexible.
  2. Legitimate interest is often claimed, but not always sincerely.
  3. Vital interest safeguards the individual.
  4. Public interest safeguards society.
  5. Legal obligation ensures legal processes can be executed.
  6. Contract allows agreed-upon transactions to take place.
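To make the contrast with blanket consent concrete, here is a small sketch (the exchange names are invented) in which every individual exchange is tagged with the legal basis that actually fits it, instead of consent covering everything upfront:

```python
from enum import Enum, auto

class LegalBasis(Enum):
    """The six lawful bases for processing personal data under the GDPR."""
    CONSENT = auto()
    LEGITIMATE_INTEREST = auto()
    VITAL_INTEREST = auto()
    PUBLIC_INTEREST = auto()
    LEGAL_OBLIGATION = auto()
    CONTRACT = auto()

def basis_for(exchange: str) -> LegalBasis:
    """Pick a basis per exchange instead of one upfront blanket consent."""
    mapping = {
        "emergency-medical-record": LegalBasis.VITAL_INTEREST,  # safeguards the individual
        "delivery-address": LegalBasis.CONTRACT,                # enables the agreed transaction
        "tax-report": LegalBasis.LEGAL_OBLIGATION,              # required by law
    }
    return mapping.get(exchange, LegalBasis.CONSENT)  # consent only as a fallback

assert basis_for("emergency-medical-record") is LegalBasis.VITAL_INTEREST
assert basis_for("newsletter-signup") is LegalBasis.CONSENT
```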

Towards trust technology

While making it marginally more expensive to do the wrong thing, GDPR made it substantially more expensive to do the right thing. In a reality with finite resources, appealing to morality is a double-edged sword: society has constructed an economy that rewards companies for financially responsible choices. Loudly complaining when they subsequently choose the most cost-effective path towards personal data paints a heroic battleground picture, but does very little to improve the status quo—and in fact further reaffirms it.

Realistically, we must turn the tide by building technology that makes responsible data handling cheaper than its alternative. Calling out the simple economics of the situation sounds like a cynical statement, but it doesn’t need to become an inconvenient truth if we pay close attention to mutual benefit.

This is where current systems for personal data management fall short: by admitting their own failure to identify win–win opportunities, they don’t even try maximizing the value for both parties. They cannot establish sustainable relationships, because the entire system design embodies the absurd belief that one-sided relationships represent the best we can all agree on. Yet by definition, there is much more value in a two-sided relationship, where contributions benefit both parties.

Although most (ab)use of consent online remains legally unchallenged to this day—and thus still the cheapest option—its interaction timeline clearly illustrates that upfront consent precludes evolvable trust. Even if never effectively ruled unlawful, the concept is already stretched so far beyond meaningful boundaries that no one should expect any more growth for consent-based technologies. Companies will not obtain additional data or insights via consent mechanisms, and people will not gain additional protections.

Due to technology’s fixation on consent, we all share in the consequences:

  • People are missing out on the value within their data that could drastically improve their experiences and quality of life.

  • Companies are missing out on crucial data that could improve the quality of their service offering and their competitive position.

The time is ripe for a new generation of techno-legal systems, both on the side of people and on the side of companies. These systems set up and maintain long-term relationships and negotiate mutual benefit, which ultimately forms the driver of all economic transactions. The cornerstone of the monetary economy is that you and I have something the other desires, and an exchange can result in a positive outcome for both. Removing unnecessary friction facilitates more and better exchanges for all.

Although data informs many of our daily interactions, trust will always remain a relationship between people. Machines cannot replace our trust, but they could so much more precisely support us in getting the value we deserve from our relationships. That’s why trust must be an integral part of data ecosystems, with respect for its evolving timeline rather than obsessing about its start. Making our data work for us truly starts when technology makes our online relationships scale in a sustainable way.
