Trust takes time
Only people can trust, but only machines scale well.
Today’s websites and apps are built to compensate for an absence of trust, rather than to support its growth. Customers and companies both understand that human handshakes no longer scale in the digital age, and surrender to their replacement by a tap of the finger on a button labeled “I accept”. While every handshake comes with an expectation of honesty, this button never did: we know we are lying the moment we touch it, as does the author of the legalese no one expects anyone to read. Navigating a trustless world is a heavy price to pay for the boundless freedoms of the World Wide Web.
Trust is a journey
Weally, daddy? Weally?
He’s 6 and looks at me, as if I know anything about this world that he does not.
Trust is everything. When they’re small. And when you wish they still were.
It’s instinctive, until it’s not. It’s earned. It’s built.
It breaks. It gets lost. It vanishes for good.
It gets restored.
Nothing makes us more human than our ability to trust another person. When they deserve it, when they’re worthy of it. And unfortunately when they’re not.
I tell him about the tooth fairy, to instill within him the importance of dental hygiene. With the same certainty that a baby tooth underneath his pillow will be rewarded with a shiny gemstone, he recites that morning and evening brushing sessions are supposed to last 2 minutes each and touch upon every single tooth (yes, also those in the back).
I tell him about the ginormous banana peel on the road that caused the car in front of me to spin uncontrollably, so then we all drove very slowly just in case more supersized fruits appeared. And they did—
Well, not really. We both laugh as he remembers that giant bananas probably ought to not exist, or at least, that the likelihood of his father pulling his leg is considerably higher.
Trust is the essence of every human relationship. For every person, every parent, and every child. We’ve been building this relationship between us from the day he was born. Connecting from one crafty mind to another, we more than most desire to understand the certainties in life within a world that appears so deeply uncertain. And that’s okay. That’s why I’m here.
It doesn’t mean I’ll never lie or disappoint. Or that he wouldn’t have those options.
It means that ultimately, we can count on each other.
There will be ups and downs. We’d both be lying if we pretended there wouldn’t be.
What matters is that I’ll show up.
Perhaps not for every single up. Most definitely for every single down.
They say trust is earned and not given, although that’s generally untrue. Oftentimes trust is thrown, vulnerably, into our unsuspecting hands, begging us to become the one who judges whether we’re worthy of it or not. Can we trust ourselves enough to make that important decision? In any case, as people, we must.
It’s always a risk. Sometimes for a reward, other times for a lesson.
I wish there was a way for him to learn without the lessons that teach us.
Yet I refuse to reduce trust from a leap of faith to merely fate.
Trust is a relationship. It evolves over time.
It’s an inherently human relationship that anchors our presence into a changing world.
Relationships exist for mutual value
Distrust reduces value for all parties
People don’t trust companies with their data. Neither do the companies themselves, because breach after breach exposes how terrible most of them are at keeping personal data safe. That shouldn’t surprise anyone: safeguarding data is an extremely complex matter, and not even the core business of most of the companies involved. As the Big Data craze conned every business into believing they’re now a data company, it’s not hard to imagine that many of them begrudgingly continue to implement personal data storage, for lack of a more readily available universal alternative.
Considering trust as a relationship leads to a slightly more surprising consequence: if people don’t trust companies anymore, it necessarily also means that companies don’t trust people anymore.
We could ridicule those poor companies crying in a corner about how no one trusts them, or we could recognize that they or anyone might mock us in the same way. That’s what it means to be in a relationship: ultimately, we’re both in the same boat. A lack of trust, and hasty attempts to bridge that gap, will eventually hurt us both.
There are no winners when there is no trust, and when establishing sustainable trust relationships is cumbersome or difficult. No potential for trust means no potential for growth or evolution: the relationship starts at a ridiculously high threshold, yet immediately hits an unsatisfactory plateau. Nothing symbolizes the failure of trust more than the dreaded consent dialog. In case you’re reading this from a peaceful future where these no longer exist, this is what they usually look like:
We could lament endlessly over the regrettable state of privacy, yet that won’t change that we’re all losing here. Specifically, all 704 of us, because Spotify invited some of its close friends into what clearly nobody expects to turn into a trust relationship anymore. Here I am, wanting to buy a service. And there they are, wanting something else.
Many companies out there want a relationship with us, or more precisely, with our data, and that difference is exactly where things go wrong.
The main result is that companies are nowhere near the most valuable data points that could create and sustain growth in our relationship, and people are getting almost no value out of their own data. As long as we continue to perpetuate the ludicrous hunter–gatherer mindset over imagined data scarcity, we limit the value of our own data to what we can do with it ourselves, with our own hands. Which is, let’s be honest, not very much: other parties could probably help us substantially. Yet we’re told that privacy is data’s holy grail, the ultimate compass for which value we should and shouldn’t have.
Responsible access is more than privacy
The truth is that achieving 100% privacy is simple: throw all of your paperwork and devices into a box, fill it with concrete, and drop it in the nearest ocean. Nobody will be able to access your data, not even you. Clearly, privacy is not the single “Yes, please!” knob everyone obviously wants to crank all the way up all the time.
But it’s unproductive and utterly meaningless to look at privacy as one-dimensional. Nobody protects their home from burglars by cementing all the doors shut and throwing away all keys. The real challenge is that we still need access! We need to enter our houses, so we use keys. We need to log into our accounts, so we use passwords. Unfortunately, anything that makes it easier for us to get in, also makes it easier for others. That’s the fundamental tension: we cannot infinitely make the unwanted harder without ultimately making the wanted impossible. So let’s talk about trade-offs instead.
People don’t only want privacy. We want utility from our data. We deserve utility from our data. We need our data to work for us. How can we get that utility without unnecessary compromise? The information security community disentangles the mythical maximal privacy button into three nuanced sliders in the so-called CIA triad: confidentiality, integrity, and availability.
The moral of the story is that no one can simultaneously maximize all three sliders:
- Perfect confidentiality means that our data won’t be there when we need it.
  Aisha may or may not be deadly allergic to the food you’re about to serve them.
- Perfect availability means that everybody knows.
  Everyone, Aisha left the keys under the doormat when leaving for the restaurant!
- Imperfect integrity is the typical compromise, mixing partial confidentiality with partial availability.
  Aisha has some kind of food allergy, and eats out on Wednesdays or Fridays.
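To make these sliders tangible, here is a minimal TypeScript sketch; the names and numbers are hypothetical illustrations, not an existing API:

```typescript
// The CIA triad as three sliders in [0, 1], instead of one
// all-or-nothing privacy button. All names are hypothetical.
interface TriadPreference {
  confidentiality: number; // 1 = nobody else can ever read the data
  integrity: number;       // 1 = the data is always complete and correct
  availability: number;    // 1 = the data is always there when needed
}

// The triad's core tension: maxing out every slider at once is the
// mythical "maximal privacy" button, which cannot exist.
function isAchievable(p: TriadPreference): boolean {
  return !(p.confidentiality === 1 && p.integrity === 1 && p.availability === 1);
}

// Aisha's allergy data: highly available and correct for the people
// serving her food, at the cost of some confidentiality.
const allergyInfo: TriadPreference = {
  confidentiality: 0.3,
  integrity: 1,
  availability: 1,
};
console.log(isAchievable(allergyInfo)); // true
```

The exact numbers are beside the point; what matters is that every preference explicitly names its trade-off instead of pretending none exists.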
That’s why I insist that, contrary to popular belief, our data isn’t flowing well enough. Our availability problems are still much bigger than our confidentiality issues. We’re missing out on enormous value from potential usage of our data. Despite the scale of personal data misuse, the majority of our current confidentiality is nothing but an accidental side effect of our data not being used to our maximal advantage. And make no mistake: while the triad prohibits simultaneous maximization, minimizing all three sliders is quite possible. That’s why we often end up with the worst of all worlds: even in cases where we forcibly give up confidentiality, we don’t subsequently receive the expected value from the increased availability potential.
Let’s halt the narrow gaze on privacy and broaden our focus to the confidentiality/integrity/availability trade-off.
Consent cannot substitute for trust
Brief Explosion Of Trust
The most interesting question isn’t what companies want, but why our interactions with them unfold the way they do.
It’s because every relationship between a person and a company nowadays roughly follows a timeline like this:
Every relationship is started by what I call a Brief Explosion Of Trust, or BEOT for short. A BEOT is the 5- to 10-second span in which a company pretends to settle all of the trust required for a potentially multi-year relationship, deceiving themselves into believing such magical hand-waving to be legally meaningful or somehow technologically fixable. After this intense moment, loads of exchanges take place, in which data, services, goods, and payments change hands. We no longer have to build trust, because consent dialogs are supposedly so powerful that they perform all of the heavy lifting beforehand.
I’m by no means a lawyer, but I am a technologist, and a human living in the real world. As a parent, witnessing the daily investment into the trust relationship with my sons, as a friend, as a colleague, as a partner, and as a former 90s kid in a small town where local bakeries and butchers would gain my loyalty with post-purchase candy, I can assure you: nothing can reduce trust to a 5-second job. That’s not a technological or legal limitation, but a proper recognition of the nature of trust as a human relationship.
Our BEOT concept has little to do with trust, but everything to do with the Old English bēot, which is a sort of ritualized promise. Before going to war, an Anglo-Saxon warrior would proclaim a bēot to signal their acceptance of a virtually impossible challenge, such that tremendous glory would be theirs when they actually accomplished it. I suppose it’s a bit like when I visit a website, and before even being allowed to inspect the battleground, I accept an impossible vow to the GDPR gods, based on a piece of text that leaves me functionally more illiterate than the average Anglo-Saxon. Yet I patiently await my glory, increasing my chances through multiple BEOT rituals each day.
Such war metaphors vividly characterize the news media’s undying love for portraying individuals in supposedly endless David versus Goliath battles, in which we tirelessly fight to protect our data from the giant corporations whose sole aim is to steal it all. That total nonsense fails to capture reality for at least two reasons:
- People need more than privacy, as we reviewed earlier.
- Companies don’t want to steal data.
In all fairness, most people think they want privacy, and a lot of companies are stealing personal data, so I understand the confusion. But remember that we’re looking into relationships, so whatever hurts one party eventually hurts the other in the long run.
Therefore, we can only come to a mutual understanding by taking both parties’ concerns seriously—which is where today’s trust technology hopelessly fails us all. The state of the art consists of Consent Management Platforms, whose job is to generate the hideous dialogs nobody understands or reads, just so we can all pretend we did, and move on like nothing happened, because nothing actually did.
Everyone knows people don’t want this. Why would anyone think companies want this? Make no mistake: Spotify doesn’t want this either. They do it because they assume they have to. Because it’s what the state of the art in techno-legal solutions has to offer. Companies have been fed the BEOT fable for much longer than we have. Every click on “I accept” is one of five daily lies for us, and one of thousands for them. They’re being sold the deception that one moment can establish lasting relationships, and that the law can make up for the trust that was never built.
Consent can never scale
Real-world consent is not an approach for building trust, but rather an explicit acknowledgment of a pre-existing relationship or even lack thereof. Which means that, in cases when no prior relationship exists, there is nothing meaningful for consent to acknowledge.
The legal terms inside the Spotify consent dialog barely obscure its desperate screams: “you don’t trust me, and I don’t trust you.”
And the mechanism doesn’t even count as actual consent, for two reasons in particular.
First, consent in daily life is something usually only given when you already trust the other person. For example, within intimate relationships, we choose to give or refuse consent for engaging in activities with a partner, on the basis that we trust this person to respect the boundaries we set together. Circumstances and context vary, which is precisely the point: the desired level of trust is left to the individuals, and then made explicit through a verbal expression of consent or refusal. Similarly, we consent to scheduled medical procedures when we trust the professionals who will perform them.
Curiously, digital consent instead works like an obstetrician performing a C-section on an unsuspecting father who just desperately needed a can of diet soda and happened to be in the wrong room at the wrong time. Consent cannot meaningfully predate trust. Extinguishing any reasonable assessment of trustworthiness, consent platforms execute a half-baked bait-and-switch, cementing the lack of trust from which they try to distract.
Second, the legal world mandates that an agreement be informed and freely given before it can be called consent. Even though laws vary across jurisdictions, I have yet to meet a lawyer willing to risk their career arguing that anyone understands even half of the hundreds of pages we routinely click away, purportedly of our own volition.
To top it off, a built-in limitation of consent is that it cannot be automated, by definition. Only humans can consent, and we don’t scale well, so neither does consent. A legal analysis of GDPR crushes any remaining hopes of our smartphones improving consent by handling it for us. That doesn’t invalidate our smartphone’s potential to play a key role for trust. It just means the legal basis for such technologies won’t be consent. Any viable future direction for digital trust requires scalable technologies that take the concerns of all parties seriously within each potential relationship.
Trust needs a timeline
What sustainable relationships look like
And when we remember that trust is a relationship, a much more profound loss appears: why is it so hard to enthusiastically participate? What if I want to share my data for a specific purpose, because it gives me a benefit I need?
Interestingly, there are troves of data that people would love to share, which many companies are actually afraid to process. Because the more sensitive the piece of data, the more specific it is to a certain person, and hence the more significant it becomes to offer them unique insights or support.
I do want my supermarket to know that my microwave oven is broken, and that I have 7 people over for dinner tonight, and also I came by bike because the van is in the shop, and I have a cat that only likes a specific brand of tuna, which obviously I forgot about now since I clearly have much bigger worries. And on a more serious note, I do want my medical information to be immediately available to any doctor or passerby who finds me in an absolute state of panic.
None of those and millions of other examples will ever happen through a single upfront BEOT of consent, because the whole point of a relationship is that you don’t know where it is going. So that’s exactly what both people and companies need: not systems that pretend to know where we’re going, but rather systems that recognize we don’t yet know where we’re going. We need to allow for evolution and open-endedness.
Because a real-world trust timeline doesn’t resemble a BEOT, but looks like this:
Comparing this to the current timeline of Consent Management Platforms, we notice:
- On the first website visit… nothing happens. That makes sense because, just like in a physical shop, we are just browsing. It’s literally called a Web browser. We weren’t planning any commitments yet; we’re investigating whether we have any interest in doing so. This is the exploration phase.
- Nobody needs to make any promises they don’t understand or can’t keep. No single point in the relationship needs to establish the exhaustive set of ground rules for all of the possibly forthcoming interactions.
- Trust is entirely incremental. Each exchange of data only requires the trust pertaining to that specific exchange. This includes assertions about the history of the data, indicating its origin and reliability. It also includes policies on the destiny of the data, describing the allowed usage.
- The encapsulating trust envelope protects the person, because it facilitates correct processing of the correct data (see the sketch after this list).
- The trust envelope equally protects the company, because it provides them with specific, legally valid proof as evidence in automated legal audits.
- The key aspect is the technical support for the evolvability of the trust relationship. Companies’ data harvesting behavior is largely driven by the fear that it will be too expensive to obtain the data they need at a specific point in the future. Within a sustainable relationship, all you have to do is ask: the right data and trust can be made available the moment they add value for both parties.
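As a thought experiment, such a trust envelope could be modeled as a small data structure that travels with every exchanged data point. This is a hypothetical TypeScript sketch of the idea, not an existing specification:

```typescript
// A hypothetical trust envelope: each exchanged data point carries its
// own history (origin and reliability) and destiny (allowed usage),
// so every exchange needs only the trust pertaining to that exchange.
type LegalGround = "consent" | "contract" | "legal-obligation"
                 | "vital-interest" | "public-interest" | "legitimate-interest";

interface TrustEnvelope<T> {
  data: T;
  history: {
    origin: string;     // e.g. the person's own data store
    issuedAt: Date;     // when the assertion was made
    proof: string;      // e.g. a digital signature over the data
  };
  destiny: {
    purposes: string[]; // the only allowed usages of this data point
    ground: LegalGround;
    expires: Date;      // trust granted per exchange, not forever
  };
}

// The same envelope protects both parties: it constrains processing
// for the person, and doubles as audit evidence for the company.
function auditEntry<T>(e: TrustEnvelope<T>): string {
  return `${e.history.origin} shared data for ${e.destiny.purposes.join(", ")}` +
         ` under ${e.destiny.ground}, valid until ${e.destiny.expires.toISOString()}`;
}
```

The design choice worth noting: history and destiny travel with the data itself, so neither party has to rely on a promise made once at the start of the relationship.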
And if it doesn’t, well, try asking for more and better data. Since we’re no longer limited by the BEOT at the beginning, any question can now be posed, and some will result in new mutually beneficial data points that most company lawyers had long given up on.
Bend the law better
There seems to be one glitch in the plan: are we replacing one giant consent dialog with tons of tiny dialogs, one for each data point? The answer is yes.
This is not a glitch because, in contrast to manual consent dialogs, the exchange of each individual data point will be managed automatically by my browser or app according to my preferences. I can pick those preferences freely from the entire CIA triad, rather than my app assuming everyone has the same desires.
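As a rough illustration, with hypothetical names and deliberately naive logic, such automated handling could be as simple as matching each incoming request against preferences set once; the legal grounds that could back an automated answer follow below:

```typescript
type Decision = "share" | "refuse" | "ask-me";

// A person's standing preference for one kind of data point.
interface SharingPreference {
  field: string;      // e.g. "allergies"
  purposes: string[]; // purposes this person is happy to support
}

// The app answers each tiny dialog automatically from stored
// preferences, instead of requiring a screen tap per data point.
function decide(field: string, purpose: string,
                prefs: SharingPreference[]): Decision {
  const pref = prefs.find((p) => p.field === field);
  if (!pref) return "ask-me"; // new territory: surface one real dialog
  return pref.purposes.includes(purpose) ? "share" : "refuse";
}

// Example: my allergies may serve meal preparation, nothing else.
const prefs = [{ field: "allergies", purposes: ["meal-preparation"] }];
decide("allergies", "meal-preparation", prefs); // "share"
decide("allergies", "advertising", prefs);      // "refuse"
```

One real dialog per genuinely new situation, instead of one fake dialog per website.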
The legal lack of automation options proves to be consent’s most pressing restriction. While to some extent, machines can say “no” on our behalf and thus not consent, they cannot legally say “yes”. The ensuing consent fatigue achieves the polar opposite of what GDPR intended to bring: people have (the illusion of) choice, but exercising this right is so cumbersome that many would rather not have it.
Unfortunately, throughout the GDPR document, legislators seemingly assume that every software system is conspiring against citizens and incessantly trying to deceive us. While this might be the case for the majority of systems today, such a rigid viewpoint prematurely kills innovation towards a future where some systems work for people and strive to protect our best interests. So I like to imagine a future where the burden of relationship management and text processing is outsourced to personal automation.
After all, maintaining relationships is hard work, but people engage with hundreds or thousands of companies—far beyond the scale of what any individual can manage. And frankly, far beyond the scale of what most companies can manage. The only reasonable way forward is for our devices to automate substantial parts of those relationships, such that we can focus on those parts where we get the most value.
Due to its lack of automation avenues, consent-based exchange of data points at a rate of several dozen per minute would necessitate continuous screen tapping. Fortunately, GDPR offers multiple possible legal grounds for the exchange of data:
- Consent is the most (ab)used but in many ways the least flexible.
- Legitimate interest is often claimed, but not always sincerely.
- Vital interest safeguards the individual.
- Public interest safeguards society.
- Legal obligation ensures legal processes can be executed.
- Contract allows agreed-upon transactions to take place.
Our digital assistant could thus leverage the other mechanisms when appropriate, reserving actual consent for the rare decisions that genuinely require a human.
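Here is a minimal sketch of that routing idea, simplified far beyond real legal practice and with hypothetical names throughout:

```typescript
type LegalGround = "contract" | "legal-obligation" | "vital-interest"
                 | "public-interest" | "legitimate-interest" | "consent";

interface DataRequest {
  field: string; // e.g. "delivery-address"
  neededFor: "agreed-transaction" | "legal-process" | "emergency"
           | "public-task" | "other";
}

// The assistant selects a non-consent ground where one fits, and only
// falls back to a human decision where consent is genuinely required,
// since consent itself cannot legally be automated.
function selectGround(request: DataRequest): LegalGround {
  switch (request.neededFor) {
    case "agreed-transaction": return "contract";         // the purchase I asked for
    case "legal-process":      return "legal-obligation"; // e.g. invoicing rules
    case "emergency":          return "vital-interest";   // the unconscious patient
    case "public-task":        return "public-interest";
    default:                   return "consent";          // hand this back to the human
  }
}

// Note: legitimate interest is deliberately never auto-claimed here,
// since it is so often claimed insincerely.
selectGround({ field: "delivery-address", neededFor: "agreed-transaction" }); // "contract"
```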
The right to assistive automation seems like the perfect antidote to consent fatigue, because machines never get tired. Furthermore, it emphasizes that privacy and control over data are a right but not a duty. With the right to care also comes a right to not care, or to not have to care.
Towards trust technology
While making it marginally more expensive to do the wrong thing, GDPR made it substantially more expensive to do the right thing. In a reality with finite resources, appealing to morality is a double-edged sword: society has constructed an economy that rewards companies who prefer financially responsible choices. Loudly complaining when they subsequently choose the most cost-effective path towards personal data paints a heroic battleground picture, but does very little to improve the status quo.
Realistically, we must turn the tide by building technology that makes responsible data handling cheaper than its alternative. Calling out the simple economics of the situation sounds like a cynical statement, but it doesn’t need to become an inconvenient truth if we pay close attention to mutual benefit.
This is where current systems for personal data management fall short: by admitting their own failure to identify win–win opportunities, they don’t even try maximizing the value for both parties. They cannot establish sustainable relationships, because the entire system design embodies the absurd belief that one-sided relationships represent the best we can all agree on. Yet by definition, there is much more value in a two-sided relationship, where contributions benefit both parties.
Although most (ab)use of consent online remains legally unchallenged to this day, the harm extends well beyond the courtroom.
Due to technology’s fixation on consent, we all share in the consequences:
- People are missing out on the value within their data that could drastically improve their experiences and quality of life.
- Companies are missing out on crucial data that could improve the quality of their service offering and their competitive position.
The time is ripe for a new generation of techno-legal systems, both on the side of people and on the side of companies. These systems set up and maintain long-term relationships and negotiate mutual benefit, which ultimately forms the driver of all economic transactions. The cornerstone of the monetary economy is that you and I have something the other desires, and an exchange can result in a positive outcome for both. Removing unnecessary friction facilitates more and better exchanges for all.
Although data informs many of our daily interactions, trust will always remain a relationship between people. Machines cannot replace our trust, but they could so much more precisely support us in getting the value we deserve from our relationships. That’s why trust must be an integral part of data ecosystems, with respect for its evolving timeline rather than obsessing about its start. Making our data work for us truly starts when technology makes our online relationships scale in a sustainable way.