My wife and I had arranged a night out, and since nobody we knew was available to look after our young children, we entrusted their care to a stranger – yes, there’s an app for that. As we stepped out, I wondered if what we had done was insane.
We didn’t know the woman who was looking after our children and we didn’t know anyone who knew her. We didn’t know where she lived, except that it wasn’t very near us. We knew she wasn’t from our cultural “tribe”; we are white and middle class, she wore a hijab and spoke heavily accented English. Our knowledge of her comprised her name, the number of times her services had been engaged, and a few short testimonials, apparently from real people, though who knows?
In the grand context of human history, what we were doing was outrageous. Nearly everyone who has ever lived would only trust someone they knew to look after their children. But today it does not seem so unusual. People seem to have steeply increased their propensity to trust strangers. We rent out our homes to them, we get into their cars, we meet them for a drink on the understanding that we may be having sex later on (so I hear).
In theory at least, this is a good thing. Social scientists have compiled a mountain of evidence that what they call “social trust” – trust in fellow citizens you haven’t yet met – is the secret to a successful society. Countries with higher trust in strangers have higher economic growth, less corruption, and happier citizens. They have lower suicide rates, less chronic illness and fewer fatal accidents (the economist John Helliwell suggested that if France were as trusting as Norway, its traffic fatalities would be halved). Politicians often debate the best way to increase productivity or improve education. Few propose policies to raise trust. But maybe our smartphones are already providing the answers.
In Who Can You Trust? the writer Rachel Botsman argues that we are at the start of an exciting third age in human trust. The first age was local, when we lived in small groups and everyone knew everyone else. The second, which arrived with the industrial age, was institutional, in which we were able to confidently do business with strangers thanks to a nexus of laws and contracts. The third chapter is distributed, in which trust, instead of flowing vertically via institutions, flows horizontally through a vast, algorithmically organised network. The neighbourly interactions central to pre-industrial society have been recreated, except now a neighbour is anyone with whom we share an app.
Is this really trust? I’m not sure. I trust an eBay vendor to send me the goods because I know it’s in their interest to be rated as a good seller. I trust Uber drivers because I know their livelihood depends on getting good ratings, and they trust me because without a decent rating I won’t be able to use the service.
To me, real trust implies a leap of faith in another person. Rather than extending human trust, it might be better to say that algorithms are replacing it. It’s not the person we trust, it’s the system of mutual compliance. The invention of Bitcoin, in 2009, was partly a response to the financial crisis, which exposed the fragility of a system that relies on people doing the right thing. Humans are caring and decent, but they’re also erratic and dishonest. Machines are reliable and, as yet, free of self-interest. No wonder we’re outsourcing so many decisions to them.
But I don’t think we should underestimate the long-term potential of a technology that encourages us to interact with people we’ve never met. The more we deal with strangers, the more comfortable we get with them. The sociologist Norbert Elias showed how the great decline in violence that took place in post-medieval European society was accompanied by the emergence of norms of conversation – good manners, essentially. Over the course of centuries, and without anyone planning it, people developed protocols for when to speak in a conversation, and how far to stand from the other person when speaking. These protocols enabled strangers to converse, relatively free from the fear they were about to be smitten with a sword. And the more they talked, the less they fought.
Our most noble qualities began as compliance mechanisms; the evolutionary basis of altruism is the knowledge that, somewhere down the line, favours get returned. Writing started out as a technology of compliance. It emerged in ancient Mesopotamia as a way for bureaucrats to keep tabs on farmers. The earliest documents are not love poems or novels, but lists of crops, yields and taxes. Writing things down didn’t negate the need for social trust, but it reduced the burden on it, which eventually enabled such trust to flourish beyond circles of friends and family.
Today’s technologies may perform the same role. They ameliorate our fears of being harmed by people we don’t know, which may, over time, encourage us to engage with them more deeply.
Some of the most successful sharing apps are designed to support human interaction, rather than substitute for it. Airbnb puts a huge amount of thought into establishing a relationship, if only a fleeting one, between host and customer. Peer-to-peer networks can feel impersonal, but that doesn’t mean fellow feeling won’t wrap itself around them over time, like vines over wire.
Our children adored their babysitter, who sang them a lullaby she used to sing to her daughter. The encounter didn’t feel cold and inhuman. It felt like a little reminder of the kindness of strangers.
Ian Leslie is a journalist and author of “Curious” and “Born Liars”.
This article appears in the 07 Nov 2018 issue of the New Statesman, Revenge of the nation state