
I’m a believer

In our increasingly secular society, many religious people feel their voices are not heard.

After four centuries of breathtaking scientific progress, many wonder why intelligent people would still feel the need to believe in God. Andrew Zak Williams decided to find out. Over the course of several months, he corresponded with dozens of scientists and other public figures, quizzing them on the reasons for their faith. Here is a selection of the responses.

Cherie Blair, barrister
It's been a journey from my upbringing to an understanding of something that my head cannot explain but my heart knows to be true.

Jeremy Vine, broadcaster
There is a subjective reason and an objective reason. The subjective reason is that I find consolation in my faith. The objective reason is that the story of the gospels has stood the test of time and Christ comes across as a totally captivating figure.

In moments of weariness or cynicism, I tell myself I only believe because my parents did; and the Christian faith poses more questions than it answers.

But I still return to believing, as if that is more natural than not doing so.

Richard Swinburne, emeritus professor of philosophy, University of Oxford
To suppose that there is a God explains why there is a physical universe at all; why there are the scientific laws there are; why animals and then human beings have evolved; why human beings have the opportunity to mould their character and those of their fellow humans for good or ill and to change the environment in which we live; why we have the well-authenticated account of Christ's life, death and resurrection; why throughout the centuries millions of people (other than ourselves) have had the apparent experience of being in touch with and guided by God; and so much else.
In fact, the hypothesis of the existence of God makes sense of the whole of our experience and it does so better than any other explanation that can be put forward, and that is the grounds for believing it to be true.

Peter Hitchens, journalist
I believe in God because I choose to do so. I believe in the Christian faith because I prefer to do so. The existence of God offers an explanation of many of the mysteries of the universe - especially "Why is there something rather than nothing?" and the questions which follow from that. It requires our lives to have a purpose, and our actions to be measurable against a higher standard than their immediate, observable effect. Having chosen belief in a God over unbelief, I find the Christian gospels more persuasive and the Christian moral system more powerful than any other religious belief.

I was, it is true, brought up as a Christian, but ceased to be one for many years. When I returned to belief I could have chosen any, but did not.

Jonathan Aitken, former politician
I believe in God because I have searched for Him and found Him in the crucible of brokenness. Some years ago I went through an all-too-well-publicised drama of defeat, disgrace, divorce, bankruptcy and jail. In the course of that saga I discovered a loving God who answers prayers, forgives and redeems.

James Jones, Bishop of Liverpool
One word: Jesus. All that you imagine God would be, He is. His life and His love are compelling, His wisdom convincing.

Richard Chartres, Bishop of London
I believe in God because He has both revealed and hidden Himself in so many different ways: in the created world, the Holy Bible, the man Jesus Christ; in the Church and men and women of God through the ages; in human relationships, in culture and beauty, life and death, pain and suffering; in immortal longings, in my faltering prayers and relationship with Him. There is nothing conclusive to force me into believing, but everything suggestive, and constantly drawing me on into the love of Christ and to "cleave ever to the sunnier side of doubt".

David Alton, Lib Dem peer
The notion that humanity and the cosmos are an accident has always seemed implausible. A world littered with examples of complex genius - from developments in quantum theory to regenerative medicine - points us towards genius more perfect and more unfathomable than ourselves. The powerful combination of faith and reason led me as a child to believe in God.

Unsurprisingly, as I matured into manhood, that belief has not been immune against the usual catalogue of failure, sadness and grief; and belief has certainly not camouflaged the horrors of situations I have seen first hand in places such as Congo and Sudan. Paradoxically, it has been where suffering has been most acute that I have also seen the greatest faith.

By contrast, the more we own or have, the more difficulty we seem to have in seeing and encountering the Divine.

Professor Stephen R L Clark, philosopher
I believe in God because the alternatives are worse. Not believing in God would mean that we have no good reason to think that creatures such as us human beings (accidentally generated in a world without any overall purpose) have any capacity - still less any duty - to discover what the world is like.

Denying that "God exists" while still maintaining a belief in the power of reason is, in my view, ridiculous. My belief is that we need to add both that God is at least possibly incarnate among us, and that the better description of God (with all possible caveats about the difficulty of speaking about the infinite source of all being and value) is as something like a society. In other words, the Christian doctrines of the incarnation and of the trinity have the philosophical edge. And once those doctrines are included, it is possible to see that other parts of that tradition are important.

Nick Spencer, director of Theos, the public theology think tank
I would say I find Christianity (rather than just belief in God) the most intellectually and emotionally satisfying explanation for being.

Stephen Green, director of the fundamentalist pressure group Christian Voice
I came to faith in God through seeing the ducks on a pond in People's Park, Grimsby. It struck me that they were all doing a similar job, but had different plumage. Why was that? Why did the coot have a white beak and the moorhen a red one? Being a hard-nosed engineer, I needed an explanation that worked and the evolutionary model seemed too far-fetched and needful of too much faith!

I mean, what could possibly be the evolutionary purpose of the bars on the hen mallard's wings, which can only be seen when she flies? Or the tuft on the head of the tufted duck?

So I was drawn logically to see them as designed like that. I suppose I believed in an intelligent designer long before the idea became fashionable. So, that left me as a sort of a deist. But God gradually became more personal to me and I was drawn against all my adolescent atheist beliefs deeper and deeper into faith in Jesus Christ.

Douglas Hedley, reader in metaphysics, Clare College, Cambridge
Do values such as truth, beauty and goodness emerge out of a contingent and meaningless substrate? Or do these values reflect a transcendent domain from which this world has emerged? I incline to the latter, and this is a major reason for my belief in God.

Paul Davies, quantum physicist
I am not comfortable answering the question "Why do you believe in God?" because you haven't defined "God". In any case, as a scientist, I prefer not to deal in "belief" but rather in the usefulness of concepts. I am sure I don't believe in any sort of god with which most readers of your article would identify.

I do, however, assume (along with all scientists) that there is a rational and intelligible scheme of things that we uncover through scientific investigation. I am uncomfortable even being linked with "a god" because of the vast baggage that this term implies (a being with a mind, able to act on matter within time, making decisions, etc).

Professor Derek Burke, biochemist and former president of Christians in Science
There are several reasons why I believe in God. First of all, as a scientist who has been privileged to live in a time of amazing scientific discoveries (I received my PhD in 1953, the year Watson and Crick discovered the structure of DNA), I have been overwhelmed by wonder at the order and intricacy of the world around us. It is like peeling skins off an onion: every time you peel off a layer, there is another one underneath, equally marvellously intricate. Surely this could not have arisen by chance? Then my belief is strengthened by reading the New Testament especially, with the accounts of that amazing person, Jesus, His teaching, His compassion, His analysis of the human condition, but above all by His resurrection. Third, I'm deeply impressed by the many Christians whom I have met who have lived often difficult lives with compassion and love. They are an inspiration to me.

Peter J Bussey, particle physicist
God is the ultimate explanation, and this includes the explanation for the existence of physical reality, for laws of nature and everything. Let me at this point deal with a commonly encountered "problem" with the existence of God, one that Richard Dawkins and others have employed.
It goes that if God is the ultimate cause or the ultimate explanation, what then is the cause of God, or the explanation for God? My reply is that, even in our own world, it is improper to repeat the same investigatory question an indefinite number of times. For example, we ask, "Who designed St Paul's Cathedral?" and receive the reply: "Sir Christopher Wren." But, "No help whatever," objects the sceptic, "because, in that case, who then designed Sir Christopher Wren?" To this, our response will now be that it is an inappropriate question and anyone except a Martian would know that. Different questions will be relevant now.

So, likewise, it is very unlikely that we know the appropriate questions, if any, to ask about God, who is presumably outside time, and is the source of the selfsame rationality that we presume to employ to understand the universe and to frame questions about God.
What should perhaps be underlined is that, in the absence of total proof, belief in God will be to some extent a matter of choice.

Reverend Professor Michael Reiss, bioethicist and Anglican priest
At the age of 18 or 19, a religious way of understanding the world began increasingly to make sense. It did not involve in any way abandoning the scientific way. If you like, it's a larger way of understanding our relationship with the rest of the world, our position in nature and all those standard questions to do with why we are here, if there is life after death, and so on. That was reinforced by good teaching, prayer and regular reading of scripture.

Peter Richmond, theoretical physicist
Today most people reject the supernatural but there can be no doubt that the teachings of Jesus are still relevant. And here I would differentiate these from some of the preaching of authoritarian churches, which has no doubt been the source of much that could be considered to be evil over the years. Even today, we see conflict in places such as Africa or the Middle East - killings made in the name of religion, for example. As Christians, we recognise these for what they are - evil acts perpetrated by the misguided. At a more domestic level, the marginalisation of women in the Church is another example that should be exposed for what it is: sheer prejudice by the present incumbents of the Church hierarchy. But as Christians, we can choose to make our case to change things as we try to follow the social teachings of Jesus. Compared to pagan idols, Jesus offered hope, comfort and inspiration, values that are as relevant today as they were 2,000 years ago.

David Myers, professor of psychology, Hope College, Michigan
[Our] spirituality, rooted in the developing biblical wisdom and in a faith tradition that crosses the centuries, helps make sense of the universe, gives meaning to life, opens us to the transcendent, connects us in supportive communities, provides a mandate for morality and selflessness and offers hope in the face of adversity and death.

Kenneth Miller, professor of biology, Brown University
I regard scientific rationality as the key to understanding the material basis of our existence as well as our history as a species. That's the reason why I have fought so hard against the "creationists" and those who advocate "intelligent design". They deny science and oppose scientific rationality, and I regard their ideas as a threat to a society such as ours that has been so hospitable to the scientific enterprise.

There are, however, certain questions that science cannot answer - not because we haven't figured them out yet (there are lots of those), but because they are not scientific questions at all. As the Greek philosophers used to ask, what is the good life? What is the nature of good and evil? What is the purpose to existence? My friend Richard Dawkins would ask, in response, why we should think that such questions are even important. But to most of us, I would respond, these are the most important questions of all.

What I can tell you is that the world I see, including the world I know about from science, makes more sense to me in the light of a spiritual understanding of existence and the hypothesis of God. Specifically, I see a moral polarity to life, a sense that "good" and "evil" are actual qualities, not social constructions, and that choosing the good life (as the Greeks meant it) is the central question of existence. Given that, the hypothesis of God conforms to what I know about the material world from science and gives that world a depth of meaning that I would find impossible without it.

Now, I certainly do not "know" that the spirit is real in the sense that you and I can agree on the evidence that DNA is real and that it is the chemical basis of genetic information. There is, after all, a reason religious belief is called "faith", and not "certainty". But it is a faith that fits, a faith that is congruent with science, and even provides a reason why science works and is of such value - because science explores that rationality of existence, a rationality that itself derives from the source of that existence.

In any case, I am happy to confess that I am a believer, and that for me, the Christian faith is the one that resonates. What I do not claim is that my religious belief, or anyone's, can meet a scientific test.

Nick Brewin, molecular biologist
A crucial component of the question depends on the definition of "God". As a scientist, the "God" that I believe in is not the same God(s) that I used to believe in. It is not the same God that my wife believes in; nor is it the same God that my six-year-old granddaughter believes in; nor is it the God that my brain-damaged and physically disabled brother believes in. Each person has their own concept of what gives value and purpose to their life. This concept of "God" is based on a combination of direct and indirect experience.

Humankind has become Godlike, in the sense that it has acquired the power to store and manipulate information. Language, books, computers and DNA genomics provide just a few illustrations of the amazing range of technologies at our fingertips. Was this all merely chance? Or should we try to make sense of the signs and wonders that are embedded in a "revealed religion"?

Perhaps by returning to the "faith" position of children or disabled adults, scientists can extend their own appreciation of the value and purpose of individual human existence. Science and religion are mutually complementary.

Hugh Ross, astrophysicist and astronomer
Astronomy fascinates me. I started serious study of the universe when I was seven. By the age of 16, I could see that Big Bang cosmology offered the best explanation for the history of the universe, and because the Big Bang implies a cosmic beginning, it would require a cosmic beginner. It seemed reasonable that a creator of such awesome capacities would speak clearly and consistently if He spoke at all. So I spent two years perusing the holy books of the world's religions to test for these characteristics. I found only one such book. The Bible stood apart: not only did it provide hundreds of "fact" statements that could be tested for accuracy, it also anticipated - thousands of years in advance - what scientists would later discover, such as the fundamental features of Big Bang cosmology.

My observation that the Bible's multiple creation narratives accurately describe hundreds of details discovered much later, and that it consistently places them in the scientifically correct sequence, convinced me all the more that the Bible must be the supernaturally inspired word of God. Discoveries in astronomy first alerted me to the existence of God, and to this day the Bible's power to anticipate scientific discoveries and predict sociopolitical events ranks as a major reason for my belief in the God of the Bible. Despite my secular upbringing, I cannot ignore the compelling evidence emerging from research into the origin of the universe, the anthropic principle, the origin of life and the origin of humanity. The accumulating evidence continues to point compellingly towards the God of the Bible.

Steve Fuller, philosopher/professor of sociology, University of Warwick
I am a product of a Jesuit education (before university), and my formal academic training is in history and philosophy of science, which is the field credited with showing the tight links between science and religion. While I have never been an avid churchgoer, I am strongly moved by the liberatory vision of Jesus promoted by left-wing Christians.

I take seriously the idea that we are created in the image and likeness of God, and that we may come to exercise the sorts of powers that are associated with divinity. In this regard, I am sympathetic to the dissenting, anticlerical schools of Christianity - especially Unitarianism, deism and transcendentalism, idealism and humanism. I believe that it is this general position that has informed the progressive scientific spirit.

People such as Dawkins and Christopher Hitchens like to think of themselves as promoting a progressive view of humanity, but I really do not see how Darwinism allows that at all, given its species-egalitarian view of nature (that is, humans are just one more species - no more privileged than the rest of them). As I see it, the New Atheists live a schizoid existence, where they clearly want to privilege humanity but have no metaphysical basis for doing so.

Michael J Behe, scientific advocate of intelligent design
Two primary reasons: 1) that anything exists; and 2) that we human beings can comprehend and reason. I think both of those point to God.

Denis Alexander, director, Faraday Institute for Science and Religion, Cambridge
I believe in the existence of a personal God. Viewing the universe as a creation renders it more coherent than viewing its existence as without cause. It is the intelligibility of the world that requires explanation.

Second, I am intellectually persuaded by the historical life, teaching, death and resurrection of Jesus of Nazareth, that He is indeed the Son of God. Jesus is most readily explicable by understanding Him as the Son of God. Third, having been a Christian for more than five decades, I have experienced God through Christ over this period in worship, answered prayer and through His love. These experiences are more coherent based on the assumption that God does exist.

Mike Hulme, professor of climate change, University of East Anglia
There are many reasons - lines of evidence, if you will - all of which weave together to point me in a certain direction (much as a scientist or a jury might do before reaching a considered judgement), which we call a belief.

[I believe] because there is non-trivial historical evidence that a person called Jesus of Nazareth rose from the dead 2,000 years ago, and it just so happens that He predicted that He would . . . I believe because of the testimony of billions of believers, just a few of whom are known to me and in whom I trust (and hence trust their testimony).

I believe because of my ineradicable sense that certain things I see and hear about in the world warrant the non-arbitrary categories of "good" or "evil". I believe because I have not discovered a better explanation of beauty, truth and love than that they emerge in a world created - willed into being - by a God who personifies beauty, truth and love.

Andrew Zak Williams has written for the Humanist and Skeptic.

This article first appeared in the 18 April 2011 issue of the New Statesman, GOD Special


The life of Pi

How the gaming prodigy David Braben and his friends invented a tiny £15 device that became the biggest-selling British computer.

If you had visited David Braben’s room at Jesus College, Cambridge in 1983 you would have found an unusual scene. Sure, it was just as cramped, muddled and tinged with the fragrance of generations of undergraduates as that of any other student. But while Braben’s neighbours lined their walls with textbooks and Hollywood posters, the shelves in his room supported cascades of cabling and copper wire. And there in the centre of the desk, amid a shanty town of screws and pliers, an Acorn Atom computer hummed.

Braben knew its insides better than his own. Such was the extent of his frequent and intrusive tinkering that he left the machine’s casing permanently off, leaving the circuitry exposed, like that of a battle-wrecked android. One winter’s day that year, he and a friend, Ian Bell, stood in front of the Atom’s chunky monitor. Braben moved his hand towards the keyboard and, with a tap, executed a Big Bang.

Elite, as Braben and Bell’s universe would later be named, was an ambitious computer simulation of endless rolling galaxies, waiting to be explored via a digital spaceship. To grow such vastness from such rudimentary technology, Braben had to pull off the equivalent of a numerical conjuring trick. Rather than manually plotting cosmic systems by typing star and planet co-ordinates into a database, he used the Fibonacci sequence, which starts with “0” and “1”, and continues the sequence by adding the two preceding numbers. This mathematical curiosity governs a variety of natural phenomena, such as the arrangement of leaves on a tree or the pattern of the florets in a flower, making it the ideal formula to spawn a seed from which virtual galaxies could be generated.
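The trick can be sketched in a few lines. What follows is a minimal illustration of the idea, not Elite’s actual code: each star system is derived on demand from a Fibonacci-style recurrence seeded by its index, so an “endless” galaxy needs no stored database. The system names and attribute ranges below are invented for the example.

```python
def fib_stream(a, b, modulus=65536):
    """Yield a Fibonacci-style pseudo-random stream: each value is
    the sum of the two preceding ones, kept in a 16-bit range."""
    while True:
        a, b = b, (a + b) % modulus
        yield b

# Hypothetical name table and attribute ranges, for illustration only.
NAMES = ("Lave", "Diso", "Leesti", "Riedquat")

def generate_system(index):
    """Derive one star system's attributes from its index alone."""
    stream = fib_stream(a=index, b=index + 1)
    vals = [next(stream) for _ in range(3)]
    return {
        "name": NAMES[vals[0] % len(NAMES)],
        "tech_level": vals[1] % 15,           # 0..14
        "population": (vals[2] % 64) / 10.0,  # billions
    }

# The same index always yields the same system, so any system can be
# regenerated on demand instead of being kept in memory or on disk.
assert generate_system(7) == generate_system(7)
```

The point of the design is that the seed, not the data, is stored: thousands of planets cost no more memory than the recurrence that produces them, which is what made a galaxy fit into the Acorn Atom’s few kilobytes.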

The game offered breadth and depth. You toured the universe in a spaceship, represented on screen by a few scant white lines, free to mine resources, dogfight with pirates or even become a galactic marauder yourself, preying on the cargo ships that sailed along trade routes. While most arcade games of the time brought players into their reality for a few brief minutes before kicking them out again, penniless and defeated, Elite worked at a different pace. Players could spend hours touring its innumerable systems. Braben’s contemporaries were astonished. “We stood around wide-eyed; these were feats of coding we had thought impossible on the low-powered machines of the day,” Jack Lang, a university friend of Braben’s, told me.

Braben and Bell’s invention became a sensation. Elite sold out of its initial run of 50,000 copies in less than two weeks, and went on to sell 600,000 copies across 17 different computer formats, making millionaires of its young creators. The game also inspired a generation of so-called Britsoft programmers who, over the next decade, would make Britain a leading hub for computer-game development, and produce, in Tomb Raider, Grand Theft Auto and Championship Manager, a clutch of enviable and world-renowned names.




Twenty years later, when he was running Frontier Developments, one of the most successful games companies in the UK, Braben noticed a trend. Each time his company advertised a job in programming, fewer candidates would apply. “I was expecting the number of applicants to rise because we’d had some positive press,” he told me when I visited him at the Frontier offices in Cambridge.

Braben, who, in his black hoodie, looks significantly younger than his 53 years, runs Frontier from a spacious, glass-fronted office. Nearby, scores of artists, designers and programmers tap and toil in orderly phalanxes of computers. The company, which in 2016 turned over £21.4m, employs more than 300 staff.

“But at that time we found that we were having to hire from abroad,” Braben told me. He called some directors at other British games companies and found that they had the same problem. Then he called the University of Birmingham, where he sat on the advisory board. “They, too, were in crisis: applicants to the computer science course had dropped off a cliff,” he said. “It made no sense to me.”

At the time, Braben was running focus tests with children on one of the company’s games, and he sneaked an additional question into his survey: “What is the most boring lesson at school?” The response left him bewildered – ICT (information and communications technology). “You would think computing would be the most exciting lesson for a child at school, wouldn’t you?” he said.

He called a local schoolteacher. “The issue became immediately obvious: the curriculum was teaching children nothing more than how to use Word and Excel. Programming had been removed from lessons and, in most cases, ICT was being taught by people who were computer-illiterate.” The teacher told him that students would run riot in class. Some children had discovered that by deleting a few critical files from Windows they could ensure that the machine would fail to boot the next time it was switched on.

“Schools were having to employ people just to repair this vandalism,” Braben said. The drop-off in applicants to computer science courses at universities and for positions in development studios was, he concluded, a result of years of classroom neglect. The Britsoft industry, it seemed, was in danger of collapsing from the bottom up.

Braben wrote to Margaret Hodge, then an education minister in Tony Blair’s Labour government. “I thought they were keen on education,” he recalled. “But when we met, Hodge told me that they were already teaching computer studies. She accused me of special pleading for my industry.” (Hodge has said, through a spokeswoman, that she “does not recall this meeting”.)

Braben told Hodge that she didn’t need to take his word for it; she could simply speak to a few teachers. “It was so frustrating,” he said. “Government was pouring all of this money into things that weren’t necessarily making a difference to getting kids into computer science. I was just trying to point out that the games industry was a huge asset that could be used to inspire kids. Kids like to learn to program if it’s framed around making games.”

This was Braben’s own childhood experience. His father worked for the Cabinet Office researching nuclear physics, and the family moved around, living in Stockton Heath in Cheshire, near Warrington, then briefly in Italy and finally in Epping, in the eastern suburbs of London. All the while Braben was designing games for himself and his two younger siblings to play. One of the first was a modified version of battleships, played in the back garden using pieces pilfered from other board games, and based on nautical battles from the Second World War that he had read about in history books.

After he persuaded his parents to buy him the Acorn Atom, Braben progressed to designing computer games. For one of them, he drew a map of the northern hemisphere as viewed from space. He then taped the map to the computer screen and traced the outline of the countries in code. In the resulting game, players assumed either the role of the Americans or the Russians, tasked with sending nuclear bombs arcing across the screen in an attempt to destroy their opponent’s main cities. The winner was rewarded with a rudimentary computer version of their side’s national anthem.

Braben, who attended Buckhurst Hill County High, a grammar school in Chigwell, Essex, was a natural programmer, talented at maths and physics. But the computer on which he learned his basic programming skills, the Acorn Atom – the precursor of the BBC Micro, which would soon be found in many school ICT rooms – made it easy for him.

“It came with everything you needed in the box,” he said. “People say these days that design software costs only around £100, but that’s a huge amount for a kid. The amazing thing was that, with the Acorn and the BBC Micro and many of those other early machines, you had everything you needed to learn how to program anything you could imagine right from the get-go.”

Braben’s talent extended to entrepreneurship. When he was 17, he wrote to a games publisher saying that he believed his games to be as good as theirs. A week later three men in suits showed up at his parents’ house; he was worried about taking his computer to their office on public transport, so they offered to come to him. Astonished at what the boy had managed to achieve with the hardware, they offered him a job on the spot. Braben pretended to mull the offer over for a few days, before refusing the position in favour of studying natural sciences at Cambridge.

It was the memory of these formative experiences to which he returned when he was cold-shouldered by the government. He called Lang, by then an entrepreneur in Cambridge, who said the university there was also struggling to attract computer science applicants. The pair discussed ways to get the subject taught in the classroom, and a plan formed. If they could find a way to teach programming outside the school system, perhaps the schools would follow.

Initially Lang and Braben considered designing a programming course using bespoke software. The problem was that schools and libraries around the country used different versions of Windows. Finding a one-size-fits-all solution for students to compile and run their games proved impossible. Instead, Lang suggested the idea of a budget computer, one that would allow children the freedom to tinker, customise and break things, and then restore it all at the touch of a button.

“It struck me that probably the best way these days for a young student to learn how to program is to buy an old BBC Micro off eBay,” Braben said. “That’s a bit of an admission, isn’t it? It’s also fundamentally capped by the number of BBC Micros that are still working in the world, so it’s not a general solution. But it’s such a good way of learning. It encourages you to experiment. Rebooting a PC can easily damage the software. With the BBC Micro you could do all kinds of outrageous things and then just reset it. The hardware was tough, too.”

It is possible to destroy a BBC Micro, Braben said, but very difficult. So the idea was to build a computer that reflected the Micro’s sturdiness and simplicity: a machine for all-comers, practically indestructible in form, and universal in function. In 2003 Braben, Lang and four of their friends – Pete Lomas, Alan Mycroft, Robert Mullins and Eben Upton (“slightly eccentric guys from Cambridge”, as Braben puts it) – met at a computer lab at the university and, from a shopping list of components, began to price up a microcomputer.

“We knew how cheap components were becoming because of the rise of mobile phones,” Braben said. “But when we came up with the final price we couldn’t believe how low it was.” The group estimated it would be possible to build a home computer with a single USB port and an HDMI (high-definition multimedia interface) connector – which enables the device to be connected to a compatible screen – for £15.




The six men named their invention the Raspberry Pi. “Fruit seemed good; Raspberry particularly good because it’s a bit of a thumb-nose at the convention. We added Pi to make it sound a bit mathematical,” said Braben. They formed the Raspberry Pi Foundation, a charity aiming to “promote the study of computer science and related topics . . . and put the fun back into learning computing”. It was almost a decade before their vision for the micro-budget microcomputer would become a reality.

“We decided that we needed support from a large organisation,” Braben said. “We started speaking to the BBC and spent a few years discussing the project with them as potential partners.” The group even offered to give the corporation the software design free of charge. But the strong initial interest led to a series of interminable meetings, where nobody from the BBC seemed willing to be the one to make the final decision.

“The final meeting I had with the BBC really annoyed me,” he said. “They told me that I needed to seek sign-off from a group that had already signed off on the project, simply because there had been a reorganisation in that group. We were going around in circles. That’s when I realised it wasn’t going to work.”

Immediately after the meeting, a furious Braben strode to the White City office of Rory Cellan-Jones, the BBC’s technology correspondent. Cellan-Jones knew of Braben from reading Francis Spufford’s 2003 book, Backroom Boys, a biography of various British inventors in which Braben and Bell featured prominently.

“When Braben contacted me under the illusion that I was somebody at the BBC with some semblance of power, rather than an infantryman, I was delighted,” Cellan-Jones told me. Yet he was at a loss as to what he could do to help the inventor standing in front of him with a Raspberry Pi in his hand. “I thought to myself: well, there’s nothing I can do with this. I can’t get a crew to film something like that.”

Sensing Braben’s despair, Cellan-Jones suggested that he film a short video on his phone there and then; he would post it to his BBC blog and announce the Raspberry Pi to the world. Doing so might, Cellan-Jones reasoned, force the BBC’s hand. At the very least it would help to gauge public interest in the device.

In a nearby corridor, Braben held the device up to the camera and explained what it was and why it might be important. “It was short and simple,” he recalled. At lunchtime on 5 May 2011, Cellan-Jones posted the video and a story about the computer to his blog. “It’s not much bigger than your finger, it looks like a leftover from an electronics factory, but its makers believe their £15 computer could help a new generation discover programming,” he wrote.

The story went viral, receiving a quarter of a million hits that day. “I was surprised and delighted,” Cellan-Jones said. “It was a great idea from the start. But I encounter lots of great ideas. You get to the stage where you start to believe that nothing will work. Then, every now and again, someone turns up with a rocket ship to Mars.”

Despite the interest, the BBC, as Braben puts it, kept coming up with reasons why the corporation shouldn’t back it. So the six members of the foundation decided to fund the first 10,000 units out of their own pockets. On 29 February 2012, at 5am, Braben began a day of media appearances, first on BBC Worldwide, then on Radio 4’s Today programme. An hour later, the website where the public could order one of the first Raspberry Pis went live. Within five seconds it had sold out.

The website, unable to keep up with demand, sold far more units than the team had components for. “It went very well indeed,” Braben said.



Since then, the rise of Raspberry Pi has been inexorable, with more than seven million units sold. This fully customisable and programmable computer, no larger than a credit card and only slightly thicker, can be used for everything from controlling the lights in your garage to learning how to become a software developer. In Syria it has been used to create local radio transmitters, able to broadcast messages to towns within a range of up to six kilometres, disseminating information about nearby skirmishes and essential supplies.
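The classic first project on a Pi is making an LED blink from a few lines of code. Below is a minimal sketch of that control logic; on real hardware it would drive a GPIO pin through a library such as RPi.GPIO, but here a hypothetical stand-in `Pin` class simulates the pin so the logic runs on any machine:

```python
# Sketch of the classic first Raspberry Pi program: blinking an LED.
# The Pin class below is a stand-in for a real GPIO output pin,
# used purely for illustration.

class Pin:
    """Simulated GPIO output pin that records every write."""
    def __init__(self):
        self.state = False
        self.history = []

    def write(self, value):
        self.state = value
        self.history.append(value)

def blink(pin, times):
    """Toggle the pin on and off the given number of times."""
    for _ in range(times):
        pin.write(True)   # LED on
        pin.write(False)  # LED off (a real program would sleep between writes)

led = Pin()
blink(led, 3)
print(led.history)
```

On an actual Pi the same loop, pointed at a physical pin with a resistor and LED attached, is often a child's first taste of making hardware obey software.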

The Pi computer has been used to take weather balloons to the edge of space – the current drawn from its four AA batteries generates just enough heat to stop the device from freezing – enabling schoolchildren to send teddy bears into the stratosphere to take photographs of the curvature of the planet. It can even broadcast its position by GPS, enabling those children to locate the device when it floats back to Earth. It doesn’t matter too much if it is lost, because it costs as little as £5 in its most basic form. This year, the foundation gave away a basic Raspberry Pi on the front of the MagPi, an affiliated magazine that teaches readers how, among other things, to program a football game from scratch.

Hundreds of thousands of young people have attended the foundation’s educational programmes. In 2015 Raspberry Pi entered into a collaboration with Code Club, an organisation created as a response to “the collective failure to prepare young people for life and work in a world that is shaped by digital technologies”. Code Club now runs more than 3,800 clubs in the UK and over 1,000 more in 70 other countries. Staffed by volunteers, the clubs provide nine-to-11-year-olds with the opportunity to make things using computers. Roughly 44,000 young people regularly attend Code Clubs in the UK alone; some 40 per cent of these youngsters are girls.

Braben’s plan to get British schoolchildren learning how to program has been even more fruitful. Since Raspberry Pi’s launch, applications for computer science degrees have increased by a factor of six. Data from Cambridge Assessment, the exams and research group, shows a significant increase in numbers of children choosing to study ICT at GCSE level, with a 17 per cent year-on-year rise in 2015.

There have been other beneficial side effects. Thanks to the buzz generated by the Raspberry Pi, and pressure from the foundation as well as Google, Microsoft and others, the government has put computer science back on the national curriculum.

“We’re seeing a huge growth in engagement with computer science in the UK, and Raspberry Pi has been a big part of that movement,” said Philip Colligan, the chief executive of the Raspberry Pi Foundation. “It came along at just the right moment and provided a physical manifestation of the idea that kids should be learning how to make things with computers, not just how to consume.”

Cellan-Jones agrees that the timing of the device’s launch was perfect. “It was certainly part of a wide movement to change how ICT was taught in schools, but of all those efforts I think it played the most important part. By having a physical object it made it tangible.”

Braben believes that the Raspberry Pi and its many imitators are dispelling the mystique that has grown around technology, driven in part, he says, by Apple’s closed systems. It is almost impossible, for example, to remove the cover of an iPhone to see how it works.

“When I was growing up, if my hi-fi was buzzing I’d take the lid off and maybe put some Blu-Tack in to stop the buzzing,” he said. “At some point, this collective fear crept in.”

For Braben, who has two stepchildren, now going on 13 and 18, it’s important for children not to be afraid of the technology on which they rely. “You only need one person in ten to actually study computer science. But for everyone else, having some understanding about, say, what goes on in your phone is incredibly helpful.

“In so many walks of life, whether you’re a builder using power tools or an accountant using accounting software, you are forever being presented with and relying upon technology. Understanding a little about what’s going on, rather than being afraid and embarrassed, is crucial.”

So, too, is having fun along the way. Braben has since returned to the stars of his youth by way of Elite: Dangerous. This sequel to the game that made him his fortune was released in late 2014. Rather than turn to algorithms to scatter the universe with stars and planets, this time the Frontier team re-created our own galaxy.

The digital sky for the revamped game includes every known star present in our own, their positions drawn from the numerous publicly available sky maps, each of which can be visited in the game using a spaceship. Altogether, the game comprises 400 billion stars, their planetary systems – and moons – all, like the insides of the computers on which they run, waiting to be explored.

This article first appeared in the 02 February 2017 issue of the New Statesman, American carnage