
A true scientific revolution: the triumph of mathematicians over philosophers

The moment it was accepted that Aristotle had not been right about everything was a crucial turning point in the history of science.

The early-modern Scientific Revolution is still in some populist quarters described as a triumph of experimental reason over religious superstition. It is one of the many virtues of David Wootton’s fascinating history that this canard barely merits a mention, let alone a tedious refutation. For, as he shows, many in the vanguard of the emerging order of the 16th and 17th centuries were religious; they took the new science to be a bulwark against atheism; and, as Wootton plausibly argues, Newtonianism would have been inconceivable without the tradition of belief in a creator God.

In Wootton’s telling, the revolution that created the tradition of science we recognise today was instead a victory of a different kind. The core story spans the long century from the astronomer Tycho Brahe’s first identification of a nova (as we would now say, an exploding star) in 1572, to Isaac Newton’s theory of gravity (1687) and Opticks (1704). Wootton describes it, in terrifically rich detail, as a revolt of mathematicians, wielding numbers and experiments, against philosophers, who assumed that Aristotle had been right about everything.

The mathematicians in this story include early scientists such as Galileo (whom we remember mainly for his telescope but who also conducted painstaking experiments on objects floating in water); they also include, more surprisingly, the artists who first codified the rules of perspective in painting. One of the things that Wootton illuminatingly points out is how many disciplinary hats these intellectual heroes wore. Galileo worked on ballistics problems; Brahe and Edmond Halley were cartographers as well as astronomers; Copernicus was “an expert on monetary reform”. And there are intriguing re-emphases. Copernicus was not that revolutionary, since his heliocentric system preserved the idea of fixed heavenly spheres; Brahe’s rival system, though incorrect, was more important since it did away with them.

But was there really a “Scientific Revolution” in the first place? As Wootton concedes, investigations that are recognisably scientific had proceeded here and there since antiquity. Aristotle’s biology, medieval Arab optics and premodern astronomy were all science and so there exists a “continuity view”, according to which nothing totally unprecedented happened in 16th- and 17th-century Europe. Wootton insists that this is wrong: his thesis is that “modern science” began in the 17th century. But this statement is automatically true by virtue of the author’s positing of a thing called “modern” science, as opposed to what preceded it. So the argument is possibly circular, although the circle is not necessarily vicious.

Wootton persuasively defends, for example, what is known as “the Eisenstein thesis”, put forward by Elizabeth Eisenstein in 1979, which is that the invention of the printing press made the Scientific Revolution possible. Printed books enabled the reproduction of complex diagrams, and researchers could now get an overview of all that had hitherto been thought about a particular problem: they fostered, as Wootton puts it, “a sort of intellectual arms race”. Natural philosophers of the 17th century also had a new family of glass instruments, a “new, critical attitude to established authority” and “a new language”.

What was this new language? It was the language we still speak of facts, evidence, experiments, hypotheses and theories. At the core of this book is a linguistic argument: that the emergence of these words in the 16th and 17th centuries proves that significantly new ideas had emerged. Wootton puts forward a very strong version of this thesis. Before Columbus discovered the Americas, he argues, the idea of “discovery” literally did not exist. Until then, he writes, it was assumed that all discovery was in reality rediscovery of lost wisdom from the ancients. This claim depends on denying that the Latin invenire (“to find or invent”) could ever mean “discover”, even though it was the word that Columbus used to report his discovery.

Other concepts, Wootton points out, could and did exist before their words were coined. Scientific experiments were performed (by Ptolemy, Galen, Alhazen, and so on) before the term “experiment” became commonplace; but what was new in the 17th century, he suggests, was a new respect for experimentation as a path to knowledge and a new “experimental network” for knowledge-sharing. On the other hand, as Wootton shows, the idea of “laws of nature” really was new and depended on the idea of a law-making God. Scientific notions of facts and evidence are shown to have emerged from the law courts. Overall, Wootton justifies nicely his argument that we “tend to overestimate the importance of new technology and underestimate the rate of production and the impact of new intellectual tools”.

This book is one of those for which the reviewer’s term “magisterial” inevitably suggests itself. It is a splendid object, with beautiful text design and typesetting and generous illustrations. It is tremendously good in its deep investigations into how the moon was mapped by telescope, early experiments in creating near-vacuums, or the invention of the first steam engines. (“In order to understand steam engines,” Wootton advises, “it may be helpful to think about methods of making coffee.” I can report that it was.)

The book is less persuasive, however, when it veers into literary and philosophical territory. Wootton thinks he knows that William Shakespeare had “no sense of history” and, indeed, that the playwright “imagined ancient Rome as just like contemporary London but with sunshine and togas”. (Reviewing A D Nuttall’s Shakespeare the Thinker in 2008, Wootton protested, “We don’t actually know what Shakespeare thought” – a view that required no revision.)

More threatening to the pleasure of the disinterested reader will be the significant proportion of the book that constitutes an extended and aggressive warming-over of the “science wars” of the 1990s. For The Invention of Science is not only a history of science but a revisionist historiography of science, in which Wootton attacks allegedly homogeneous schools called “the sociologists of science” and “the cultural determinists”, expending thousands of testy words situating himself carefully between two implausible views, the extreme versions of which are held by almost no one. (He also tries to nudge rival historians and philosophers of science – particularly Thomas S Kuhn, the author of the seminal The Structure of Scientific Revolutions – towards one or other less defensible end of the spectrum.)

Despite Wootton’s protestations, very few people are still so “relativist” that they believe that scientific knowledge is nothing but socially constructed and that it is therefore impossible to say that quantum physics is superior to the theory of the four bodily humours. As few, or fewer, people imagine that scientific knowledge is a transparent window on the truth about the ultimate nature of reality. Wootton eventually concludes that the truth lies somewhere in the middle. Which is, as he would no doubt happily admit, a view that Aristotle would long ago have endorsed.

The Invention of Science: a New History of the Scientific Revolution by David Wootton is published by Allen Lane (769pp, £30)

Steven Poole’s books include Who Touched Base in My Thought Shower? (Sceptre)

This article first appeared in the 03 December 2015 issue of the New Statesman, Syria and the impossible war


Yes, you could skip brunch and save for a deposit on a house. But why?

You'd be missing out. 

There’s a tiny café round the corner from me, a place so small that you have to leave your Bugaboo pushchair outside (a serious consideration in this part of the world), which has somehow become famous across town for its brunch. At weekends, the queue spills on to the road, with people patiently waiting for up to an hour for pancakes, poached eggs and pondy-looking juices served in jam jars. The food is just as good later on, yet there’s rarely much of a line after 2pm, because brunch is cool in a way that lunch isn’t. Where lunch is quotidian, brunch feels decadent – a real weekend treat.

Though the phenomenon is hardly new – the term was coined by a Brit back in 1895 – brunch has always been more popular in the United States than here, possibly because it’s a meal that you generally go out for and eating out has long been more affordable, and thus common, across the pond. Despite our proud greasy-spoon heritage, the idea of brunch as an occasion with a distinct character, rather than just a wickedly late breakfast, is relatively recent, and it owes much to the increasing informality of 21st-century life.

The Little Book of Brunch by Caroline Craig and Sophie Missing revels in the freedom that the occasion bestows upon the cook, falling as it does outside the long-established conventions of the three-meal structure. “It’s the meal where you can get away with anything,” they write.

By way of proof, along with eggs Benedict and buttermilk waffles, the book features such novelties as ’nduja-and-egg pizza, spaghetti frittatas and lentil falafels – dishes that you could quite respectably serve for lunch or dinner, yet which also have the cosseting, comforting qualities necessary in a first meal of the day.

Though such culinary experimentation is no doubt attractive to the increasingly adventurous British palate, I suspect that the arrival on these shores of the “bottomless brunch”, a hugely popular trend in the US, may also have something to do with our new enthusiasm for the meal – to the concern of health experts, given that Americans seem better able than we are to grasp the idea of drinking only as many Bloody Marys as they can handle, rather than as many as they want.

As David Shaftel put it in an op-ed for the New York Times entitled, wonderfully, “Brunch is for jerks”, this meal is “about throwing out not only the established schedule but also the social conventions of our parents’ generation . . . revelling in the naughtiness of waking up late, having cocktails at breakfast and eggs all day. It’s the mealtime equivalent of a Jeff Koons sculpture.”

The Australian social commentator Bernard Salt agrees, blaming this taste for “smashed avocado with crumbled feta on five-grain toasted bread at $22 a pop” for the younger generation’s failure to grow up, take responsibility and save enough money to buy a house. But as critics observed, house prices in Sydney, like those in the UK, are now so high that you’d have to forgo your weekly avo toast for 175 years in order to put together a deposit, and so, perhaps, it’s not unreasonable to want to live in the moment instead. “We are not going out for brunch instead of buying houses: we are brunching because we cannot afford to buy houses,” as the journalist Brigid Delaney wrote in response.

Baby boomers got the free education, the generous pensions and the houses and left us with shakshuka, sourdough and a flat white. Seems like a fair deal. 

Felicity Cloake is the New Statesman’s food columnist. Her latest book is The A-Z of Eating: a Flavour Map for Adventurous Cooks.

This article first appeared in the 20 April 2017 issue of the New Statesman, May's gamble
