New Statesman screening at BFI Southbank

A few tickets still available for No Man's Land and panel discussion.

A reminder that the New Statesman film screening and panel discussion takes place at BFI Southbank in London tomorrow (13 February).

Danis Tanovic's 2001 film No Man's Land will be shown in NFT2 at 12.45pm. A panel discussion on the way the media handle conflict, chaired by the New Statesman's culture editor, Jonathan Derbyshire, will follow at 2.30pm.

Panellists:

James Gow is professor of international peace and security at King's College London, and director of the International Peace and Security Programme. Between 1994 and 1998, he served as an expert adviser and expert witness for the Office of the Prosecutor at the UN International Criminal Tribunal for the Former Yugoslavia, where he was involved in establishing subject-matter jurisdiction. He was also the first witness to give evidence in the Trial Chamber and the first person ever to give evidence at an international criminal tribunal. Professor Gow has subsequently continued to work with the tribunal. His most recent book is War, Image, Legitimacy: Viewing Contemporary Conflict (with Milena Michalski).

Dr Gregory Kent has written about the break-up of Yugoslavia both as a journalist and as an academic. He is the author of Framing War and Genocide: British Policy and Media Reaction to the War in Bosnia. As director of graduate studies in human rights and international relations at Roehampton University, London, his broad research interests include historical and political issues in war and genocide, and the problems of political communication in such contexts.

James Rodgers is Europe regional editor at the BBC World Service. He spent ten years as a foreign correspondent, during which time he reported from Chechnya, Gaza -- where he was the only international correspondent permanently based in the territory -- and Iraq, where he was one of the first journalists to get to Saddam Hussein's bunker following the Iraqi dictator's capture in 2003. He is currently writing a book on conflict reporting.

To receive a complimentary ticket, email events@newstatesman.co.uk


How science and statistics are taking over sport

An ongoing challenge for analysts is to disentangle genuine skill from chance events. Some measurements are more useful than others.

In the mid-1990s, statistics undergraduates at Lancaster University were asked to analyse goal-scoring in a hypothetical football match. When Mark Dixon, a researcher in the department, heard about the task, he grew curious. The analysis employed was a bit simplistic, but with a few tweaks it could become a powerful tool. Along with his fellow statistician Stuart Coles, he expanded the methods, and in doing so transformed how researchers – and gamblers – think about football.

The UK has always lagged behind the US when it comes to the mathematical analysis of sport. This is partly because of a lack of publicly available match data, and partly because of the structure of popular sports. A game such as baseball, with its one-on-one contests between pitcher and batter, can be separated into distinct events. Football is far messier, with a jumble of clashes affecting the outcome. It is also relatively low-scoring, in contrast to baseball or basketball – further reducing the number of notable events. Before Dixon and Coles came along, analysts such as Charles Reep had even concluded that “chance dominates the game”, making predictions all but impossible.

Successful prediction is about locating the right degree of abstraction. Strip away too much detail and the analysis becomes unrealistic. Include too many processes and it becomes hard to pin them down without vast amounts of data. The trick is to distil reality into key components: “As simple as possible, but no simpler,” as Einstein put it.

Dixon and Coles did this by focusing on three factors – attacking and defensive ability for each team, plus the fabled “home advantage”. With ever more datasets now available, betting syndicates and sports analytics firms are developing these ideas further, even including individual players in the analysis. This requires access to a great deal of computing power. Betting teams are hiring increasing numbers of science graduates, with statisticians putting together predictive models and computer scientists developing high-speed software.
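To make that concrete, here is a minimal sketch of a Dixon-and-Coles-style calculation, assuming independent Poisson scoring and entirely invented team ratings (the attack, defence and home-advantage figures below are illustrative, not fitted to real results): each side's expected goals come from its own attack strength, the opponent's defensive weakness and, for the hosts, a home-advantage multiplier.

```python
import numpy as np

# Illustrative ratings only -- in practice these are estimated from historical results.
teams = {
    # name: (attack strength, defensive weakness)
    "Reds":  (1.4, 0.9),
    "Blues": (1.1, 1.2),
}
HOME_ADVANTAGE = 1.3  # hosts score roughly 30% more, all else equal (assumed value)

def expected_goals(home, away):
    """Expected goals for (home, away) under a simple Poisson model."""
    h_att, h_def = teams[home]
    a_att, a_def = teams[away]
    lam_home = h_att * a_def * HOME_ADVANTAGE
    lam_away = a_att * h_def
    return lam_home, lam_away

def simulate_match(home, away, n=100_000, seed=0):
    """Estimate win/draw/loss probabilities by simulating many matches."""
    rng = np.random.default_rng(seed)
    lam_h, lam_a = expected_goals(home, away)
    gh = rng.poisson(lam_h, n)
    ga = rng.poisson(lam_a, n)
    return {"home win": np.mean(gh > ga),
            "draw":     np.mean(gh == ga),
            "away win": np.mean(gh < ga)}

print(simulate_match("Reds", "Blues"))
```

The published Dixon-Coles model also adds a correction for the dependence between low-scoring outcomes such as 0-0 and 1-1, which an independent-Poisson sketch like this one ignores.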

But it’s not just bettors who are turning to statistics. Many of the techniques are also making their way into sports management. Baseball led the way, with quantitative Moneyball tactics taking the Oakland Athletics to the play-offs in 2002 and 2003, but other sports are adopting scientific methods, too. Premier League football teams have gradually built up analytics departments in recent years, and all now employ statisticians. After winning the 2016 Masters, the golfer Danny Willett thanked the new analytics firm 15th Club, an offshoot of the football consultancy 21st Club.

Bringing statistics into sport has many advantages. First, we can test out common folklore. How big, say, is the “home advantage”? According to Ray Stefani, a sports researcher, it depends: rugby union teams, on average, are 25 per cent more likely to win than to lose at home. In NHL ice hockey, this advantage is only 10 per cent. Then there is the notion of “momentum”, often cited by pundits. Can a few good performances give a weaker team the boost it needs to keep winning? From baseball to football, numerous studies suggest it’s unlikely.
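Testing such folklore needs little more than a table of results. Here is a sketch using invented fixtures and one simple convention for "home advantage" (the excess of home wins over away wins as a share of matches, which is not necessarily the measure Stefani uses):

```python
# Invented results for illustration: (home_score, away_score) per match.
results = [(3, 1), (2, 2), (0, 1), (4, 0), (1, 1), (2, 1), (0, 2), (3, 2)]

home_wins = sum(h > a for h, a in results)
away_wins = sum(h < a for h, a in results)

# One simple convention: excess of home wins over away wins, as a share of all matches.
home_edge = (home_wins - away_wins) / len(results)
print(f"Home teams won {home_wins}, lost {away_wins}; edge = {home_edge:+.0%}")
```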

Statistical models can also help measure player quality. Teams typically examine past results before buying players, though it is future performances that count. What if a prospective signing had just enjoyed a few lucky games, or been propped up by talented team-mates? An ongoing challenge for analysts is to disentangle genuine skill from chance events. Some measurements are more useful than others. In many sports, scoring goals is subject to a greater degree of randomness than creating shots. When the ice hockey analyst Brian King used this information to identify the players in his local NHL squad who had profited most from sheer luck, he found that these were also the players being awarded new contracts.
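One way to see why scoring is noisier than shot creation is to simulate it. In the sketch below, every player has a fixed true shot rate and converts an assumed 10 per cent of shots; splitting a season in two and asking how well the first half predicts the second shows the shot counts holding up better than the goal counts (all numbers are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(1)
n_players, games_per_half = 200, 19                  # two halves of a 38-game season
true_shot_rate = rng.uniform(0.5, 4.0, n_players)    # shots per game (invented spread)
conversion = 0.10                                    # assumed conversion rate for everyone

def half_season():
    """Simulate one half-season of shots and goals for every player."""
    shots = rng.poisson(true_shot_rate * games_per_half)
    goals = rng.binomial(shots, conversion)
    return shots, goals

shots1, goals1 = half_season()
shots2, goals2 = half_season()

# How well does the first half of the season predict the second?
print("shots, half-to-half correlation:", round(np.corrcoef(shots1, shots2)[0, 1], 2))
print("goals, half-to-half correlation:", round(np.corrcoef(goals1, goals2)[0, 1], 2))
```

The goal counts carry the same underlying skill signal as the shot counts, plus an extra layer of conversion noise, so they track less well from one half of the season to the next.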

Sometimes it’s not clear how a specific skill should be measured. Successful defenders – whether in British or American football – don’t always make a lot of tackles. Instead, they divert attacks by being in the right position. It is difficult to quantify this. When evaluating individual performances, it can be useful to estimate how well a team would have done without a particular player, which can produce surprising results.

The season before Gareth Bale moved from Tottenham Hotspur to Real Madrid for a record £85m in 2013, the sports consultancy Onside Analysis looked at which players were most important to the team: whose absence would cause the most disruption? Although Bale was the clear star, it was actually the midfielder Moussa Dembélé who had the greatest impact on results.
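A crude version of that "with and without" comparison can be run on nothing more than team sheets and results, as in the sketch below. The fixtures and line-ups are invented, and a serious analysis would also control for opposition strength, venue and the other players on the pitch.

```python
# Invented data: (league points won, players who started) for a handful of matches.
matches = [
    (3, {"A", "B", "C"}),
    (1, {"A", "C"}),
    (3, {"B", "C"}),
    (0, {"A", "C"}),
    (3, {"A", "B", "C"}),
    (1, {"C"}),
]

def points_with_and_without(player):
    """Average points per game when the player starts versus when they don't."""
    with_p    = [pts for pts, eleven in matches if player in eleven]
    without_p = [pts for pts, eleven in matches if player not in eleven]
    avg = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return avg(with_p), avg(without_p)

for player in ("A", "B"):
    w, wo = points_with_and_without(player)
    print(f"Player {player}: {w:.2f} points per game with, {wo:.2f} without")
```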

As more data is made available, our ability to measure players and their overall performance will improve. Even so, statistical models cannot capture everything. Not only would a complete understanding of sport be dull – it would be impossible. Analytics groups know this, and they often employ experts to keep their models grounded in reality.

There will never be a magic formula that covers all aspects of human behaviour and psychology. However, for the analysts helping teams punch above their weight and the scientific betting syndicates taking on the bookmakers, this is not the aim. Rather, analytics is one more way to get an edge. In sport, as in betting, the best teams don’t get it right every time. But they know how to win more often than their opponents. 

Adam Kucharski is author of The Perfect Bet: How Science and Maths are Taking the Luck Out of Gambling (Profile Books)

This article first appeared in the 28 April 2016 issue of the New Statesman, The new fascism