End of an era?

Despite the economic downturn, money still dominated the conversations at last week's Frieze Art Fair.

[The organisers of the Frieze Art Fair report that this year's event offers "clear evidence of renewed confidence in the contemporary art market". Our guest blogger Dany Louise reflects on what Frieze says about art, commodification and aesthetic judgement.]

The Frieze Art Fair was born out of the dwindling power of the public sector and the cult of entrepreneurialism. It is art primarily as big business, not as public service.

The famous Frieze tent is both a literal and a metaphorical marketplace: a demonstration of the market's dominance, not just of the art world, but of public discourse, too. We have reached a point where no alternative to the market model is taken seriously. There was some exceptionally good art on show at Frieze, but it is inescapable that much of that work is valued primarily for its financial potential rather than its intrinsic worth. It is also clear that, at the stratospheric end of the financial spectrum, the excessive rewards given to certain artistic brand names are both ludicrous and disturbing.

Critically acclaimed artists with a proven body of work made over many years sell for decent prices: Andreas Gursky's magnificent Kathedrale at White Cube sold for €500,000. Two emergent artists whose idiosyncrasy, skill and vision stand out sell for prices that are reasonable in comparison to others -- a sumptuous Ged Quinn painting was sold for £55,000, and one of Natalie Djurberg's Venice Biennale "claymation" films is "still cheap" at £14,000, for one of an edition of four. Which, if the gallery takes 50 per cent, leaves the artist with a taxable £7,000 for a piece of work that undoubtedly took £7,000 of labour, probably more. And that's without even attempting to quantify her intellectual property. Multiply by four for the whole edition and she will receive £28,000. Here is one artist who is not being over-rewarded. (It's interesting to note the strategy of artificial scarcity being applied to moving-image works.)

It has become a truism that the price of art is frequently dissociated from critical judgement of its quality. The market over the past ten years has become speculative, in effect a reflection of what very rich collectors and/or investment funds want to buy for future gain. There is a gaping disconnect between what is happening in the country's art schools, which are concerned with giving artists a theoretical underpinning for their work, and the art marketplace. The Frieze Foundation attempts to address this with specific commissions from working artists. But these were fairly low-profile at this year's fair, with the exception of Ryan Gander's project We are Constant.

If the marketplace is to have such influence, the role of our public-sector museums and galleries as arbiters of judgement and status becomes ever more important. Yes, curators come to Frieze to see particular work and discuss future exhibitions in the public sector, and museum directors buy at the fair. But the ratcheting up of prices over the past decade has priced public collections out of the market for a great deal of work, in the UK at least. This is notwithstanding the Outset/Frieze Art Fair Fund for the benefit of the Tate collection, which bought six works costing £120,000 in total -- welcome, but pennies in the grand scheme of things.

Frieze holds a mirror up to an era in which money doesn't just talk very loudly, it all but overwhelms the conversation.


How science and statistics are taking over sport

An ongoing challenge for analysts is to disentangle genuine skill from chance events. Some measurements are more useful than others.

In the mid-1990s, statistics undergraduates at Lancaster University were asked to analyse goal-scoring in a hypothetical football match. When Mark Dixon, a researcher in the department, heard about the task, he grew curious. The analysis employed was a bit simplistic, but with a few tweaks it could become a powerful tool. Along with his fellow statistician Stuart Coles, he expanded the methods, and in doing so transformed how researchers – and gamblers – think about football.

The UK has always lagged behind the US when it comes to the mathematical analysis of sport. This is partly because of a lack of publicly available match data, and partly because of the structure of popular sports. A game such as baseball, with its one-on-one contests between pitcher and batter, can be separated into distinct events. Football is far messier, with a jumble of clashes affecting the outcome. It is also relatively low-scoring, in contrast to baseball or basketball – further reducing the number of notable events. Before Dixon and Coles came along, analysts such as Charles Reep had even concluded that “chance dominates the game”, making predictions all but impossible.

Successful prediction is about locating the right degree of abstraction. Strip away too much detail and the analysis becomes unrealistic. Include too many processes and it becomes hard to pin them down without vast amounts of data. The trick is to distil reality into key components: “As simple as possible, but no simpler,” as Einstein put it.

Dixon and Coles did this by focusing on three factors – attacking and defensive ability for each team, plus the fabled “home advantage”. With ever more datasets now available, betting syndicates and sports analytics firms are developing these ideas further, even including individual players in the analysis. This requires access to a great deal of computing power. Betting teams are hiring increasing numbers of science graduates, with statisticians putting together predictive models and computer scientists developing high-speed software.
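The core of this approach can be illustrated with a toy model. The sketch below is not Dixon and Coles's actual formulation (their published model adds a correction for low-scoring results); it simply treats each side's goals as independent Poisson counts driven by attack strength, the opponent's defensive weakness and a home-advantage multiplier. All the numerical strengths are invented for illustration.

```python
import math

def match_probs(attack_home, defence_home, attack_away, defence_away,
                home_adv=1.3, max_goals=10):
    """Home-win/draw/away-win probabilities from independent Poisson
    goal counts -- the skeleton of a Dixon-Coles-style model."""
    # Expected goals: attacking strength times the opponent's defensive
    # weakness, with the home side boosted by the home-advantage factor.
    mu_home = attack_home * defence_away * home_adv
    mu_away = attack_away * defence_home

    def pois(k, mu):
        return math.exp(-mu) * mu ** k / math.factorial(k)

    home_win = draw = away_win = 0.0
    for h in range(max_goals + 1):        # enumerate plausible scorelines
        for a in range(max_goals + 1):
            p = pois(h, mu_home) * pois(a, mu_away)
            if h > a:
                home_win += p
            elif h == a:
                draw += p
            else:
                away_win += p
    return home_win, draw, away_win

# Hypothetical strengths: a strong home attack against a leaky away defence.
hw, d, aw = match_probs(1.8, 1.0, 1.1, 1.2)
print(f"home {hw:.2f}, draw {d:.2f}, away {aw:.2f}")
```

In practice the attack and defence parameters are fitted to historical results; the betting syndicates mentioned above extend the same skeleton with player-level and in-game data.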

But it’s not just bettors who are turning to statistics. Many of the techniques are also making their way into sports management. Baseball led the way, with quantitative Moneyball tactics taking the Oakland Athletics to the play-offs in 2002 and 2003, but other sports are adopting scientific methods, too. Premier League football teams have gradually built up analytics departments in recent years, and all now employ statisticians. After winning the 2016 Masters, the golfer Danny Willett thanked the new analytics firm 15th Club, an offshoot of the football consultancy 21st Club.

Bringing statistics into sport has many advantages. First, we can test out common folklore. How big, say, is the “home advantage”? According to Ray Stefani, a sports researcher, it depends: rugby union teams, on average, are 25 per cent more likely to win than to lose at home. In NHL ice hockey, this advantage is only 10 per cent. Then there is the notion of “momentum”, often cited by pundits. Can a few good performances give a weaker team the boost it needs to keep winning? From baseball to football, numerous studies suggest it’s unlikely.

Statistical models can also help measure player quality. Teams typically examine past results before buying players, though it is future performances that count. What if a prospective signing had just enjoyed a few lucky games, or been propped up by talented team-mates? An ongoing challenge for analysts is to disentangle genuine skill from chance events. Some measurements are more useful than others. In many sports, scoring goals is subject to a greater degree of randomness than creating shots. When the ice hockey analyst Brian King used this information to identify the players in his local NHL squad who had profited most from sheer luck, he found that these were also the players being awarded new contracts.
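A small simulation shows why shot counts are a steadier skill signal than goal counts. In the sketch below, every simulated player has identical underlying ability: the same shot rate and the same conversion probability. Finishing is pure chance, so season-to-season goal totals spread out proportionally far more than shot totals do, which is exactly the luck that King's analysis was designed to expose. All the rates are invented, not taken from real data.

```python
import random

random.seed(42)

GAMES, TRIALS_PER_GAME, SHOT_PROB, CONVERSION = 40, 6, 0.5, 0.1

def simulate_season():
    """One season for a player of exactly league-average skill."""
    # Shots: a few Bernoulli shooting chances per game.
    shots = sum(1 for _ in range(GAMES * TRIALS_PER_GAME)
                if random.random() < SHOT_PROB)
    # Goals: each shot converts with fixed probability -- pure luck here.
    goals = sum(1 for _ in range(shots) if random.random() < CONVERSION)
    return shots, goals

seasons = [simulate_season() for _ in range(20)]
shots, goals = zip(*seasons)

def rel_spread(xs):
    """Coefficient of variation: standard deviation relative to the mean."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return var ** 0.5 / mean

print(f"shots spread {rel_spread(shots):.3f}, "
      f"goals spread {rel_spread(goals):.3f}")
```

Despite identical "skill", the relative spread in goals comes out several times larger than in shots -- a player's hot scoring season may say little about next year.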

Sometimes it’s not clear how a specific skill should be measured. Successful defenders – whether in British or American football – don’t always make a lot of tackles. Instead, they divert attacks by being in the right position. It is difficult to quantify this. When evaluating individual performances, it can be useful to estimate how well a team would have done without a particular player, which can produce surprising results.

The season before Gareth Bale moved from Tottenham Hotspur to Real Madrid for a record £85m in 2013, the sports consultancy Onside Analysis looked at which players were more important to the team: whose absence would cause most disruption? Although Bale was the clear star, it was actually the midfielder Moussa Dembélé who had the greatest impact on results.
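A crude version of that with/without comparison can be sketched in a few lines. The match records below are entirely invented (and Onside Analysis's actual methodology is more sophisticated), but they illustrate the calculation: compare points per game when a player featured against points per game when he didn't.

```python
# Hypothetical match records: (league points won, players who featured).
matches = [
    (3, {"Bale", "Dembele", "Lloris"}),
    (1, {"Bale", "Lloris"}),
    (0, {"Bale", "Lloris"}),
    (3, {"Dembele", "Lloris"}),
    (3, {"Bale", "Dembele"}),
    (1, {"Dembele", "Lloris"}),
]

def impact(player):
    """Points-per-game difference with vs without the player."""
    with_p = [pts for pts, squad in matches if player in squad]
    without = [pts for pts, squad in matches if player not in squad]
    if not with_p or not without:
        return None  # ever-present (or never played): no comparison possible
    return sum(with_p) / len(with_p) - sum(without) / len(without)

for name in ("Bale", "Dembele"):
    print(name, round(impact(name), 2))
```

In this made-up data the less celebrated player comes out with the larger impact, echoing the real finding; a serious analysis would also need to control for opposition strength and which team-mates were on the pitch.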

As more data becomes available, our ability to measure players and their overall performance will improve. Yet statistical models cannot capture everything. Not only would complete understanding of sport be dull -- it would be impossible. Analytics groups know this and often employ experts to keep their models grounded in reality.

There will never be a magic formula that covers all aspects of human behaviour and psychology. However, for the analysts helping teams punch above their weight and the scientific betting syndicates taking on the bookmakers, this is not the aim. Rather, analytics is one more way to get an edge. In sport, as in betting, the best teams don’t get it right every time. But they know how to win more often than their opponents. 

Adam Kucharski is author of The Perfect Bet: How Science and Maths are Taking the Luck Out of Gambling (Profile Books)

This article first appeared in the 28 April 2016 issue of the New Statesman, The new fascism