Policy Exchange has a new report out today, and I’m not going to lie, my interest was piqued by the pun-tastic title, The Superfast and the Furious, because, wow.
Anyway, it makes a number of interesting recommendations, most of them pushing back against the recent trend of promoting the spread of so-called “superfast” broadband – usually delivered by fibre-optic cables, and largely confined to dense built-up areas.
Instead, the authors, Chris Yiu and Sarah Fink, argue that the government should refocus on helping the people who remain offline, since:
Whether or not the UK has the fastest superfast broadband relative to other countries is a redundant question.
There has always been a target of delivering broadband of at least 2Mbps to the 10 per cent of houses which won’t be able to get superfast broadband, and in fact, it’s that target which the report suggests may need to be recalibrated. It points out that setting an absolute level of what constitutes “acceptable” broadband speed is foolhardy: when the target was set, 2Mbps was fast; now it’s the minimum requirement to use iPlayer, a standard technology; tomorrow it may be too slow to do other things which we have come to expect as standard. One option they propose instead is to track “broadband poverty”, identifying the number of houses where the best broadband option is a certain percentage below the median.
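The relative measure they describe can be sketched as a simple calculation. To be clear, the function name, the sample data, and the 50 per cent threshold below are my own illustration of the idea, not figures from the report:

```python
from statistics import median

def broadband_poverty_count(best_speeds_mbps, threshold_fraction=0.5):
    """Count households whose best available broadband speed falls
    below a given fraction of the median best-available speed.

    The 0.5 threshold is illustrative; the report only says
    "a certain percentage below the median".
    """
    cutoff = median(best_speeds_mbps) * threshold_fraction
    return sum(1 for speed in best_speeds_mbps if speed < cutoff)

# Hypothetical data: best available speed (Mbps) for each household.
speeds = [2, 8, 24, 38, 50, 76, 76, 100]
print(broadband_poverty_count(speeds))  # median 44, cutoff 22 → 2 households
```

The appeal of a measure like this is that it never goes stale: as typical speeds rise, the cutoff rises with them, so the target tracks what counts as “acceptable” at the time rather than freezing a number like 2Mbps forever.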
The report is an important counter to the prevailing trend in internet policy, which seems to be driven a bit too much by the fact that superfast broadband is cool, while replacing miles of copper wire with slightly better copper wire in rural Cumbria isn’t. After all, the leap from no internet to some is far greater than the leap from fast to superfast – and the damage caused by having none at all is real and concerning. A recent Oxford University study found that “there are substantial educational advantages in teenagers being able to access the internet at home”, for instance, while the report itself cites the fact that small businesses which “embrace” the internet grow “substantially faster” than those which remain offline.
But the thing which the report misses is that there’s a second priority which ought to be key for the government to press for, and that’s reliability. The authors pass this off as a matter for competition:
For the general public, broadband price and reliability matter as much as raw speed, and the optimal trade-off will vary from home to home and over time. The best way through is to let the market balance different needs, which in turn requires effective competition between providers.
I’m not so sure that’s correct. Advertised reliability is certainly something which providers compete on, but due to the stickiness of the market, it appears that they rarely need to live up to those promises.
Increasingly, uptime, rather than speed, is the limiting factor in wider adoption of the “internet economy” which Yiu and Fink are so keen to trumpet (citing figures which show that around eight per cent of UK GDP is due to the internet); the fear, or experience, of a connection failure can lead to understandable reluctance to make too many operations dependent on the net. This is true of a number of hoped-for internet-driven productivity enhancements. Consider telecommuting, for example. Anyone who has experienced a multiple-day outage will know the fear that another could strike when crucial work is riding on the connection.
The question is whether more reliable connections can be achieved through the market alone. I have my doubts. The market for high-speed internet only really became competitive once the ASA cracked down on bogus speed claims – but providers have steered clear of making similarly testable claims about connection stability. And switching providers remains such a hassle that it exerts a massive drag on whatever pressure competition could otherwise bring to bear.
Still, we must hope for a b++++DROPPED CONNECTION++++