Faltering Nintendo will be just fine if it moves into tablets

Nintendo's losing money, and won't put its games on iOS or Android because it doesn't trust anyone else's hardware - so why not start making tablets for gamers?

When the Wii U was first announced at E3 in 2011, one crucial detail was left out by Nintendo of America’s president, Reggie Fils-Aime - whether it was a new console or not. It was introduced as “a new gaming companion”, a logical next step to the Wii’s knock-out success at bringing casual gamers, families and friends together. The videos showed the new touchscreen controller from every angle, but not the new box that it was meant to connect to - the new box that looked almost identical to the old one.

Nintendo’s president, Satoru Iwata, admitted at the time that it wasn’t a perfect launch, even if he stopped short of calling it a “blunder”. The problem is, Nintendo’s still struggling against that misconception. Here’s Polygon on Nintendo’s latest “hey guys, did you know the Wii U is an entirely new console?” ad campaign:

"Some have the misunderstanding that Wii U is just Wii with a pad for games, and others even consider Wii U GamePad as a peripheral device connectable to Wii," said Iwata during the company's financial results briefing earlier this year. "We feel deeply responsible for not having tried hard enough to have consumers understand the product."

Iwata said at the time that Nintendo would endeavour to help consumers understand the console and bulk up its software line-up to help the Wii U regain its sales momentum.

Nintendo issued a message to Wii owners in May outlining that its new hardware is not a Wii upgrade but an "all-new home console from Nintendo" that "will change the way you and your family experience games and entertainment."

This week, Nintendo announced that it had made its first annual loss in more than 30 years - as long as it has been in the console business - and that it had slashed its 2013 sales projections for the Wii U from 9m units to 2.8m. Its shares tumbled 6.2 percent, bringing their total drop in value since 2009 to 65 percent. We’re a long way from the heady “Nintendo: We print money!” headlines of five or six years ago, when the DS and Wii were dominant.

Not that Nintendo is likely to fold any time soon, or even consider itself no longer a console company, as happened to Sega in 2001 after the Dreamcast bombed. As Keza MacDonald at IGN points out, Nintendo effectively has $10bn in cash reserves from its last three decades of pretty much constant profitability, so it can suck up a few years of losses while it figures out where to go next. That’s the key issue.

The 3DS isn’t as successful as the DS was, and isn’t quite making its projections - which is understandable, as the mobile gaming market has been pretty comprehensively altered by smartphones and tablets - but it’s still a success. It’s just not as successful as it could be, and it’s certainly not compensating for the flat-lining Wii U.

The big third-party games aren’t on the Wii U, it’s underpowered compared with the Xbox One and the PS4, and its key gimmick - that controller - isn’t particularly impressive. As for the Wii’s innovative motion controls, well, Microsoft and Sony have pretty comprehensively copied them. Kinect’s arguably a lot better at it, too. Grandma and grandpa don’t really see why they need a new console, either, when the one they bought just a few years ago still works fine.

Nintendo’s been adept at pulling radical, industry-changing escapes from irrelevancy before. So, in that spirit, here’s a proposal - Nintendo needs to expand its product categories to include tablets and smartphones, running Android.

Not stock Android, of course - it would be rejigged (or “forked”, in developer lingo) to conform to Nintendo’s aesthetic and anti-piracy demands, no doubt. The success of Amazon’s Kindle Fire range, which uses a custom Android build, shows there are viable niches for devices that excel in one area - the Kindle positions itself as the tablet for readers, but imagine a Nintendo smartphone and/or tablet that offered access both to the massive range of normal Android apps and exclusive Nintendo games, both classic and new.

Nintendo could be the company that produces the definitive gaming tablet. Hell, it’s already halfway there with its eShop - it just needs to work on getting a larger range of licences for classic titles from older publishers, and it’ll be golden. There's also a good argument (as made by Wired's Chris Kohler) that Nintendo's charging too much for older games, considering how much they may have dated.

There’s no doubt that the industry trend is towards device convergence. People are less and less tolerant of having to carry around more than one device for gaming. The key for Nintendo is to offer a device that could conceivably be that single device, while also offering the things Nintendo needs to make its games work - like, say, physical buttons. Have you tried playing some of the old Sonic ports on normal tablets? They’re horrid and sluggish to play with a virtual, on-screen d-pad.

It’s a boring cliché for writers to call for Nintendo to make games for Android or iOS - or even to port older Game Boy games, like the first Pokémon games, over - but the company has always resisted, because its design philosophy has long been that it can’t guarantee software quality without also being in control of the hardware.

It’s not dissimilar to Apple’s approach, frankly, and since that approach has served Apple pretty well so far, it’s not something Nintendo would conceivably sacrifice so easily. Staying out of the general marketplace by sticking to its own device would also help Nintendo avoid an absolutely critical mistake: sacrificing game quality in favour of the quick, small, freemium model favoured on smartphones. Nobody wants to see Nintendo reduced to that.

Go the other way, instead, and create a device that offers access to the library the rest of the world wants, plus Nintendo quality on top. Have the NinTablet or NintenPhone link up to the Wii U’s successor too, if Shigeru Miyamoto insists upon the dual-screen thing - but accept that the era of single-purpose devices for the living room is over, and take that into account when working on the Wii U 2. History has shown that as long as Nintendo’s mobile business has been healthy, the company thrives.

Playing a Wii U at E3 in 2012. (Photo: Getty)

Ian Steadman is a staff science and technology writer at the New Statesman. He is on Twitter as @iansteadman.


The City of London was never the same after the "Big Bang"

Michael Howard reviews Iain Martin's new book on the legacy of the financial revolution 30 years on.

We are inundated with books that are, in effect, inquests on episodes of past failure, grievous mistakes in policy decisions and shortcomings of leadership. So it is refreshing to read this lively account of a series of actions that add up to one of the undoubted, if not undisputed, successes of modern government action.

Iain Martin has marked the 30th anniversary of the City’s Big Bang, which took place on 27 October 1986, by writing what he bills as the inside story of a financial revolution that changed the world. Yet his book ranges far and wide. He places Big Bang in its proper context in the history of the City of London, explaining, for example, and in some detail, the development of the financial panics of 1857 and 1873, as well as more recent crises with which we are more familiar.

Big Bang is the term commonly applied to the changes in the London Stock Exchange that followed an agreement reached between Cecil Parkinson, the then secretary of state for trade and industry, and Nicholas Goodison, the chairman of the exchange, shortly after the 1983 election. The agreement provided for the dismantling of many of the restrictive practices that had suited the cosy club of those who had made a comfortable living on the exchange for decades. It was undoubtedly one of the most important of the changes made in the early 1980s that equipped the City of London to become the world’s pre-eminent centre of international capital that it is today.

But it was not the only one. There was the decision early in the life of the Thatcher government to dismantle foreign-exchange restrictions, as well as the redevelopment of Docklands, which provided room for the physical expansion of the City (which was so necessary for the influx of foreign banks that followed the other changes).

For the first change, Geoffrey Howe and Nigel Lawson, at the Treasury at the time, deserve full credit, particularly as Margaret Thatcher was rather hesitant about the radical nature of the change. The second was a result of Michael Heseltine setting up the London Docklands Development Corporation, which assumed planning powers that were previously in the hands of the local authorities in the area. Canary Wharf surely would not exist today had that decision not been made – and even though the book gives a great deal of well-deserved credit to the officials and developers who took up the baton, Heseltine’s role is barely mentioned. Rarely is a politician able to see the physical signs of his legacy so clearly. Heseltine would be fully entitled to appropriate Christopher Wren’s epitaph: “Si monumentum requiris, circumspice.”

These changes are often criticised for having opened the gates to unbridled capitalism and greed and Martin, while acknowledging the lasting achievements of the new regime, also explores its downside. Arguably, he sometimes goes too far. Are the disparities in pay that we now have a consequence of Big Bang? Can it be blamed for the increase in the pay of footballers? This is doubtful. Surely these effects owe more to market forces, in the case of footballers, and shortcomings in corporate governance, in the case of executive pay. (It will be interesting to see whether the attempts by the current government to address the latter achieve the desired results.)

Martin deals with the allegation that the changes brought in a new world in which moneymaking could be given full rein without the need to abide by any significant regulation. This is far from the truth. My limited part in bringing about these changes was the responsibility I was handed, in my first job in government, for steering through parliament what became the Financial Services Act 1986. This was intended to provide statutory underpinning for a system of self-regulation by the various sectors of the financial industry. It didn’t work out exactly as I had intended but, paradoxically, one of the main criticisms of the regulatory system made in the book is that we now have a system that is too legalistic. Rather dubious comparisons are made with a largely mythical golden age, when higher standards of conduct were the order of the day without any need for legal constraints. The history of insider dealing (and the all-too-recently recognised need to legislate to make this unlawful) gives the lie to this rose-tinted picture of life in the pre-Big Bang City.

As Martin rightly stresses, compliance with the law is not enough. People also need to take into account the moral implications of their conduct. However, there are limits to the extent to which governments can legislate on this basis. The law can provide the basic parameters within which legal behaviour is to be constrained. Anything above and beyond that must be a matter for individual conscience, constrained by generally accepted standards of morality.

The book concludes with an attempt at an even-handed assessment of the likely future for the City in the post-Brexit world. There are risks and uncertainties. Mercifully, Martin largely avoids a detailed discussion of the Markets in Financial Instruments Directive and its effect on “passporting”, which allows UK financial services easy access to the European Economic Area. But surely the City will hold on to its pre-eminence as long as it retains its advantages as a place to conduct business? The European banks and other institutions that do business in London at present don’t do so out of love or affection. They do so because they are able to operate there with maximum efficiency.

The often rehearsed advantages of London – the time zone, the English language, the incomparable professional infrastructure – will not go away. It is not as if there is an abundance of capital available in the banks of the EU: Europe’s business and financial institutions cannot afford to dispense with the services that London has to offer. As Martin puts it in the last sentences of the book, “All one can say is: the City will survive, and prosper. It usually does.”

Crash Bang Wallop is not flawless. (One of its amusing errors is to refer, in the context of a discussion of the difficulties faced by the firm Slater Walker, to one of its founders as Jim Walker, a name that neither Jim Slater nor Peter Walker, the actual founders, would be likely to recognise.) Yet it is a thoroughly readable account of one of the most important and far-reaching decisions of modern government, and a timely reminder of how the City of London got to where it is now.

Michael Howard is a former leader of the Conservative Party

This article first appeared in the 20 October 2016 issue of the New Statesman, Brothers in blood