
The eclipse of the West

What has driven the new age of isolation - and the return of great power politics?


In May 2015 Russia held its Victory Day parade in Red Square, Moscow, to celebrate the 70th anniversary of Nazi Germany’s surrender at the end of the Second World War. The ceremony was boycotted by the country’s former Western allies in protest at Moscow’s interference in eastern Ukraine, though the military procession featured contingents from China and India. Addressing the crowd, President Vladimir Putin complained, “In the past decades, we have seen attempts to create a unipolar world” – by the United States, in cahoots with its allies. By the end of December 2016, with Russia claiming its version of success in the Syrian War and beginning to play kingmaker in Libya, Putin declared in an interview on Russian national television that Western efforts had failed. “We are already living in different times,” he said. “The global balance is gradually restoring.”

From Moscow to Beijing, there is no shortage of those ready to declare the “end of the American century”. Yet what is striking is how much traction this notion has gained in the West. In European capitals, the long-held habit of griping about America’s leadership in international affairs has been replaced by a growing concern about a world in which Washington’s commitment to internationalism is diminished. In the US, meanwhile, there was a time when creeping pessimism about the nation’s ability to shape the world would have seemed sacrilegious. Yet the post-mortems on the “age of unipolarity” – an era in which one power enjoyed a predominance of cultural, economic and military power in the international system – are coming thick and fast. There are trends at work that cannot be explained merely by the election of Donald Trump as president, though he is in part a beneficiary of them.

In Making the Unipolar Moment, Hal Brands describes what is happening as the natural passing of a phase in international affairs, brought about by the convergence of several historical forces, not least the implosion of the Soviet Union – America’s greatest rival – in 1991. Another interpretation, by Michael Mandelbaum in Mission Failure: America and the World in the Post-Cold War Era, is that the diminution of US power is in part the consequence of overstretch and blowback from its misguided zeal to reshape the world in its image. In this version of events, nothing did more damage than the attempt to bring liberal democracy to Iraq in 2003, with all the blood and treasure that was spent in pursuing the cause.

The new vogue for self-examination should not be confused with any abandonment of Washington’s aspirations to “primacy”. Despite the undeniable creep of world-weariness, it is no easy task to wean the US off its habit of “leading from the front”. In Trump’s formulation, it is time for America to start “winning again”. This does not imply a continuation of the humble retreat that began under Barack Obama. Yet there is no denying that a new narrative has taken hold. The rising power of China, the blunting of US power abroad and the stunting of growth at home have led to a realisation that “pre-eminence” cannot be taken for granted. It is for this reason that America’s international commitments – from Nato to the UN – are about to undergo an audit. Those of us who have got used to operating in this orbit must be prepared to be quicker on our feet.

Since the First World War, the question of “what America does next” has been more important to the security and health of the West than anything else. The truth, however, is that America has always been uncertain about the costs of the global leadership envisaged by President Woodrow Wilson and encapsulated in his “Fourteen Points”, outlined in January 1918. For much of the past century, to borrow Henry Kissinger’s formulation, it has been an “ambivalent superpower”. On both left and right, there has been incessant grumbling against elites who were thought to be preoccupied with America’s standing on the international stage to the detriment of the health of the republic at home.

The voices of the dissenters grew louder after the wars in Vietnam and Iraq, but they have been ever present in the debate. It is a mistake to see Donald Trump’s victory as a wild aberration from the American national story; rather, it was the forthright expression of sentiments that have bubbled under the surface for more than a century.

***

Trump’s plea to put “America first” has a long lineage, and so does the unvarnished assertion of commercial aggrandisement as the guiding light of foreign policy. The America First Committee, a vehicle for isolationist sentiments that opposed intervention in the Second World War, was dissolved in December 1941, four days after the Japanese attack on Pearl Harbor. Yet the sentiments that it expressed did not disappear. By the end of the war, as the US worked closely with the UK to create a new international system – fastened down through the Bretton Woods system, the creation of the United Nations and the building of Nato – there were many objections raised to the course of US foreign policy.

One was that Americans were picking up the bill for European security in a way that freed up the funds for a British experiment in socialism under the Labour prime minister Clement Attlee. Another, shared by many senior diplomats in the early stages of the Cold War, was that the British were taking advantage of the growing divide between the Soviet Union and the West to continue to pursue their imperial “great game” with the Russians in the Middle East and the eastern Mediterranean.

It was only after the triumph of foresighted American statecraft under the postwar secretary of state George Marshall that the US learned to take the long view and to come to terms with its superpower status. With leadership of the free world came a growing sense of the mission’s gravity. For some, it was a gift bestowed by providence; for others, it was something of a cross to bear. Either way, generations of American elites were trained to assume these global responsibilities.

The people who hold these views have not disappeared in the space of one presidential campaign. Before Donald Trump’s election, Washington was dominated by those who believed that America was the “indispensable nation”. Among this cohort were many liberal internationalists who were concerned about a growing perception of American retreat under Barack Obama. If Hillary Clinton had won the presidency, they would now be in the ascendancy.

It is worth pausing for a moment to consider this alternate reality. Clinton believed that, under Obama, the United States had been too reticent in asserting itself and too complacent in letting the US-led order decay in the Middle East, eastern Europe and the Asia-Pacific region. “Don’t do stupid shit” – Obama’s mantra – was, in Clinton’s view, an inadequate organising philosophy for a nation of this status and historical calling. Having served as secretary of state only during Obama’s first term, she was able to distance herself from aspects of his later foreign policy on Syria and Ukraine. Likely Clinton appointees, such as Michèle Flournoy, who was odds-on to be her secretary of defence, also stayed aloof during Obama’s second term. This was partly because they were confident that they would be granted the opportunity to do it better.

Those who hung on in the hope that a more activist foreign policy would emerge (such as the then US ambassador to the United Nations, Samantha Power) looked increasingly forlorn. There was something pathetic, in the true sense of the word, about the sight of Power, who rose to prominence as an anti-genocide campaigner, chiding Russia at the UN for its actions in Syria while the nation that she represented opted to stay on the sidelines.

During last year’s presidential election campaign, many of Hillary Clinton’s critics warned that she was a “liberal hawk” and more likely to engage the US in conflict overseas than Trump. The American left was not galvanised by the prospect of a return to the business of policing international order under Clinton. Bernie Sanders raised the alarm at Clinton’s interest in Henry Kissinger’s latest book, World Order (published in 2014), and at the way that she called Kissinger her friend.

On the right, the cost of US hegemony also became a live issue during the primaries. The many Republican foreign policy experts who placed a premium on the continuation of US leadership on the world stage were alarmed by the prospect of a Trump presidency. Their concerns manifested themselves in the “Never Trump” letter, which was signed by some of the most influential figures in the Republican national security establishment. In both the Democratic and the Republican Parties, therefore, the champions of a US-led world order have found themselves locked out in ever-growing numbers.

This trend did not start with Trump, even if he has given it the fullest exposition. The Obama world-view – sprinkled with moral philosophy and the theology of Reinhold Niebuhr – appealed to many sophisticated minds in the West. However, it turned out to be much more pessimistic, restrained, introspective and centred on America than it appeared in those heady days in 2009 when he won a Nobel Peace Prize. The anti-Bush he may have been; yet a world healer he was not, nor did he pretend to be one.

At first glance, Obama and Trump could not be more different, but they share at least two core convictions. The first is that the US has been too intoxicated by the old way of thinking about its power: an obsession with maintaining “credibility” and acting as the guarantor of global peace and security. The second is that the US is paying too high a price for the privilege. Thus Obama was willing to break away from the “Washington playbook” when he resisted pressure to take military action against Bashar al-Assad’s regime in Syria, after his “red lines” on the use of chemical and biological weapons were crossed. Those who despair that Trump respects no playbook must acknowledge that the one in the Oval Office was looking pretty dog-eared.

Of Trump’s foreign policy pronouncements to date, what has caused most panic in Western capitals is his suggestion that Nato, in its current form, is “obsolete”. Once again, however, we could do more to distinguish between the message and the messenger. America’s exasperation at the failure of its Nato allies to pull their weight on defence spending has been growing for years. It was Obama who announced what he called the “anti-free-rider campaign”, referring to the European nations that had grown lazy under the protection of the US security umbrella. Symptomatic of this, he hinted, was the poor performance of Britain and France in Libya following an intervention that they had pushed for in 2011.

As for the sanctity of Nato, there have been several senior European statesmen willing to play fast and loose with it long before Trump. Last year the French president, François Hollande, said: “Nato has no role at all to be saying what Europe’s relations with Russia should be . . . For France, Russia is not an adversary, not a threat. Russia is a partner.” In Britain, the leader of Her Majesty’s Opposition, Jeremy Corbyn, has stated that British troops stationed in Estonia are a provocation to Moscow and that Nato should have been wound up in 1990 along with the Warsaw Pact.

***

Those who speak of the imminent decline of the West often view it through the lens of the growing power of Asia, or in terms of the US’s declining competitiveness against new superpowers such as China and India. Yet the more immediate challenge is its internal fragmentation in the face of these pressures.

For Brexit Britain, access to new markets and centres of innovation in Asia is highly prized. Part of the rationale behind Brexit is that the EU lacks the requisite dynamism to wrap up quick deals. Even outside the EU, however, it is not so easy to escape entangling commitments. Under David Cameron, Britain was prepared to risk the wrath of the US in signing up to the China-led Asian Infrastructure Investment Bank. Given the importance of agreeing to a trade deal with the US, Theresa May’s government will now have to think twice before attempting such a trick.

Such realpolitik calculations give our foreign policy a 19th-century feel. On the one hand, this may be a natural turning of the historical wheel. On the other hand, since the end of the Cold War, the West has lost a narrative about itself and a vision of how the world is supposed to work. This, in part, is an intellectual problem. The post-1945 international system was built on certain assumptions that reflected the views of the Allies who triumphed in the Second World War. Chief among these was a version of historical development that held that economic and social progress would create the foundations for peace.

Many of these assumptions have been challenged in Western states by populations which reject the world-view that they imply. And they are fraying under the pressure of what the writer Pankaj Mishra, borrowing from Friedrich Nietzsche, calls the politics of ressentiment. Until a successor vision emerges for the management of global affairs, one that has a broad domestic consensus behind it, it will be our fate to deal with the moving parts – the changing alliances, porous borders and emerging threats – as they collide and splinter.

Much has been said about the internal crises draining the legitimacy of the Western elites, the ripping up of consensus and the quasi-revolutionary mood that is sweeping across nations. And yet, to an extent that has not been fully grasped, the crisis of the West has been tied to repeated failures in foreign policy.

Since the start of this century, the limits of Western power have been illustrated time and again – nowhere more so than in the Middle East. Compounding this, there has been a loss of appetite for lengthy and complicated foreign entanglements – in diplomacy as much as in war – and of the patience needed to see them through.

The Western way of war has become discredited in Afghanistan, Iraq and Libya. The fashion for counterinsurgency that characterised the past two decades partly grew out of a desire to evolve towards a more sophisticated, humane and politically palatable use of force. At its most extreme, there was talk of campaigns being won – such as when British troops were sent to Helmand Province, Afghanistan, in 2006 to wrest control from the Taliban – without a shot being fired. Even in the rare cases of success, such as the US-led “surge” in Iraq, the political and financial costs of such lengthy campaigns are unsustainable. Not before time, rusty old concepts such as “deterrence” are being given a hearing again.

Blessed with decades of relative security, we have lost the custom of thinking strategically. Having enjoyed a preponderance of force and wealth, we have failed to grasp the changing nature of power in international affairs. Since 1989, from a position of strength, the West has evangelised about its capacity for “soft power”, even attempting to quantify it as some sort of saleable commodity. Russia – a country with scandalously low life expectancy, haemorrhaging population levels and a sclerotic economy – has made a mockery of this. Moscow has not only deployed conventional “hard” power in Syria and Ukraine, but crafted its own version of “soft”, or cultural, influence using instruments such as the media groups Sputnik and RT (formerly Russia Today).

Underpinning all of this is a loss of confidence in the merits of “Western civilisation” that would have seemed odd to our forebears in 1945. It is too easily forgotten that the vision of liberal internationalism was Western in inception, and it was based on a belief in the legitimacy and superiority of the Western way of government. Although imperfections were admitted, the organising philosophy was to apply these goods – such as the rule of law and the principle of self-determination – on an international scale.

By the same token, the linkage of our domestic political contracts to the ways in which our nations behave in their relationships with others is deeper than is sometimes understood. The foundation stone of the post-1945 world order was the Atlantic Charter of 1941-42. As Elizabeth Borgwardt explains in her wonderful book, A New Deal for the World, it can be understood as a globalised version of Franklin D Roosevelt’s domestic New Deal politics and the broader conception of liberty contained therein.

It was in the same spirit that William Beveridge began his white paper of 1942 with the statement that a “revolutionary moment in the world’s history is a time for revolutions, not for patching”. In a series of newspaper articles, Beveridge interspersed advocacy of his report’s implementation on the home front with support for what later became the UN. For the generation that fought the war, the two causes – domestic political renewal and the construction of a parliament of nations – were indivisible.

As last year’s presidential election got under way, the Princeton foreign policy expert G John Ikenberry argued that Roosevelt had bequeathed the US a “centrist tradition of American world leadership”, marked by a “strong bipartisan internationalist tradition”. A radical conservative critique, he warned, was challenging “the progressive foundations of Pax Americana” by disparaging the New Deal foundations on which American internationalism was based.

There are those who would have us neatly separate the domestic and foreign into separate spheres. Yet there is a reason why a desiccated version of foreign policy realism or naked rationalism – the type of cultish obsession with the “national interest” that often emerges on the right in times of international flux – has never been pre-eminent among the West’s leading states. For the past century at least, the practice of Western foreign policy has been tied to an organising philosophy, a larger vision of how the world should work, bolstered by myth.

This required both theologising and evangelicalism in the name of universal goals. An element of “sacred drama”, as Conor Cruise O’Brien explained in his 1968 book on the United Nations, served a higher purpose. The risk has always been that sacred dramas are pushed too far – that the champions of international peace build their castles in the air, placing their faith in a vapid utopianism that evaporates at the first sign of stress. And even though the post-1945 world order has lasted for more than 70 years, many of the myths around it have run their course.

The “rules” we often talk about are conceptual and moral as much as they are legally binding. In truth, the fate of Syria shows that, when it comes to maintaining certain international standards, it is the combination of political will and power that matters. Too often, the lawyerly emphasis on rules has ignored that they are unenforceable without order. It is a lesson that many liberal internationalists have found hard to stomach, to the detriment of their project.

***

As with the Russian Revolution of 1917 or the fall of the Berlin Wall in 1989, watershed moments in international history can creep up on us without much warning. Since the Brexit vote and the election of Donald Trump, it has become fashionable to invest 2016 with a sense of great historical significance – as a year that future historians will look back on with sorrow and furrowed brows. This is premature.

A true historical sensibility should warn us against such fatalism. The Western world faces many challenges – none more pressing than its declining share of global wealth and population compared to Asia’s leading states. Yet looking to the future with trepidation should not take the form of giving in to despair. To do so is to court a self-fulfilling prophecy.

Some of the most tumultuous years in history have been spurs to acts of great forward thinking and imagination. It was in 1933 – which began with Adolf Hitler becoming the chancellor of Germany – that H G Wells published The Shape of Things to Come. Part novel and part “history” of the future, it tells an alternative story of humanity up to the year 2106. The world that the book depicts is one in which Franklin D Roosevelt fails to implement the New Deal and an economic crisis lasts for 30 years, punctuated by a second world war. (Wells predicts that it would begin in January 1940, sparked by a clash between Germany and Poland over Danzig.) There is no victor; the leading powers emerge exhausted, unable to prevent a plague that emerges in 1956 – spread by a group of baboons that escape from London Zoo – and wipes out much of the world population.

Here Wells envisages a benevolent “dictatorship of the air”, which takes shape at an international conference convened in Basra in 1965. The dictatorship goes on to attempt to eradicate the world’s leading religions, but eventually melts away a century later, making way for a peaceful humanitarian utopia in which the struggle for material existence has ended, and a society that could therefore be governed by reason. The last recorded event in the book takes place on New Year’s Day 2106, when there is a levelling of the remaining skeletons of the skyscrapers of New York.

As idiosyncratic as this may have been, it was the work of visionaries such as Wells that spurred the statesmen of the West to take hold of historical developments and to try to build a version of this world state. As much as anything, it was about ensuring the survival and adaptation of a beleaguered and near-bankrupt Western civilisation.

The first paragraph of The Shape of Things to Come describes a world experiencing a “confluence of racial, social and political destinies”. With that, “a vision of previously unsuspected possibilities opens to the human imagination”, which entails “an immense readjustment of ideas”. Civilisation, as Wells puts it in words we could heed today, is “in a race between education and catastrophe”.

Where today are the leaders or intellectuals in the West capable of offering a vision of the shape of things to come, around which their allies or populations might rally? Few front-line politicians since Tony Blair have offered a view of the world around them (however disputed that vision might have been) and of the nation’s place within it. The great foreign policy speech seems to be a relic of the recent past.

If there is anyone looking to Donald Trump’s White House for a vision of a “new world order”, they will be disappointed. In his inauguration speech, there was no softening of his line and he wasted no time in reiterating that his priority was to put “America first”. In the past 70 years, there have been few such unambiguous assertions of this creed.

In the absence of an international vision, however, the burning question is whether Trump’s foreign policy will follow a method; or, failing that, a pattern. An optimistic view of this has been ventured by the historian Niall Ferguson, who has suggested that Henry Kissinger, who is 93, has provided a script for the global rebalancing that may begin under Trump. The US president has sought Kissinger’s counsel since his election, as did Hillary Clinton in the run-up to the vote. There are rumours that Kissinger may be used in an attempt to reset relations with Russia. It was notable, too, that he was being feted in Beijing just as Trump was tweeting against China for its behaviour in relation to Taiwan and the South China Sea.

Will Trump’s foreign policy follow Kissingerian grooves, in the form of some sort of triangulation of great-power diplomacy between Moscow and Beijing? Is there a strategic rationale behind the bombast, or could one emerge?

This is a possibility but nothing more. The first few days of the new presidency do not suggest that Trump the campaigner is about to give way to a statesman with foresight. Nonetheless, it is likely that US foreign policy will settle into a recognisable rhythm over the course of the year.

President Trump’s propensity for slaying sacred cows is not shared by his nominees for secretary of state (Rex Tillerson, a former chief executive of the ExxonMobil oil company) or secretary of defence (General James Mattis). Both have stressed the importance of America’s alliance network and their belief in the importance of Nato. The same applies to Nikki Haley, the governor of South Carolina and former Trump critic, who has been chosen as ambassador to the United Nations. The US state department and the Pentagon, too, have grown used to acting in certain ways, which suggests that a revolution in American foreign policy will not occur overnight.

The most important variable in second-guessing Trump’s foreign policy is the extent to which he will seek to control it from the White House, continuing a trend of recent years, or leave his appointees to their work. If the Oval Office becomes the locus of action, the role of General Mike Flynn, Trump’s controversial national security adviser, is likely to be of growing importance.

***

For the past hundred years – but particularly since 1945 – Britain has carved out a privileged place for itself by operating in the slipstream of US foreign policy. In that time, the UK’s greatest strategic nightmare has been the prospect of an American retreat from its global responsibilities. There have been periods, as during the interwar years, in which the US preferred to mind its own affairs rather than engage in the business of world government. It is no coincidence that these were some of the most perilous years in British history.

Despite the hand-wringing that greeted Donald Trump’s victory, these habits are deeply ingrained in our diplomatic and national security establishments and cannot easily be changed. Those arguing that it is time to break from the US and seize the opportunity for a new relationship with Europe, in which Britain plays the role of security provider, are both regurgitating an old argument and presenting a false dichotomy. Likewise, the idea that the leadership of the free world has passed to Angela Merkel’s Germany is absurd.

The saga of the bust of Winston Churchill in the Oval Office – beloved by George W Bush, removed by Obama and brought back by Trump – has become a rather tired metaphor for the state of Anglo-American relations. In truth, the British delegation in Washington has engaged in catch-up since Trump’s surprise victory but there are signs that the nettle has been grasped. As 2016 drew to a close, the British ambassador to the US, Kim Darroch, held his nose to deliver a speech at a US conservative think tank, the Heritage Foundation. Speaking the language of “burden-sharing”, he announced that one of the UK’s two new super-carriers, HMS Queen Elizabeth, is scheduled to sail through the South China Sea on her maiden deployment in 2020, a restatement of the shared Anglo-American commitment to free navigation of the seas.

The Prime Minister has already managed to bump herself up the queue and is the first foreign leader to make the pilgrimage to the Trump White House. According to the Sunday Times, Bernie Sanders has expressed the hope that the UK might perform the function of a “moral conscience” in relation to the Trump administration’s foreign policy. There will be developments in US foreign policy that will be hard to stomach, on matters from the Iran nuclear deal to climate change. Equally, the stakes are now so high – on trade and security – that Britain will have to pick carefully those issues on which it dissents.

In this new world, the choice facing Britain might seem stark. On further reflection, however, it is no choice at all. A rebalancing of the international system is about to begin, involving the world’s major powers. The cosy “universalist” language to which we have grown used (and of which we are the foremost purveyors) may belong to another era. Britain can gripe from the sidelines, negotiating itself into irrelevance as curator of the old order, or do its utmost to be present at the creation of the new.

A sprinkling of H G Wells might enliven our sense that there is a future to be grasped and an opportunity to contribute to a larger vision of how the world should work. Yet there has to come a time when we draw a line under the fin de siècle angst and get on with it.

John Bew is a professor of history at King’s College London, the author of “Citizen Clem: a Biography of Attlee” (Riverrun) and a New Statesman contributing writer


This article first appeared in the 26 January 2017 issue of the New Statesman, The eclipse of the West


Starting Star Wars: How George Lucas came to create a galaxy

On the 40th anniversary of the release of the original Star Wars, George Lucas biographer James Cooray Smith shares the unlikely story of how the first film got made.

While making THX 1138 in 1970, writer/director George Lucas told composer Lalo Schifrin that he wanted to make a Flash Gordon picture, an updating of the 40s sci-fi serials that he’d enjoyed as a child. It would, however, be those serials not as they were, but as he remembered them having been. When the rights to these proved unavailable, he began to work on an original idea, hoping to create something similar, but which he would own himself.

In January 1973, after completing his 50s nostalgia picture American Graffiti but before its release, Lucas began his outline for this space adventure. The first line of this near-incomprehensible document was “The Story of Mace Windu, a revered Jedi-Bendu of Opuchi who was related to Usby CJ Thape, Padewaan learner to the famed Jedi.”

"Jedi" was a word Lucas had coined to describe a clan of warrior mystics who were essential to his story. A man whose fascination with Japanese cinema had become a general interest in Japanese cultural history, he’d named them after the branch of Japanese drama that drew moral and instructive lessons from stories set in the past – Jidai geki.

This version is set in the thirty-third century and features a teenage Princess, droids, an Evil Empire and a grizzled Jedi warrior, General Skywalker, whose plot role resembles Luke’s from the finished film, although in character he is closer to Obi-Wan Kenobi. It climaxes with a space dogfight and ends with a medal ceremony. Among the planets named are Alderaan (here the Imperial capital) and Yavin, at this point the Wookiee homeworld. Some characters from this draft (Valorum, Mace Windu) would eventually find a home in The Phantom Menace more than twenty years later.

By May Lucas had a 132-page script, The Adventure of Anikin Starkiller. Skywalker had acquired the forename Luke but was no longer the protagonist. This was Anikin (sic) Starkiller, one of the sons of General Skywalker’s old comrade, the partially mechanical renegade Kane Starkiller. Anikin had to protect a Princess, aided by two robots, R2-D2 and C-3PO.

Lucas had worked backwards from Flash Gordon, looking to uncover the source of the character’s appeal, hoping to transfer it to his own story. Once he’d worked his way through the comic strips of Gordon’s creator Alex Raymond, he tackled Edgar Rice Burroughs, Jules Verne and Edwin Arnold’s Gulliver on Mars. Conversations with his New Hollywood peers about the archetypes thrown up by his reading – and which he increasingly saw everywhere – brought him into contact with Joseph Campbell’s then newly published Myths to Live By (1972), an anthology of lectures and essays from a man who had devoted his career to identifying the basic archetypal characters and situations that he felt underpinned all human mythologies.

"The book began to focus what I had already been doing intuitively" Lucas later said, an idea which seemed to him to itself reinforce Campbell’s contention that such archetypes and situations dwelled in a collective unconsciousness. Lucas expanded his reading to epics of all kinds, and began planning a visual style that would combine the vistas of Japanese master director Akira Kurosawa with the kind of static-camera realism which he’d used on American Graffiti.

Lucas wanted over-exposed colours and lots of shadows, but shot in a way that made them seem unremarkable. Seeing the Apollo missions return from the moon "littered with weightless candy bar wrappers and old Tang jars, no more exotic than the family station wagon" had illustrated to him the problem with every fantasy movie ever made: their worlds never looked like people lived in them. His film would depict a "used future". Describing the aesthetic he'd sought to American Cinematographer, he explained: "I wanted the seeming contradiction of…fantasy combined with the feel of a documentary." To Lucas, Star Wars wasn't science fiction; it was "documentary fantasy".

There was only one studio executive Lucas thought had any hope of understanding what he was trying to do: Fox's Alan Ladd Jr, son of the late actor. Like Lucas and his contemporaries in New Hollywood, Ladd was a man driven by a love of cinema. Lucas could communicate with him through a shared vocabulary, describe a planned scene as being like something from The Searchers (John Ford, 1956) or Fahrenheit 451 (François Truffaut, 1966) and be understood. Ten days after his presentation to Ladd, they signed a development deal. Fox agreed to pay Lucas $15,000 to develop a script, plus $50,000 to write the movie and another $100,000 to direct it, should it actually be made. American Graffiti associate producer Gary Kurtz was named as Producer for Star Wars, and received $50,000.

The script development money gave Lucas enough to live on whilst he continued work on the screenplay. As he did so it changed again; a ‘Kiber Crystal’ was written in and then written out. Skywalker became Deak Starkiller’s overweight younger brother before becoming the farm boy familiar from the finished film. Characters swapped names and roles. A new character named Darth Vader – sometimes a rogue Jedi, sometimes a member of the rival ‘Knights of Sith’ – grew in importance across subsequent drafts: some killed him during the explosion of the Death Star, others allowed him to survive. Some previously major characters disappeared altogether, pushed into a "backstory", as Lucas chose to develop the practically realisable aspects of his story.

This is an important clarification to the idea that Star Wars was "always" a part of a larger saga, one later incarnated in its sequels and prequels. That’s true, but not in an absolutely literal way. Star Wars itself isn’t an excerpted chunk of a vast plotline, the rest of which was then made over the next few decades. It’s a distillation of as much of a vast, abstract, unfinished epic as could be pitched as a fairly cheap film to be shot using the technology of the mid-1970s. And even then, much of the equipment used to make the film would literally have to be invented by Lucas and his crew during production.

In August 1973 Graffiti was released and became a box office sensation. Not only did the profits make Lucas rich (he became, at 29, a millionaire literally overnight); its success also meant that Lucas was able to renegotiate the terms of his Fox deal. Rather than making demands in the traditional arenas of salary and percentages, Lucas wanted control of the music, sequel and merchandising rights to his creations. Fox, unaware of its potential value to them, conceded him 60 per cent of the merchandising, and eventually agreed that Lucas’s share would rise by 20 per cent a year for two years after the film’s release. Few films made money from spin-off products for a whole 24 months, and Star Wars would surely be no different. Lucas got the sequel rights as well, albeit with the proviso that any sequel had to be in production within two years of the film’s release or all rights would revert to Fox.

Most important amongst Lucas’ demands was that, if the film went ahead, it be made by his own company, not by Fox. That way he could control the budget and ensure all charges and costs made to the production were legitimately spent on the film. The experience of watching Mackenna’s Gold being made while a student on placement a decade earlier had taught him just how much money a studio could waste, and on a film like Star Wars – which was both ambitious and would inevitably be under-budgeted – it was crucial that this did not happen. Control of the music rights also had a sound reason behind it: Universal were making a fortune out of an American Graffiti soundtrack that was simply a repackaging of old hits featured in the movie. Lucas saw nothing of these profits, despite having selected the tracks featured and fought long and hard for their inclusion in his film.

In March 1975, Ladd took Lucas’ draft to the Fox board, which approved it and budgeted the film at $8.5m. Characters bounced in and out of that script right up to the preparation of the shooting draft, dated 15 January 1976. This was tailored to come as close to the film’s proposed budget as possible, while containing as many of the ideas, characters and situations Lucas had spent the past few years developing as he considered feasible.

This draft is the first version of the script in which Kenobi dies fighting Vader. Previously he had been injured, but escaped with Luke’s party. Alec Guinness, who had already been cast, was initially unhappy with this change, but was persuaded by Lucas that a heroic death followed by appearances as a spectral voice would prove more memorable to audiences than his spending the last third of the film sitting on Yavin whilst the X-Wings went into battle.

Filming began on location in Tozeur, Tunisia on 22 March 1976. Before shooting Lucas sat his crew down and made them watch four films which he felt between them defined what he was after in Star Wars: Stanley Kubrick’s 2001: A Space Odyssey (1968), Douglas Trumbull’s Silent Running (1972), Sergio Leone’s Once Upon a Time in the West (1968) and Fellini’s Satyricon (1969). The Leone picture was full of the sun-blasted vistas Lucas wanted to evoke for Tatooine, while the Fellini film, with its aspects of travelogue and its attempt to portray an entire society in a fly-on-the-wall manner, gave an idea of the "documentary fantasy" approach the director was so keen on. All four films shared one vital element: they’re windows onto lived-in worlds, remarkable to audiences but regarded as ordinary by the films’ characters.

The first scenes shot for Star Wars were those of Luke buying Artoo and Threepio from the Jawas outside his foster parents’ home. Producer Kurtz had allowed 11 days for the shoot, after which a borrowed army C-130 Hercules was scheduled to pick up the cast and crew.

A few days into shooting, creature make-up man Stuart Freeborn was taken ill and had to be flown back to Britain, where he was diagnosed with pneumonia. Other crew members contracted dysentery. On 26 March Tunisia experienced its first winter rainstorm for half a century, damaging equipment and exterior sets and delaying the filming of key scenes.

Lucas wanted the stormtroopers to ride ‘dewbacks’, dinosaur-like domesticated beasts that allowed the troops to move across the desert. One dewback was built out of foam rubber stretched over a wire frame; it could only be used in the background, and no one was ever seen riding one. The other live animal Lucas wanted to portray was a Bantha, a huge horned, shaggy beast reminiscent of a prehistoric mammoth. It was to be the mode of transport for the Tusken Raiders, faintly Bedouin, vaguely mechanically-enhanced humanoids who attacked Luke in the Jundland wastes. In the end, creating the beasts proved impossible, and while they were referred to in dialogue in scenes that were shot (‘bantha tracks…’), none of their sequences were lensed.

As hard as the shoot was on Lucas, he at least had an idea of what he was trying to do and how it would all fit together. The actors, suffering stomach troubles, sunburn and long days, were less clear. Anthony Daniels, trapped inside an almost immovable fibreglass body suit, suffered the worst. Twenty-five years later he would credit Alec Guinness with helping him get through the Tunisia filming: "He was incredibly kind to me…I firmly believe that I wouldn’t have completed that arduous task of shooting without him."

Once the Tunisian shoot was over, the cast moved to EMI Elstree Studios in Borehamwood, outside London. Star Wars was being made in the UK because it wasn’t possible to shoot the film in Hollywood at that time – not that Lucas, with his lifelong disdain of LA itself, wanted to anyway. Star Wars required nine stages simultaneously, something no Hollywood studio complex could guarantee at anything like sufficient notice. In March 1975 producer Kurtz had flown to Italy to look at studio space, but found nothing suitable. He then caught a plane to London, where Lucas joined him.

Together they scouted UK film studios. Pinewood was a possibility, but management insisted Lucasfilm hire their technicians, a condition which became a deal-breaker. Neither Shepperton nor Twickenham had enough sound stages (although the giant Stage H at Shepperton – bigger than any stage at Elstree – would ultimately house one scene of the film), which left only EMI Elstree. Then losing £1 million a year, Elstree was being kept open more or less on the insistence of Harold Wilson’s government, whose allies in the trades union movement considered the closing of the facility unconscionable. Elstree had no staff, and anyone who wished to rent it had to supply their own technicians and much of their own equipment. Off-putting to many, this sealed the deal for Lucas and Kurtz, who wanted to move their own people in. They hired the facility for seventeen weeks starting at the beginning of March 1976.

To design and build the sets needed to turn Elstree into a realisation of Lucas’s screenplay, they hired John Barry, a British designer who had worked under Ken Adam on Barry Lyndon (Stanley Kubrick, 1975), a film Lucas admired enough to hire its costumier John Mollo as well.

Elstree’s two largest stages were given over to Mos Eisley Spaceport and the interior of the Death Star. Both the Mos Eisley hangar bay and the Death Star hangar which replaced it on the same stage were constructed around the full-size Millennium Falcon set created by John Barry’s protégé Norman Reynolds. Built by naval engineers at Pembroke Dock, Wales, it was 65 feet in diameter, 16 feet high and 80 feet long, and weighed 23 tonnes.

The absence of Stuart Freeborn, still recovering from Tunisia, meant that most of the aliens seen in the Mos Eisley cantina sequence were completed by assistants and lacked any articulation at all. Unhappy with the scenes as shot, Lucas resolved to do re-shoots back in the USA.

The last scenes to be shot were for the opening battle, as Vader and his stormtroopers boarded the blockade runner. With little time, Lucas used six cameras, manning one himself (Kurtz manned another), and shot the sequence in two takes. The six cameras produced so many different perspectives on the action that even the duplicated events in the finished film go unnoticed. The sequence, chaotic though its creation was, is amongst the best-assembled moments in the movie and a superb evocation of Lucas’ documentary fantasy approach, the cameras darting in and out of the action like reporters shooting newsreel footage. Virtually the first live action seen in the picture, its style went a long way towards convincing audiences that what they were seeing was somehow real.

Principal photography was completed on 16 July 1976, although some re-shoots and pick-up shots for the Tatooine sequences were undertaken in Yuma, Arizona in early 1977. Amongst the scenes shot were those featuring the Banthas: Lucas borrowed a trained elephant from Marine World and had it dressed to resemble a more hirsute, fearsome pachyderm. Mark Hamill was unavailable to participate; he’d crashed his car on the Antelope Freeway in LA shortly before and was undergoing painful facial reconstructive surgery. Hamill should have appeared in the re-shot scenes of Luke’s landspeeder moving across the desert, but Lucas had no choice but to film them without him; he took a double to the shoot, dressed him in Luke’s costume and put Threepio in the foreground. Also re-shot, over two days in La Brea, California, were portions of the cantina sequence: new cutaways and background shots were filmed to be inserted into the Elstree footage in order to eliminate as many of the unsatisfactory masks as possible.

While supervising editing of the film Lucas experienced chest pains, and was rushed to hospital where he was treated for a suspected heart attack. He was later diagnosed with hypertension and exhaustion, both exacerbated by his diabetes.

Fox were by now trying to book Star Wars into cinemas, and had picked a release date in May, well before the 4 July public holiday traditionally regarded as the opening weekend of summer. Fox wanted $10m in advance bookings for Star Wars, desperate to recoup an investment that internal studio sources had now decided was foolish. They secured less than $2m, and achieved even that only by implying to theatres that they wouldn’t be offered Charles Jarrott’s much-anticipated The Other Side of Midnight if they didn’t sign up for Star Wars too. Before its release several exhibitors complained about this "block booking" and filed suits; Fox was later fined $25,000 for the practice, punished for forcing cinemas to agree to show what was, by the time the fine was paid, the most financially successful movie ever made.

In early 1977 Lucas screened Star Wars for a group of friends. It was nearly finished, although the opening crawl was longer and many of the special-effects shots were absent, represented instead by sequences from World War II films and real combat footage shot by the USAF. Among those present were Brian De Palma, Alan Ladd Jr, Steven Spielberg and Jay Cocks. Martin Scorsese had been invited, but trouble editing his own New York, New York meant he didn’t turn up.

De Palma hated Star Wars, and spent the post-screening dinner rubbishing it to anyone who would listen. Others present were unsurprised: De Palma had behaved the same way during the group screening of Scorsese’s Taxi Driver, laughing loudly through Cybill Shepherd’s conversations with Robert De Niro and at one point shouting "Shit!" halfway through a tense scene. Only Spielberg seemed impressed, and told Lucas that he thought Star Wars would take $100m. Lucas pointed out that nothing took $100m, and countered that Spielberg’s Close Encounters of the Third Kind would do better at the box office. The two directors wrote what each considered a realistic estimate of what the other’s film would make in its first six months of release on the inside of matchbooks, which they then traded. By the time Lucas got round to opening Spielberg’s matchbook and saw the figure $33m in his friend’s scrawling hand, Star Wars had already made ten times that.

Odd as it seems now, when every blockbuster is prefaced by months of breathless, unrelenting media "enthusiasm", Star Wars wasn’t released on a wave of hype or accompanied by an extensive marketing campaign. It was released (on 25 May 1977) to thirty-two screens, after a barely publicised premiere at Mann’s Chinese Theatre in Hollywood. It made $2.8m in its opening week, but didn’t receive a nationwide release for two months. Despite almost unprecedented success in preview screenings, Fox were still unsure what to do with Lucas’ bizarre children’s film. Indeed, it only got a Hollywood opening at all because William Friedkin’s Sorcerer – which had been intended for this slot at Mann’s – wasn’t finished.

So negative had advance feeling about Star Wars been that Lucas left the country. He was still in LA on opening day, finishing the sound edit (he was unhappy with the copy playing downtown, and unknowingly embarking on a lifetime of revising his movie), but the next day he and his wife (and Star Wars film editor) Marcia flew to Hawaii, where they were joined by friends, including Spielberg and Amy Irving. It was an attempt to escape what Lucas felt would be the inevitable terrible reviews and the wrath of the studio. Even when Ladd called him to share his excitement over the movie’s colossal opening weekend, Lucas was unmoved; he reasoned that all movies labelled science fiction did well in their first few days, thanks to the business attracted by the neglected fanbase for such things. Only when the film continued to do outstanding business and was expanded to more and more theatres did Lucas consider returning early from his holiday, and begin to realise that the film he’d just delivered had changed his life.

As Star Wars expanded into more cinemas, and people began to queue round the block to see it, shares in Fox climbed from well under $10 to $11.50 each; over the next three months they rose to $24.62, nearly trebling in price, such was the film’s value to the embattled studio. It was a magnificent vindication for Alan Ladd Jr, who had more than once had to intervene to stop colleagues closing down the film’s production completely. He had never lost faith in Lucas and his bizarre idea, but he was virtually the only person employed by Fox itself who hadn’t.

Just a few weeks before, as the end of the financial year approached, Fox had tried, and failed, to sell its investment in Star Wars to a German merchant bank as an emergency pre-tax write off.
