Jury Nullification: The Short History of a Little Understood Power

Richard Marshall is a PhD student in History at the University of Plymouth. His doctoral research explores the place of trial by jury in the politics, culture and society of late eighteenth-century English radicalism. He is supervised by Dr James Gregory and Dr Claire Fitzpatrick.

Keywords: Juries, Nullification, Legal History, Popular Justice, Repression, Liberties, Rights

Jury nullification (or jury equity) is perhaps the greatest safeguard against unjust laws or excessive punishment to exist in Britain. It is the practice whereby a jury delivers a verdict contrary to the evidence, the law and judicial directions, acquitting a defendant they believe to be guilty beyond a reasonable doubt by the letter of the law, but who, on grounds of conscience, they think should not be punished.

This little understood power has existed since a 1670 ruling declaring that no juror could be punished for a ‘wrong’ verdict, which, when coupled with the double jeopardy rule, leaves the door ajar for juries essentially to override laws.

For centuries it was considered a critical element of the constitution: a check against tyrannical government and unfair laws, but above all a mechanism that permitted the people to directly influence, comment upon and force reform of laws they deemed morally suspect. What’s more, it was widely understood and discussed as a normal part of the legal process. The idea that a jury of ‘freeborn Englishmen’ could not deviate from a judge’s charge or the letter of the law was almost universally rejected. Indeed, it was not uncommon for jurors to be encouraged to ignore or reject judicial guidance and actively consider not just the evidence but the interests of their community, religion, nation, and constitution.[1] When an English juror entered the jury box in the eighteenth century, he was expected to act for his country, and discretion was the norm. Granted, not everyone accepted the idea, but none denied the existence of nullification nor the rights and powers of jurors as the ultimate arbiters of justice. Most today, though, will never have heard of this power. For too many in the modern legal fraternity nullification is a dirty word, with a myriad of objections raised against it.[2] Most notably, they argue it is ineffective at procuring meaningful change and encourages prejudice among jurors.

I was brought to think about the history of this murky power by the recently introduced Police, Crime, Sentencing and Courts Bill, a politically motivated attempt to attack the freedoms of protest and speech and the rights of the traveller community all at once. Regarding protest, it seeks to lower the legal test required for police to act against otherwise legitimate protest, its most egregious element being the new offence of ‘public nuisance’. This will carry a maximum ten-year prison term and is defined as causing ‘serious harm’ to the public through ‘serious annoyance, serious inconvenience or serious loss of amenity’. Meanwhile, the Bill also proposes to criminalise trespass, an effort to attack the freedoms of the traveller community opposed by the majority of policing bodies, including both the National Police Chiefs’ Council and the Association of Police and Crime Commissioners.

The backlash has been immense. Politicians, lawyers, civil rights organisations, charities and many others have spoken out against the Bill, as have many thousands in mass demonstrations across the nation. It would not be unfair to suggest that a significant minority, perhaps more, oppose the provisions of this Bill as infringements on liberty. This was the very sort of legislation radicals of the past would have hoped and indeed encouraged jurors to nullify.

But does this mean we should today? For me, the answer is an unquestionable yes. Nullification exists for just these circumstances, to frustrate legislation passed or enforced spuriously or for political reasons. And the past provides plentiful justifications and precedents for its use.

In the first instance, our national history is littered with cases where juries, in spite of evidence and judicial direction, acquitted defendants to the benefit of society. Arguably the most well-known is that of Clive Ponting, the British civil servant who leaked documents relating to the sinking of the General Belgrano during the Falklands Conflict. His trial for allegedly breaching the Official Secrets Act was a cause célèbre, and his unexpected acquittal, contrary to the judge’s direction, caused enough consternation that in 1989 the Conservative government removed from Official Secrets legislation the ‘Public Interest Defence’ on which Ponting had relied.

But the Ponting case is an outlier. For one, the acquittal had a basis in law thanks to the Public Interest Defence. A true act of nullification occurs where the law does not provide an adequate escape route and its provisions are thus arbitrary. The new Policing Bill promises to be such a law. To justify employing nullification against such an Act, there are myriad examples where jurors have acquitted in the face of such legislation. Take for instance the case which established the practice: the trial of two Quakers, William Penn and William Mead, in 1670.

The pair were charged under the Conventicle Act, which restricted non-Anglicans to meetings of no more than five people. Mead and Penn were arrested while preaching to several hundred.

Despite overwhelming evidence and threats from the judge, the jury refused to convict, believing the law to be morally wrong. The subsequent legal ruling that no juror could be punished merely for their decision still reverberates today and is even commemorated at the Old Bailey. This commemoration strikes me as almost a burlesque, given the staunch opposition of the courts and the Crown to jurors being permitted to understand the ruling’s ramifications.


Bibliography

Primary Sources

Hawles, J., The Englishman’s Right: A Dialogue between a Barrister at Law and a Juryman &c. (London, 1680).


Secondary Sources

Darbyshire, P., ‘The Lamp that Shows that Freedom Lives—Is It Worth the Candle?’ Criminal Law Review (1991), pp. 740–752.

Handler, P., ‘Forgery and the End of the “Bloody Code” in Early Nineteenth-Century England’, Historical Journal, 48/3 (2005), pp. 683–702.

Harling, P., ‘The Law of Libel and the Limits of Repression, 1790–1832’, Historical Journal, 44/1 (2001), pp. 107–134.

McGowen, R., ‘From Pillory to Gallows: The Punishment of Forgery in the Age of the Financial Revolution’, Past and Present, 165 (1999), pp. 107–140.

Wharam, A., The Treason Trials, 1794 (Leicester, 1992).

Notes

[1] Guides for jurors often encouraged them to remain independent of the judge and emphasised that they represented their fellow citizens in court. A powerful and frequently reprinted example was J. Hawles, The Englishman’s Right: A Dialogue between a Barrister at Law and a Juryman &c. (London, 1680).

[2] See P. Darbyshire, ‘The Lamp that Shows that Freedom Lives—Is It Worth the Candle?’ Criminal Law Review (1991), pp. 740–752.

Nordic Studies in 2021: When Vikings Raid Real Life, Our Good Intentions Get Pillaged

Beth Rogers is a PhD student at the University of Iceland in Reykjavík, Iceland, studying topics of food history and medieval Icelandic culture for her thesis, “On with the Butter: The Cultural Significance of Dairy Products in Medieval Iceland.” The project is hosted by the Institute of History at the Centre for Research in the Humanities.

Beth has written more than 30 popular and academic articles, including two book chapters, on such varied topics as Viking dairy culture, salt in the Viking Age and medieval period, food as a motif in the Russian Primary Chronicle and the literary structure of Völsunga saga. Her other research interests include medieval literature (especially sagas), military history, emotions in literature, Old Norse mythology and folklore, and cultural memory.

The impact and degree of white supremacist appropriation of Nordic culture has recently been the cause of public interest and scholarly debate in Scandinavian Studies. Viking and medieval imagery was displayed, for example, by participants in the Unite the Right rally in Charlottesville, Virginia, in 2017. But the most prominent display came during the violent attack on the United States Congress at the U.S. Capitol on January 6, 2021, when these images re-emerged, most visibly in the form of tattoos and clothing worn by Jake Angeli (real name Jacob Chansley), the self-described ‘QAnon Shaman’. Angeli was instantly recognisable, wearing furs and horns, his face painted in red, white, and blue, while his bare chest blazed with black lines of Yggdrasil, Mjölnir, and the Valknut. News outlets leaped to provide context to this oddball stand-out among the mob of Americans angry at the outcome of the presidential election of November 2020, explaining the meaning of his tattoos for readers who had not seen them before or did not know of their associations with Norse mythology. The media attempted to clarify that Angeli was not part of the Antifa or Black Lives Matter movements, but of QAnon, a political and social conspiracy group which has gained prominence since its first appearance on internet message boards in October 2017. Neither the Antifa nor the BLM movement is known for using any Nordic cultural symbols, yet in the immediate aftermath of the attack on the US Capitol, confusing claims that Angeli was actually part of the BLM movement spread quickly on social media.

[Image: social media graphic circulated heavily in the days following the attack on the US Capitol. Origin unknown; a Google Reverse Image Search returned no information.]

Angeli’s interviews with BrieAnna J. Frank, a reporter for The Arizona Republic, leave little doubt as to his right-wing ideological leanings. His frequent appearances at events carrying a sign stating ‘Q sent me!’ further confused the issue. According to the BBC, QAnon is ‘a wide-ranging, completely unfounded theory that says that President Trump is waging a secret war against elite Satan-worshipping paedophiles in government, business and the media.’

Angeli himself, currently awaiting trial in federal prison (where he has experienced problems with the lack of organic food), has expressed regret over his actions. In an interview with US news program 60 Minutes, Angeli spoke from jail – in an unsanctioned interview which resulted in a ‘scolding’ for his lawyer from a judge – and insisted: ‘I regret entering that building. I regret entering that building with every fibre of my being’ (0:43 – 0:47). His actions ‘were not an attack on [the United States]’, Angeli insisted. Instead,

I sang a song, and that’s a part of Shamanism. It’s about creating positive vibrations in a sacred chamber. I also stopped people from stealing and vandalising that sacred space – the Senate. I also said a prayer in that sacred chamber because it was my intention to bring divinity and to bring God back into the Senate. […] That is the one very serious regret that I have, was [sic] believing that when we were waved in by police officers, that it was acceptable. (0:39 – 0:46)

Angeli awaits trial on six counts of misconduct, including violent entry and disorderly conduct in a Capitol building, as well as demonstrating in a Capitol building.

Angeli’s appropriation of Nordic symbols is of course part of a broader Viking cultural renaissance, but don’t let all this take away from your enjoyment of the current Viking-themed pop culture extravaganza. Vikings are fun! It’s not all white supremacy. It’s wonderful to see people deeply interested and invested in the thrills, chills, twists and turns of these larger-than-life characters on our screens, set against a backdrop of Nordic culture and history that is sometimes richly coloured and always sketched in familiar lines: struggle, sacrifice, and hope. The image of the Viking in pop culture today is so unquestioned – hairy, violent, marauding – that The Guardian can’t suffer through so much as a paragraph of historical context without getting distracted by their coolness. Dr Simon Trafford, Lecturer in Medieval History and Director of Studies at the University of London, explains the Viking attraction:

The parallels with what we look for in our rock stars are just too obvious. The Vikings were uproarious and anti-authoritarian, but with a warrior code that values honour and loyalty. Those are evergreen themes, promising human experiences greater than what Monday morning in the office can provide.

If you caught American Gods in either series form or its original novel (published 2001; series premiere on American cable network Starz in 2017), you know that Vikings are dull-witted, filthy murderers who would slaughter their own friends and family members in frenzied sacrifice to Óðinn, then wait for the wind to return to their sails, leaving very few left alive to row the longship home. If you’re a fan of Vikings (2013-2020, with a planned spin-off series titled – what else? – Vikings: Valhalla), you know that Vikings are the rock stars of history, wearing copious amounts of leather and guyliner artfully smudged around their piercing eyes as they gaze out to sea, bursting with manly intensity. You know. But do you really?

More problematic is when these tropes, images, and signifiers are bound up with darker, more nebulous, and disturbing parts of history, and when that history is forgotten, covered up, manipulated, or even wilfully ignored in the current moment. The tropes, images and signifiers of a culture which are chosen and carried forward in time take on a life of their own, often changing their meaning drastically.

Historians and armchair enthusiasts, pagans and reenactors, artists and others around the world who enjoy learning about Nordic culture and Scandinavian history groaned in unison after the Capitol invasion, aware that the United States was bringing us another fight, so soon after the dust had settled over the last one. In Charlottesville, Virginia, on August 11–12, 2017, far-right groups, including self-identified members of the alt-right, white nationalists, neo-Nazis, Klansmen, and various right-wing militias, gathered to present a unified and radical right-wing front, as well as to protest the proposed removal of the statue of General Robert E. Lee from Charlottesville’s former Lee Park. Symbols like the óðal rune, the cross of the Knights Templar and the black eagle of St. Maurice, among others, were splashed across the screens of horrified viewers. After the initial shock over the cultural clash in Virginia, which, like the attack on the Capitol, brought about death and injury, those who spend their lives plumbing the mysteries of history were left to pick up the pieces and decide what to do to avoid being painted with the same swastika.

What has been observed in the social media dissemination and discussion of Nordic cultural symbols illustrates that the general public has at best an incomplete understanding of the use of Viking symbology in connection with the German nationalist movements of the eighteenth and nineteenth centuries, which culminated in two destructive World Wars. Markers of Nordic culture have a tendency to recur throughout history, from their origins in the Viking Age through to the twentieth century and the present day: specifically, Valhalla, Vínland, the Valknut, Mjölnir (Thor’s Hammer), Yggdrasil (The World Tree), and runic inscriptions, including rune-like magical staves such as Vegvísir and the Ægishjálmur. Such iconography has been deployed almost randomly (and therefore meaninglessly) to create a connection between Viking culture and an ideology of whiteness, masculinity, and power.

Recently, a scandal erupted in the hallowed halls of the Academy over the correct next steps to take: how to continue to do what we love as researchers and teachers, but also speak to a wider community and to political developments causing direct harm? Differences in opinion led to social media chaos, accusations of doxxing, threats and scathing blog posts by the two frontrunners in the debate. Dorothy Kim, an Assistant Professor of English at Vassar College, and Rachel Fulton Brown, an Associate Professor at the University of Chicago, squared off in a nerdy, gladiatorial smackdown. As Inside Higher Ed noted, Fulton Brown agreed that white supremacists often use medieval imagery to invoke a mythical, purely white medieval Europe. However, she disagreed with Kim’s assertion that white professors needed to explicitly state anti-white supremacist positions in the classroom. For Fulton Brown, the teaching of history by itself, immersion in its concepts and understanding of changes over time, will stem the tide of white supremacist misuse and misunderstanding. Medievalists unhappy with the handling of the issues by some institutions boycotted conferences.

As the debate raged on, white supremacy continued its dark work. A mass shooter in Christchurch, New Zealand, posted ‘See you in Valhalla’ before killing 51 people at two mosques and injuring dozens more. Educational institutions have not, and still do not, appear to be doing enough. The Southern Poverty Law Center, which monitors hate groups throughout the US, tracked 838 hate groups across the country in 2020, a figure down from its all-time high in 2018 but reflecting a 55% surge in the number of US hate groups since 2017. Political elections have put an increasing number of populist, nationalist, and right-wing figures in office throughout Europe, spurred by rising anti-immigration sentiment, frustration with the political status quo, concerns about globalization, and fears over the loss of national identity. The issue has become so muddled that some educational material must make clear that although a given Nordic cultural symbol, such as Thor’s hammer (Mjölnir), can be a hate symbol, it is also commonly used by non-racist neo-pagans and others, and so it should be carefully judged within its context before the viewer assumes the one wearing it to be a member of a hate group. Nothing is black and white. Everything is uncertain.

Instructors and teachers have recommitted to doing better, echoing statements like that of Natalie Van Deusen, Associate Professor at the University of Alberta. In her own classes, Van Deusen makes a deliberate effort to highlight the flourishing ethnic and cultural exchange among Nordic people of the periods she covers, mainly the late eighth to early eleventh centuries. She includes in her teaching lesser-seen and -heard viewpoints, such as Norse relationships with the Sámi, the indigenous peoples of northern Scandinavia, or trade with the East:

I strive to teach in a way that doesn’t solely focus on Norse-speaking peoples, who were by no means the only ones to occupy the Nordic region during this period, nor were they without influence from surrounding cultures. 

We have to do more, go further, explore deeper, and keep talking about this until there is no question where we stand, as individual scholars, or as people within our communities who care about accuracy (as far as it can be established), diversity (as much as the evidence supports), and education. Always education. 

Dr. Van Deusen remains more committed than ever to keeping the conversation going, saying in a recent interview, ‘I think it’s a willingness to talk when people want to talk to us about these things, and a willingness (as scholars and educators of this period) to acknowledge that this is a real issue.’ For Van Deusen, at this point,

[W]e can’t not address it, and the last thing I would want is for someone to be in my class for the wrong reasons and twist my words because I didn’t explicitly say “I’m not here for validating these interpretations” – which I do now, at the beginning of each term.

This is why, through the pages of The MHR, you’ll hear more from me soon. Drawing on a range of evidence, from modern news to textual and archaeological evidence, my colleagues and I will examine the ways in which Viking culture has been and is manipulated, used, and misrepresented by those who seek to create an underlying continuity, real or imagined, stretching directly back to the people of the past known as the Vikings.

So, hold on to your butts! Like the blurry outline of a longship on the horizon, I shall return!


The Problem with Prison – From an Academic Who’s Been There

Gary F. Fisher is an inter-disciplinary teacher and researcher in the liberal arts tradition. He received his doctorate in Classics from the University of Nottingham in 2020 and has published research on a variety of subjects, ranging from the history of education to twentieth-century travel literature. He is currently employed by Lincoln College.

The National Records of Scotland recently released their much-delayed statistics concerning the number of drug-related deaths that occurred during 2019. They revealed a continued rise to a record high, firmly cementing Scotland as the drug-deaths capital of Europe. This, combined with the fact that Scotland also boasts the largest prison population per capita in Western Europe, has prompted a slew of articles proposing various types of reform. So far, so familiar. ‘Prisons in this country are a mess’ is one of the few sentiments that appears to be shared on both sides of the political aisle in the UK. It is a sentiment that you can find in both the New Statesman and The Spectator. As the Oxford History of the Prison has noted, it is a sentiment that has persisted for over two centuries, at least since the prison reformer John Howard roundly criticised the condition of eighteenth-century Britain’s criminal justice system in his 1777 The State of the Prisons.

That, sadly, is where the unity ends. While Britons can collectively agree that they are not happy with the state of our prisons, we differ wildly in our proposed solutions. More than that, we cannot even agree on precisely what the problem is. On the one hand, there are those who believe prisons have become too soft, that they are limp-wristed holiday camps in which society’s foulest enter through a revolving door to be waited on hand and foot before being released all too quickly upon an unsuspecting public. On the other hand, there are those who view prisons as brutal and draconian institutions, in which cruel and oppressive restrictions on individuals’ human rights serve to entrench and reinforce criminal behaviour to no palpable social benefit. The former will question what punishment or deterrent is offered by giving potentially violent and dangerous criminals free access to entertainment resources and educational courses, luxuries that law-abiding citizens would only be able to access at personal expense. The latter will appeal to examples of ‘humane incarceration’, typically exhibited in Scandinavian countries, and cite the reduced rates of re-offending associated with a more rehabilitative approach. Representatives of these two broad camps regularly spar on the airwaves, yet these clashes rarely serve to advance the dialogue. One can watch two daytime television debates entitled ‘Are we too soft on prisoners?’, one from ten years ago and one from only last year, and see almost exactly the same points and rebuttals being made, with no significant innovations in reasoning having been made in the intervening decade. Neither side will engage with, let alone be convinced by, the arguments of their opponents. In fact, it’s questionable whether they even hear each other.

Courting one of these two sides has become something of a prerequisite for elected office in the UK. Upon his election in 2019, Boris Johnson immediately sought to placate the ‘tough on crime’ crowd by introducing reforms to expand sentence durations and prison numbers. On the other side of the debate, since her election as Scottish First Minister in 2014, Nicola Sturgeon has consistently cultivated favour amongst those who favour a progressive approach to criminal justice by introducing reforms to move the focus of Scottish criminal justice away from punishment and towards rehabilitation. This politicisation of debate has hardly helped matters and, frustrated by the fruitlessness of dialogue on the subject, one criminal defence lawyer, Iain Smith, recently penned a plea in Scottish Legal News. Smith implored legislators to ‘move beyond tokenistic, meaningless terms like being “hard” or “soft” on crime’, and instead adopt a ‘smart’ approach that focuses on achieving the goal of reducing offending through solutions that occur outside the walls of a prison. As well intentioned as Smith’s plea may be, it seems unlikely to gain traction.

Highly conscious of this fatalistic public attitude towards the prison system, I first passed through the gates of a category C men’s prison at the beginning of 2020. I should stress that I entered voluntarily, joining the staff as the manager of the prison’s library. It was not long into my career as the librarian that I found that the same two broad camps that occupied public dialogue had carried through into the four walls of the prison. There were those staff who pined for the days of ‘proper prison’ and suggested that the modern officer ought to be renamed the ‘custody butler’ for the amount of waiting on their charges that they were increasingly expected to perform. Drawn against them were those who embraced the sector’s increased emphasis on supporting rather than simply securing their residents, and quietly derided those individuals they believed to be insufficiently ‘pro-prisoner’ in the execution of their duties. The incompatibility of these two schools was driven home to me when the library found itself in possession of several copies of a workout guide that could be completed in a cell with no equipment. One of my colleagues was excited by the prospect of distributing this to the men in their cells so that they could continue to exercise while the ongoing pandemic limited their yard time. The other asked, ‘Why would we want to help them get stronger?’

This was more than a simple difference in methods: it was a disagreement as to what the fundamental goal of prisons actually was. To some, their goal was to punish, and anything other than Dickensian horror would represent a betrayal of that goal. To others, their aim was to reform and support, and any interruption to their counselling sessions and wellbeing workshops constituted a frustration of that aim. No wonder these two sides are unreceptive to each other’s arguments: they’re arguing from different premises. They don’t merely disagree on what is wrong with prisons, they disagree on what a ‘correct’ prison would look like.

The reality is that the punitive and rehabilitative approaches, while not necessarily mutually exclusive, are at the very least mutually counter-productive. It is somewhere between hard and impossible to adopt one without undermining the other. During my time in the prison library, I witnessed how these two approaches played against each other, with the prisoners caught in the middle. To illuminate this, consider a few examples. Upon being sentenced, a newly convicted offender will be transported to their holding facility in a secure transport vehicle, colloquially known as a ‘sweatbox’. Inside the sweatbox the offender is crammed into a tiny cubicle smaller than an airplane bathroom. So small is this space that taller offenders are unable to properly sit down, instead having to half-stand, half-sit for the duration of their indeterminately long journey from court to prison. The transport’s blacked-out windows make the prisoner invisible to the world outside, and he is assigned his number and stripped of all personal effects. With this removal of identity completed, the prisoner is then assigned a counsellor and mental health support worker to help him explore the roots of his criminality and come to terms with the very personality that has just been stripped from him.

A prisoner is encouraged to explore his faith and has regular contact with a large team of professionally-trained chaplains from a variety of religious denominations. But if he indulges too deeply in this faith he will find himself earning the attention of counter-extremism professionals who will monitor him and his reading habits closely and reprimand him or extend his sentence if they deem it appropriate. 

He is given a wide variety of learning opportunities and encouraged to attain qualifications that will hold him in good stead upon his return to the outside world. But he is also barred from accessing the greatest source of learning and information ever created: the World Wide Web. He instead has to make do with the finite materials that the already overstretched library and education services are able to provide, and risk a revocation of his learning privileges should a book be returned late, damaged or dog-eared.

Of course, those who believe their job is to punish and those who believe their job is to rehabilitate are frustrated with each other. They are not merely pursuing different objectives, they are actively cancelling out each other’s efforts. I am not the first to have noted the incompatibility of these goals. In 2019 James Bourke, the governor of H.M.P. Winchester, was reported in The Daily Telegraph criticising the culture of ‘fantasy’ surrounding what prisons are able to achieve, and how the present attempt to achieve both punishment and rehabilitation has left both goals unfulfilled. Although conditions in prison may be a horrifying deterrent for the ‘nice, white middle-class’ people who legislate for them, Bourke claims their structure and security means they can act as a ‘place of refuge’ for those from more troubled backgrounds. Meanwhile, the idea that a five-week prison sentence is enough time to successfully rehabilitate a convicted criminal after years, if not decades, of accumulated suffering and habitual criminality is derided as a ‘fantasy’. In attempting both to deter criminal behaviour through harsh punishment and to rehabilitate convicted criminals through reformative support, prisons seem doomed to fail at both goals.

This leads back to the condition of public debate about criminal justice reform. Despite the great amounts of ink spilled and spittle launched debating the conditions of our prisons, we never seem to have actually tackled this most fundamental of issues: what do we actually want our prisons to achieve? Do we want to rehabilitate, or do we want to punish? Neither goal is palpably ridiculous, but we can’t have our cake and eat it too. As it stands, the same arguments will continue to be repeated, drug-related deaths will continue to rise, and the one thing that we’ll all be able to agree on will be that we’re not happy with the state of our prisons.

In the BBC’s recent political thriller Roadkill, the Prime Minister, played by Helen McCrory, informs Hugh Laurie’s Justice Minister that ‘We lock people up. We’re famous for it. We like locking people up. It’s in our character’. For the foreseeable future, that seems unlikely to change.


British High-Seas’ Sovereignty: A ‘Fisherman’s Tale’[1]

Dr David Robinson is the Editor-in-Chief of The MHR and an Honorary Post-Doctoral Fellow of the University of Nottingham. In this Spotlight article, he discusses Britain’s shifting (shifty?) presentation of history over fishing rights…

For the past five years, sovereignty has been the dominant feature of British public discourse; in particular, its gradual erosion since the UK joined the European Community in 1973. Nowhere was the apparent decline in Britain’s ability to maximise its advantages and strategic resources more apparent than in the fishing industry. The nation’s once-proud and dominant trawler fleet, the argument was advanced, was dealt an iniquitous losing hand from the bottom of the deck, by faceless sleight-of-hand Eurocrat croupiers, in the guise of the Common Fisheries Policy.

Tensions arose recently when French fishermen attempted a blockade of Jersey, angry at what they saw as the late imposition of licencing requirements by Jersey under the new UK-EU Trade and Cooperation Agreement (TCA). When British Prime Minister Boris Johnson dispatched the Royal Navy, the popular British tabloid newspaper The Sun responded in fine form with the headline, ‘Take Sprat! Jersey fishing: Royal Navy warships see off blockade and send 56-strong fleet of French boats packing!’

For Brexiteers, the decimation of the British fishing industry has long been a direct result of the restrictions placed on British vessels fishing their own territorial waters. At the same time, foreign incursions were encouraged, and EU quotas allowed foreign boats to land far bigger catches.

Despite representing just 0.12% of the British economy, fishing was presented as a paradigmatic symbol of the European yoke. And Britain was not to be yoked, particularly by a continent that owed its freedom to the sacrifices made, twice, by the flower of British youth. Arch-Brexiteers frequently suggested parallels between the two world wars and Britain’s exit from the EU. Nigel Farage, for example, was pictured in the right-wing press standing next to posters advertising Christopher Nolan’s latest movie and exhorting Britain’s Remain-supporting youth ‘to go out and watch #Dunkirk’!

This was a powerful argument, as it offered an example of why Britain should leave the EU that went beyond simple economics. Broad Remain counter-arguments that focused on the loss of a few percentage points of GDP over a couple of decades were a brittle defence of EU membership, especially when British seadogs were left muzzled and tied to the pier stanchions of Grimsby and Hull, whilst Spanish and French armadas ruled the North and Irish Seas as well as our Channel waters. When the referendum cards went down, symbolic beat shambolic, ‘shambolic’ being the only way the Remain campaign can really be summarised. Besides, it was argued, leaving the EU would mean an economic resurgence for the British fishing industry. EU boats would be banished for good from UK territorial waters, Britain would reassert its traditional sovereignty and fish ‘a sea of opportunity’.

As it has transpired, however, negotiations with the EU have not concluded as well as British fishermen and Brexit voters had hoped. For a variety of reasons, which could be grouped under the heading ‘reality’, the British fishing industry’s hopes have foundered on the rocks of political and economic expediency, and they have taken to the media, post-Brexit, crying ‘betrayal!’.

That, however, is not the point of this article.

The point, instead, is the selective version of history successfully deployed to persuade British voters that their future lay in the past. Or, rather, in a return to a previous state of ‘sovereignty’ that never really existed. The broader lesson is that sovereign and economic interests are best served not by looking backwards to an imagined past, but by a realistic appreciation of a nation’s current tactical and geo-political strengths.

The historical irony is that the British government’s claims over sovereignty and territorial waters are the very opposite of the traditional case made by British fishermen and governments defending their interests since the nineteenth century. Central to this evolving story has been Britain’s fishing rivalry with Iceland.

In the late 1890s, the accepted limit of territorial coastal sovereignty was three miles. For reasons that went beyond fishing rights, Britain’s official position, defending its dominance of the high seas, was that the three-mile limit was ‘a principle on which we might be prepared to go to war with the strongest power in the world.’[2] So when Iceland got uppity around 1890 and banned foreign trawlers from fishing within four miles of its coast, bays, and fjords to protect dwindling fish stocks, a Royal Naval display of ‘gunboat diplomacy’ put them back in their place. This display of ‘might is right’ held until the early 1950s, by which time the Americans were mightier. When the United States asserted its right to defend its fisheries well beyond the three-mile limit, Iceland followed suit and, despite strenuous British diplomatic and legal efforts, Britain was forced to accept Iceland’s enforcement of a four-mile exclusion zone.[3]

Iceland, however, had played the smart game. Unable to depend on naval power, it leveraged its strategically important geographical position and its membership of NATO, coupled with an oil-for-fish agreement with the Soviet Union, to persuade the broader Western powers that, excuse the pun, there were bigger fish to fry. Concerned about a potential Icelandic pivot eastward, Britain was persuaded to acquiesce.[4]

Some turned to history to vent their frustration at Britain’s inability to enforce its ‘rights’ through military means. In 1955, the senior Foreign Office civil servant Jack Ward lamented a government reluctance to sink the Icelandic coastguard as ‘sadly lacking in the Nelson touch.’[5] The British also reminded Iceland that ‘they had been fishing in Icelandic waters since the early fifteenth century and therefore had traditional rights to fish there.’[6] Sixty years later, such arguments from history failed to impress British negotiators when Olivier Leprêtre, the president of the northern France fisheries committee, noted that ‘fishermen have always followed the fish. At the start of the last century, my great grandfather fished in the Thames estuary.’

The 1950s skirmish was the prelude to further conflict between the two nations, dubbed the Cod Wars. In 1958, Iceland declared a twelve-mile territorial limit, from which British trawlers were to be excluded. After two years of hostilities which saw the Icelandic coastguard, British trawlers and Royal Navy warships exchange boarding parties, the British backed down once again.

Emboldened by their successes, Iceland continued to extend its territorial claims: to fifty miles in 1972, and to 200 miles in 1975. This time, the threat to life was real, with several deliberate and accidental collisions between trawlers, British warships and the Icelandic coastguard. Britain even deployed the Royal Air Force in an intimidatory capacity. Good sense prevailed when attempts by Halifax aircraft to use their trailing cables (communication aerials) to rip the aerials from Icelandic trawlers, which were passing the positions of their British counterparts to the Icelandic coastguard, were aborted due to the serious threat to the lives of Icelandic seamen.[7]

Once again Iceland understood its tactical strengths, leveraging broader support by threatening to leave NATO and expel the US military from its key strategic base at Keflavik. These second and third Cod Wars were, again, concluded on terms favourable to Iceland. Britain grudgingly accepted the same 200-mile territorial limit it has, more recently, vociferously defended, citing the red line of ‘taking back control’ of its ‘historically’ sovereign waters.[8]

Interestingly, as in Britain more recently, internal politics played a significant part in Iceland’s approach to negotiations. Whilst many in the Icelandic government were minded to be less belligerent, the popular Icelandic Communist Party forced their hand by whipping up populist support for strongly opposing the British.[9] Here is one lesson: all external conflict is, to some degree, interconnected with internal politics.

In a recent ironic twist which has slipped beneath the radar, the Royal Air Force has just named one of its new Poseidon MRA1 maritime patrol aircraft Spirit of Reykjavik, in honour of the role played by the Icelandic capital and its people in enabling the Allied victory during the Battle of the Atlantic.[10] You couldn’t make it up.

Perhaps we might dub recent negotiations some kind of fourth Cod War, albeit less dramatic. Who won the latest round? In terms of internal politics, the powerful bloc of ‘Vote Leave’ politicians that now control the British government, hands down. Their recourse to history, however fallacious and disingenuous (fishy?), was strongly persuasive. Of course, there were many different motivations for leaving the EU. As a good friend of mine pointed out, ‘were I to know that the UK would, economically, sink into the sea, I’d still have voted Brexit!’ Repeating that to other Leave-supporting friends has resulted in enthusiastic agreement. One has to wonder if the future of the fishing industry was really their top priority.

The problem with this interpretation is that the long-standing, structural issues facing Britain’s fishing industry lie not in EU unfairness or belligerence, but in decades of underinvestment and neglect by the British government. European fishers long ago bought up British trawler quotas as long-term investments, whereas British owners, unsupported by their government, were happy to sell for a short-term killing and redeploy their boats in the North Sea oil industry. The British ‘fishing industry’ is as much an international trade in quotas, controlled by foreign interests and a few wealthy British families, as it is concerned with actually catching fish. This was no basis for a strong negotiating position with EU officials, who were never going to be influenced by arguments about Britain’s ‘historic sovereignty’ of the seas, arguments which have reversed over the course of a century, however persuasive they were to English ‘patriots’.

So, the trawlermen of Hull, Grimsby and Peterhead feel betrayed with a mostly as-you-were agreement on fishing with the EU. As John Lichfield puts it, ‘conclusion…You can win a political argument with lies and myths. Governing or negotiating with them is as useful as fishing without nets.’

Still, all may not be lost. An internal DEFRA email recently noted that, to protect its fishing waters, Britain has just ‘12 vessels that need to monitor a space three times the size of the surface area of the UK.’ Iceland was successful with fewer. In 1958, they ‘had no navy and the Icelandic coast guard had only seven small ships with one gun each.’[11] The difference is that Iceland had a better appreciation of where its strengths lay and a genuine commitment to its fishing industry. And, despite internal political disagreements, they were able to unite around their nation’s practical interests. Britain’s perceived aggression and intractability in the Cod Wars ‘united the Icelandic nation, from Left to Right’, despite their internal political fractures.[12] It seems this is a lesson Britain has failed to learn, as a similar belligerence has united a traditionally divided EU.

Domestic political victory is somewhat Pyrrhic when it fails to achieve its broader economic aims and leaves many of its supporters feeling betrayed. In 1951, the Icelandic press noted that Britain appeared to think it was still ‘living in the seventeenth or eighteenth century.’[13] In many ways, it seems it still does. In any reasonable assessment, Britain stands in the first rank of nations. It is a danger to itself and a loss to the broader international community that it continually insists on looking backwards to its past to find its future.


Bibliography

Primary Sources

The National Archives of the U.K.: Public Record Office, Ministry for Agriculture and Fisheries, MAF 41/674, Grey to Findlay, Oslo, 26 June 1911.

The National Archives of the U.K.: Public Record Office, Foreign Office, 371/116445/NL1351/186, F.O. draft submission, Sept. 1955.

Vísir (Icelandic Daily), 5 June 1958.


Secondary Sources

Gudmundsson, G. J., ‘The Cod and the Cold War’, Scandinavian Journal of History, 31 (2006), pp. 97-118.

Johannesson, G. T., ‘How “cod war” came: the origins of the Anglo-Icelandic fisheries dispute, 1958–61’, Historical Research, 77 (2004), pp. 544-74.

Kurlansky, M., Cod: A Biography of the Fish that Changed the World (New York, 1997).

Notes

[1] A boastful or exaggerated claim.

[2] The National Archives of the U.K.: Public Record Office, MAF 41/674, Grey to Findlay, Oslo, 26 June 1911.

[3] G. T. Johannesson, ‘How “cod war” came: the origins of the Anglo-Icelandic fisheries dispute, 1958–61’, Historical Research, 77 (2004), pp. 544-74, at pp. 545-9.

[4] Johannesson, ‘How “cod war” came’, pp. 548-9.

[5] T.N.A.: P.R.O., FO 371/116445/NL1351/186, F.O. draft submission, Sept. 1955.

[6] G. J. Gudmundsson, ‘The Cod and the Cold War,’ Scandinavian Journal of History, 31 (2006), pp. 97-118, p. 100.

[7] My thanks to Sqn. Ldr. R. P. Robinson (retd.) for his remembrances.

[8] Gudmundsson, ‘The Cod and the Cold War’, pp. 108-10.

[9] Gudmundsson, ‘The Cod and the Cold War’, p. 102.

[10] Once again, I am indebted to Sqn. Ldr. R. P. Robinson for drawing my attention to this point.

[11] M. Kurlansky, Cod: A Biography of the Fish that Changed the World (New York, 1997), p. 162.

[12] Johannesson, ‘How “cod war” came’, p. 564.

[13] Vísir (Icelandic Daily), 5 June 1958.

Contemporary Scottish Diplomacy: Some Recent and Distant Parallels

The Scottish government under the SNP has frequently employed diplomacy to help secure its strategic goals. Given the prevalence of this activity, this article explores how contemporary Scottish diplomacy compares to three other related forms of diplomacy, drawn from both the recent and the distant past, in the hope that these comparisons might pave the way for increased understanding of the Scottish government’s actions.

Keywords: Diplomacy, Paradiplomacy, International Relations, Scotland, Medieval, Modern, Brexit, Scottish Independence.

Biography: Jamie Smith is a fourth-year PhD History student at the University of Nottingham. His research explores Anglo-Scottish and Anglo-Welsh diplomacy between 927 and 1154, using comparative and interdisciplinary methods.

Scots went to the polls this May for the Scottish parliament election, re-electing the pro-independence Scottish National Party (SNP), which first came to power in 2007. Within the SNP manifesto was a section on ‘Scotland in the World’, outlining their foreign policy plans: increased funding for international development, the adoption of a feminist foreign policy, and closer ties to European countries. Diplomacy is nothing new for the Scottish government. On its website you will find a section on international relations, outlining ongoing policies. These include engagement strategies with numerous countries, ministerial visits, a strategic review of Scoto-Irish relations, the promotion of Arctic cooperation, running offices abroad, and a commitment to a UN agreement. Yet this enthusiasm for diplomacy seemingly runs contrary to the Scottish parliament’s actual powers. When this devolved parliament was established in 1998, international relations were reserved, remaining under Westminster’s remit. The existence of contemporary Scottish diplomacy has drawn both academic and non-academic comment. Writing for the website UnHerd, for example, Henry Hill has highlighted the tension between Scottish diplomacy and Westminster’s powers. Yet regardless of whether it is right or wrong, Scottish diplomacy is happening, and therefore we should look to debate its objectives, methods and outcomes, not just its existence. Far from unusual, the SNP’s diplomatic actions share common features with other case studies, from both the recent and distant past. Here I highlight three such parallels, exploring diplomacy conducted by other sub-state governments, by governments in exile, and by medieval figures. These comparisons present innovative ways of interpreting Scottish foreign policy, which future scholars can utilise and build on, paving the way for increased understanding of present-day Scottish politics.

Diplomacy conducted by other sub-state governments and entities is one obvious, relevant comparison. The Scottish government is certainly not unique in having foreign policies. Scholars have taken an interest in the diplomacy of towns, regions and other sub-state nations, such as Wales, Macao, US states and Japanese islands, to name but a few examples.[1] Whilst there are many potential routes to explore, the theme of differentiation might prove particularly fruitful. This is where a sub-state administration pursues diplomatic actions that contrast with or counter the foreign policy of the state within which it sits. For example, during the 1960s the Quebecois government was concerned that the Canadian foreign ministry was unrepresentative of French-speaking people and overlooking the Francophone world. Consequently, the government of Quebec addressed this deficit, forging its own ties with French-speaking countries, notably signing an agreement on education with France which it hoped would help overhaul its own education system.[2] Alternatively, in the 1980s the US government intervened in Nicaragua, supporting forces that were attempting to oust the ruling Sandinista government. To demonstrate opposition to their own government’s foreign policy and to show solidarity with Nicaragua, US cities and towns began twinning with Nicaraguan ones.[3] Today, 32 Nicaraguan municipalities are still twinned with a US counterpart.

The Scottish government under the SNP has pursued a similar policy in relation to the European Union (EU) following the Brexit referendum in 2016, which saw the UK collectively vote to leave the EU but the majority of Scots vote to remain. Whilst the UK government spent the past few years negotiating the state’s exit from the supranational institution, the Scottish government has sought to strengthen its ties with the EU. For instance, the Scottish 2020-21 budget declares that the government’s goals include ‘ensuring Scotland remains a valued and well-connected nation, despite the UK’s decision to leave the EU’ and to ‘demonstrate our ambition for independent membership of the European Union.’ Several types of diplomatic practices complement these goals, such as meetings between members of the Scottish government and European figures. In 2018 and 2019, Scottish ministers made 80 diplomatic trips to European capitals, with Nicola Sturgeon, the current Scottish first minister, meeting with the EU’s Brexit negotiator, Michel Barnier. They have also used the Scottish government’s international office in Brussels to network with European officials. One of the last events hosted there prior to the COVID-19 pandemic was a Burns Night VIP supper, attended by representatives from Croatia and Germany, ‘as well as MEPs and other high level EU contacts.’ Given the focus on the EU, it is perhaps no coincidence that, with a budget of £2,079,000 a year, the Brussels office receives more funding than any of Scotland’s other international offices. Reaching out to the EU, both through rhetoric and diplomatic actions, is intended to smooth an independent Scotland’s accession into the bloc. Whilst the EU was cool towards Scottish membership during the referendum on independence in 2014, the SNP hopes that its overtures will encourage the EU to take a more positive stance on the issue. Regardless, this diplomacy can be seen as another example of sub-state differentiation, which seeks to counter the UK government’s Brexit policy and to ensure that Scotland can resume its membership of the EU in the near future.

Another comparison can be made with governments in exile, and the common desire for legitimacy. As noted above, establishing international offices and meeting foreign dignitaries are two strands of Scottish diplomacy. Including the aforementioned Brussels office, there are eight of these based around the world, often in strategically important locations such as Beijing and Washington. They operate as pseudo-embassies, advocating Scottish interests and strengthening ties with host countries. As for the summits and conferences that Scottish officials have taken part in, these have not only been with representatives of the EU and its member states. A joint heritage initiative involving the Scottish government and five global heritage sites provided Alex Salmond, at the time first minister of Scotland, with the opportunity to meet key Chinese officials and the then Indian prime minister, Manmohan Singh.[4] Likewise, during a trip to North America in February 2019, Nicola Sturgeon met with UN officials as well as the governor of New Jersey.

These tactics echo those employed by governments in exile. One such example of this is the Tibetan Government in Exile, set up in northern India by the Dalai Lama in 1960, with the goal of restoring independence to Tibet following its annexation by China. Like the SNP, it has made diplomacy a central part of its policy plan, establishing pseudo-embassies in eleven cities and organising pseudo-summits between the Dalai Lama and foreign leaders.[5]

Whilst outwardly different, the SNP-controlled Scottish government and the Tibetan Government in Exile have similar goals. Both groups seek to assert themselves as the legitimate government of a nation that is currently controlled by another government (although even the most hard-line SNP member would struggle to make the case that Scotland’s position within the UK is exactly comparable to the Chinese occupation of Tibet). Thus, these groups appropriate the symbols of statehood. Since diplomacy is traditionally seen as under the remit of state governments, these groups increase their legitimacy by practicing it.[6] The significance of their diplomatic acts is perhaps best demonstrated by their opponents’ responses. After Prime Minister David Cameron met the Dalai Lama in 2012, the Chinese government condemned the meeting, stating: ‘We ask the British side to take the Chinese side’s solemn stance seriously, stop indulging and supporting “Tibet independence” anti-China forces [sic].’ Similarly, the UK government has attempted to curtail or limit Scottish diplomacy. For example, then foreign secretary Jeremy Hunt stopped the foreign office from providing consular support for Sturgeon’s trips abroad on the grounds that she was using them to drum up support for independence. The overlap, then, between these groups, their goals, and the responses to their diplomatic actions suggests that comparisons between Scottish diplomacy and governments in exile could provide important insight.

Finally, modern Scottish diplomacy warrants comparison with examples of medieval diplomacy. Contrary to more traditional views, scholars argue that diplomacy in the modern world is no longer the sole preserve of the state. Rather, globalisation has caused diplomacy to fragment, with numerous non-state entities forging their own foreign policies. These include the sub-state governments and governments in exile mentioned above, as well as supranational institutions, such as the EU, and other non-state actors, both legitimate and illicit, such as multinational corporations and terrorist networks. In line with this, scholars such as John Watkins and Jakub Grygiel have called for comparisons between the increasingly ‘post-state’ diplomacy of the modern world and interactions in the pre-state period.[7] Medieval diplomacy was similarly multifaceted, not solely restricted to kings, but involving popes, bishops, magnates, heirs, claimants and exiles, amongst many others.

Whilst there are many possible comparisons, medieval earls are perhaps the most relevant to the modern Scottish government, particularly with regard to the strategy of counterbalancing. Earls were the medieval equivalent of sub-state governments, ruling over territory but subordinate to another sovereign power, in their case a king rather than a state. One such example, who is discussed in my thesis, is Earl Ælfgar of Mercia, a prominent magnate during the reign of King Edward the Confessor of England (1042-66). This was a period of instability within England, involving both intra-noble division and disputes between the magnates and the king. In 1055, Ælfgar, who at the time was earl of East Anglia, was outlawed by the English court. In response, he went to King Gruffydd ap Llywelyn of Wales, and together they led a joint attack on Hereford. The English court caved in to this military pressure and reinstated Ælfgar as an earl.[8] K. L. Maund suggests Ælfgar had already reached out to Gruffydd prior to 1055.[9] Whether the alliance had been prearranged or not, Ælfgar certainly built on it, marrying his daughter to Gruffydd, an event usually dated to c. 1057.[10] This seemingly put him in a good position. When Ælfgar, now earl of Mercia, was outlawed once more in 1058, we are told he was later reinstated, thanks again to Gruffydd’s military backing.[11]

During a period of intra-English disunity, Earl Ælfgar tied himself to a neighbouring king, securing a military ally and safe haven to counterbalance the domestic division he faced. The Scottish approach to the EU follows a similar rationale. Independence would politically and economically divide Scotland from the rest of the UK, with the potential for a decline in trade being a major cause for concern. The SNP’s solution is EU membership. Fiona Hyslop, the Scottish Cabinet Secretary for Economy, Fair Work and Culture, responds to concerns about an independent Scotland’s trade by pointing to the Republic of Ireland: ‘Through membership of the EU, independent Ireland has dramatically reduced its trade dependence on the UK, diversifying into Europe and in the process its national income per head has overtaken the UK.’ The aforementioned Scottish diplomacy with EU leaders aims to secure Scotland’s future membership of the EU, and thus another market to counterbalance losing access to the UK one. The main difference between this and Ælfgar’s diplomacy is that he was forced to seek foreign help by intra-English disputes, whilst the SNP seeks a self-imposed division from the UK, which it will then need to counterbalance.

Evidently, whether we like it or not, Scottish diplomacy is a prominent feature of the contemporary international landscape. Consequently, I have highlighted here three relevant comparisons which could improve our understanding of it: the diplomacy of other sub-state governments, governments in exile, and medieval individuals, such as earls. Other modern sub-state governments, such as the Quebecois government, are perhaps the most applicable to the Scottish government given their similar constitutional positions, although all three approaches could prove useful for scholars of politics and international relations over the next few years. Having been re-elected, Nicola Sturgeon plans to call another independence referendum once the COVID-19 pandemic has passed, meaning that Scottish diplomacy will likely remain an important subject of analysis, both during a future referendum campaign and in the early days of an independent Scotland.


Bibliography

Primary Sources

‘Anglo-Saxon Chronicle’, in English Historical Documents: c. 1042-1189, ed. by D. C. Douglas and G. W. Greenaway (London, 1953), pp. 107-203.

Orderic Vitalis, The Ecclesiastical History, ed. and trans. by M. Chibnall, 6 vols. (Oxford, 1969-80).

Secondary Sources

Clarke, A., ‘Digital Heritage Diplomacy and the Scottish Ten Initiative’, Future Anterior: Journal of Historic Preservation, History, Theory and Criticism, 13 (2016), pp. 51-64.

Criekemans, D., ‘Regional Sub-State Diplomacy from a Comparative Perspective: Quebec, Scotland, Bavaria, Catalonia, Wallonia and Flanders’, The Hague Journal of Diplomacy, 5 (2010), pp. 37-54.

Grygiel, J., ‘The Primacy of Premodern History’, Security Studies, 22 (2013), pp. 1-32.

Jakubec, P., ‘Together and Alone in Allied London: Czechoslovak, Norwegian and Polish Governments-in-Exile, 1940-1945’, The International History Review, 42 (2020), pp. 465-84.

Maund, K. L., ‘The Welsh Alliances of Earl Ælfgar of Mercia and his Family in the Mid-Eleventh Century’, in R. A. Brown (ed.), Anglo-Norman Studies XI (Woodbridge, 1988), pp. 181-90.

McConnell, F., Moreau, T., & Dittmer, J., ‘Mimicking State Diplomacy: The Legitimizing Strategy of Unofficial Diplomacies’, Geoforum, 43 (2012), pp. 804-14.

Paquin, S., ‘Identity Paradiplomacy in Québec’, Québec Studies, 66 (2018), pp. 3-26.

Tavares, R., Paradiplomacy: Cities and States as Global Players (Oxford, 2016).

Watkins, J., ‘Toward a New Diplomatic History of Medieval and Early Modern Europe’, Journal of Medieval and Early Modern Studies, 38 (2008), pp. 1-14.

Williams, A., ‘Ælfgar, earl of Mercia’, Oxford Dictionary of National Biography, 23 September 2004.


Notes

[1] D. Criekemans, ‘Regional Sub-State Diplomacy from a Comparative Perspective: Quebec, Scotland, Bavaria, Catalonia, Wallonia and Flanders’, The Hague Journal of Diplomacy, 5 (2010), pp. 37-54.

[2] S. Paquin, ‘Identity Paradiplomacy in Québec’, Québec Studies, 66 (2018), pp. 19-21.

[3] R. Tavares, Paradiplomacy: Cities and States as Global Players (Oxford, 2016), pp. 234-35.

[4] A. Clarke, ‘Digital Heritage Diplomacy and the Scottish Ten Initiative’, Future Anterior: Journal of Historic Preservation, History, Theory and Criticism, 13 (2016), p. 56.

[5] F. McConnell, T. Moreau and J. Dittmer, ‘Mimicking State Diplomacy: The Legitimizing Strategy of Unofficial Diplomacies’, Geoforum, 43 (2012), pp. 806-08.

[6] P. Jakubec, ‘Together and Alone in Allied London: Czechoslovak, Norwegian and Polish Governments-in-Exile, 1940-1945’, The International History Review, 42 (2020), p. 468.

[7] J. Grygiel, ‘The Primacy of Premodern History’, Security Studies, 22 (2013), p. 2; J. Watkins, ‘Toward a New Diplomatic History of Medieval and Early Modern Europe’, Journal of Medieval and Early Modern Studies, 38 (2008), p. 5.

[8] ‘Anglo-Saxon Chronicle’ (Henceforth ‘ASC’), in English Historical Documents: c. 1042-1189, ed. by D. C. Douglas and G. W. Greenaway (London, 1953) (Henceforth EHD II), pp. 132-34.

[9] K. L. Maund, ‘The Welsh Alliances of Earl Ælfgar of Mercia and his Family in the Mid-Eleventh Century’, in R. Allen Brown (ed.) Anglo-Norman Studies XI (Woodbridge, 1988), p. 185.

[10] Orderic Vitalis, The Ecclesiastical History, ed. and trans. by M. Chibnall, 6 vols. (Oxford, 1969-80), 2, pp. 138-39; A. Williams, ‘Ælfgar, earl of Mercia’, Oxford Dictionary of National Biography, 23 September 2004: https://doi-org.ezproxy.nottingham.ac.uk/10.1093/ref:odnb/178. Accessed 13 April 2021.

[11] ‘ASC’, EHD II, p. 136.

Election Day and Britain’s Existential Crisis


In this Spotlight article for Election Day in the UK, David Robinson discusses how the past forty years of Britain’s economic and social history have led to a divided nation and a present-day existential crisis, which no main political party seems willing to discuss.

Across the UK, local elections are being held today, May 6th. The electorate can be forgiven for not having noticed: there has been little in the way of campaigning. This was, perhaps, to be expected. The opposition parties may have felt somewhat encumbered by a perceived duty not to interrupt the ongoing vaccine roll-out, whilst the lack of oppositional challenge suits the incumbent Conservatives, so why say anything when saying nothing may well preserve the status quo?

Yet, there is so much that remains unresolved from the politics of the past few years. The rise of European populism, the election of Donald Trump, and Brexit are understood to have been more than electoral choices in the normal sense. They were also expressions of anger and frustration by swathes of people who feel a lack of dignity in insecure and unfulfilling work that leaves them economically disadvantaged, left out of mainstream society, and looked down upon by university-educated, home-owning ‘elites’ with emotionally satisfying and financially rewarding careers. As I will argue, even had they campaigned hard, there is little any of the main parties could have said that would have gone any way towards addressing these problems.

Was it not ever thus? The ‘haves’ and ‘have nots’? Well, yes, but the less advantaged have, traditionally, at least had some political representation. Today, many feel that it is not they who have deserted the parties they have traditionally supported, but the latter who have deserted them in moving towards a form of identity politics and attempting to appeal more to the ‘liberal elite’ than the traditional ‘working class’.

Alongside this, the workplace has fundamentally changed. Over forty years, workers have had to exchange occupations that might not have paid as well as others but were secure, jobs which came with a justified sense of dignity, a sense that one was contributing meaningfully to society, for menial, insecure and poorly paid work in warehouses and call centres. In contemporary Britain, there is a sense in which half the population are the working poor, engaged in administering, collating, and delivering consumer goods to the other half. The result is the anger and vitriol that has split the UK and other democracies, and that has come as a shock and surprise to those who had not even realised the extent of this disaffection.

The Brexit and Trump campaigns fully understood and leveraged these feelings. Rather than the conventional political strategy of trying to unite the electorate behind a set of ideas, the cracks in society were deliberately and crudely crowbarred open more widely, encouraging the view that the nation was divided into two diametrically opposed halves. This strategy, combined with the anger and hopelessness of so many, is nothing short of an existential threat to society and democracy. While the existential nature of this threat may have been subsumed beneath the emergency of the pandemic, it remains ready to re-emerge in full force as restrictions are lifted and the gap between the haves and the have-nots becomes visible once more.

But how did we get to this position of disaffection and division? What are the political forces and ideological concepts which brought us here?

In the late 1970s, there was a fundamental shift in ideas about how economies ought to function. More than a decade of stagnant growth, labour conflicts, and high unemployment opened the door to a set of ideas long held by some but rejected up to that point by governments of all stripes. Since the inter-war years, economists such as Friedrich Hayek and, later, Milton Friedman had argued for less central decision-making and planning by governments, in favour of ‘the market’. The mechanisms of supply and demand, and competition between profit-motivated companies, would, they argued, provide what people wanted far more efficiently than traditional government planning and control.

Furthermore, the idea of governments and civil servants working for the ‘public good’ was rejected. Instead, it was maintained, such public offices were used by their occupants to build their own careers and operate in their own self-interest.

All this would be swept away. With money markets liberalised, companies and entrepreneurs would have easier access to capital with which to supply the wider range of goods and services that society demanded and from which it would be free to choose.

The role of governments and central banks was now…to do nothing! The market was a more efficient arbiter of what society wanted than government bureaucracy and would provide higher quality at a lower cost.

This idea has, over the past forty years or so, been extended from consumer goods and services to public services traditionally provided by government through taxation. Whether it be the products of the steel industry or care for the elderly, the market would now provide, not government.

Such radicalism was initially implemented by the centre-right governments of Margaret Thatcher and Ronald Reagan, but its full flowering, arguably, came under the centre-left governments of Tony Blair, Bill Clinton, and Gerhard Schröder. Importantly, the challenges we face as a result of such ideologies are not simple left-right debates. These ideas have become ubiquitous across the political spectrum.

Have such policies worked? They have not. It is impossible to argue that there is now more equal access to high quality public services, health care, education, or care for the vulnerable. Access to high quality services such as these is largely dependent on income. Even organisations such as the IMF, at the heart of driving such policies for the last five decades, agree that these ideas have been ‘oversold’.

The recently retired Governor of the Bank of England, Mark Carney, has pointed out a more subtle problem: there has been a separation of ‘moral sentiment’ from economics. We no longer debate what is ‘right’; there is no political discussion of what we think ought to be provided in society regardless of increased costs, or of the wider ‘value’ of something beyond its economic value. The value of everything has become the price of everything. If the market has not provided something, then it cannot be provided.

Once again, this is not a left-right debate. Prior to this turn to the market in the late 1970s, the importance of debating ethical questions as part of economic decision-making was seen as a fundamental part of political action across the ideological divide, going back to the eighteenth century. Over the last few decades, however, governments of different political persuasions have justified inequalities, and their own lack of action in remedying them, by resorting to the argument that the market is the arbiter and expression of democratic choice: that nothing can be done in the face of the free market.

More generally, the turn towards market outcomes over the past four decades has determined that ‘success’, one’s status and reward, is mainly measured by economic outputs. It is argued that those who earn the highest incomes are those who have demonstrated the talent and hard work to best satisfy the demands of the market, and that it is right, therefore, that ever larger economic rewards and prestige are concentrated in their hands. The corollary is that those who have less have proven themselves less able and less hardworking in satisfying the same demands, and are thus equally deserving of their lower economic and social status.

It has become ubiquitous for political leaders to claim they have implemented, or at least are striving to implement, a ‘meritocracy’: a society in which one’s position, what might broadly be termed one’s ‘success’, is due to a combination of one’s hard work and ability; in short, one’s merit.[1]

Whilst leaders acknowledge that differences of class, gender, birth, race and so forth have presented, and continue to present, barriers to a level playing field on which all can compete, and ‘compete’ is the relevant word here, there is a general acceptance of a vision of the future in which such barriers continue to be dismantled. Once this is accomplished, it is argued, a ‘just’ society would finally be realised, in which one’s place in society is what one ‘deserves’. Certainly, such a society, politicians maintain, would be far ‘fairer’ than, say, an aristocracy or a class-bound society in which benefits are handed down from one generation to the next.

Indeed, so ‘common sense’ does this argument seem that it is hard to find many across the political spectrum, or indeed members of the public, who would disagree with its prima facie conclusions.

And yet, there is a dark side to this argument. We have become so used to the idea of ‘competition’ in society, for jobs, income, prestige and so on, that we have accepted the inevitable idea of ‘winners’ and ‘losers’, with all its accompanying, morally freighted baggage. To be a winner is a ‘good thing’. But to be a ‘loser’ is to have one’s efforts consigned to some kind of social dustbin in which one has ‘failed’. If you have not achieved what you wanted to, that is your fault, the result of some combination of a lack of ability and effort.

Of course, in a meritocracy, it might be argued that such conclusions are harsh, but fair. If one has ‘won’, one deserves the laurels. If you have not, well, it’s a tough world out there. Next time, work harder, be smarter. And teach the lesson to your children.

If one’s failure is due to the lack of a level playing field, the fact that public school pupils with parents who have connections in the professional world have a far better chance than those from a council estate, well, that only shows how important it is to remove such barriers and push headlong towards a true equality of opportunity.[2]

Again, so common sense does this seem that it is hard, on the surface, to argue against it. There will always be inequality to some degree. Let it be justified on the basis of hard work and ability, the merits of the individual.

But is this really the case? Are ability and effort really, entirely, self-earned merits? If I decide to encourage and extol the virtues of ‘achievement’ in my young children, good marks at school, self-sacrifice, hard work, a pathway towards higher education and a high-income, ‘prestigious’ career, are they not more likely to achieve these goals than children who are not so facilitated and encouraged? It would seem a positive outcome is, in fact, largely a combination of predisposed genetics and the environment in which they have, arbitrarily, found themselves.

Equally, my daughter may find herself with fame, fortune, and prestige as a top professional footballer. But, again, this is only because she happens to live in a society that extols the virtues of playing football well. Had she happened to have found herself by accident of birth in the thirteenth century, her ability to curl a ball accurately with the outside of either foot, whether by genetic predisposition or the effects of her environment, would have stood for little. Even the hard work she puts in to develop these skills is, to a significant extent, a function of an environment that encourages such effort; one in which not all children are lucky enough to find themselves, regardless of their intrinsic ‘talent’.

To a significant extent, therefore, meritocratic success is the result of unearned genetics, environment, and a set of skills, either naturally acquired or encouraged and taught, that happen to be desirable and rewarded by society at a particular point in time. One might say, a morally arbitrary accident of birth.

Even worse, those who ‘fail’ to achieve ‘success’ are told this is the position they deserve. Had they, in a meritocracy, worked harder and been smarter, they would have done better. It was not always thus. Prior to the turn towards market outcomes as the designator of ‘success’, labour that may not have accrued substantial economic rewards was still considered valuable and as contributing to the common good. Coal miners and steel workers knew that barriers of class and arbitrary birth hindered their progression towards higher-income professions, but they also knew this was not their fault or due to their own lack of worth and merit. The dignity inherent in their labour, their contribution and value to society, was apparent and they rightly proclaimed it thus. And they were supported in this belief by representative political and labour organisations.

For unrepresented workers today, whose traditional jobs have been overtaken by technology, and who find themselves valued by the market in terms of insecure, minimum-wage, zero-hours contracts, such dignity and self-respect are harder to assert.

It is this combination of the turn to market outcomes as the arbiter of all value and an apparent ‘meritocracy’, in which one’s position, for good or ill, is what one ‘deserves’, that is the cause of the anger and resentment in society. This has been exacerbated by the lack of political representation for people who are told the cause of their disaffection is actually their own lack of hard work and ability.

Importantly, these market-governed judgements are completely uninterested in questions of ‘right or wrong’.

Milton Friedman, for example, one of the Nobel Prize-winning architects of free market ideology, accepted that the market should conform to the ‘basic rules of the society, both those embodied in law and those embodied in ethical custom’ (my emphasis).

The question is, of course, how do we decide what is classed as an ‘ethical custom’? Surely, this is decided through political discourse and debate. But the intertwined ideologies of the market and meritocracy are like the air we breathe: invisible, and so accepted as facts of everyday life that they require no debate.

The answer to the anger and division in society is not a question of ‘policies’. It is one of understanding how we got to this place and the structural ideologies that paved the way. Fundamentally, there is a need for robust public and political debate about the society we want. Only then can we decide on the policies that will get us there.

The public’s views on public ownership, inequality, and higher taxes, particularly for the top 1% of earners, show that the UK is, broadly, ready for reform and change. Perhaps it’s time to be angry, but together rather than factionally. It’s certainly well past time to engage politically and demand that what we want as a society should not be solely determined by ‘the market’, but by discussion.

Depressingly, this is not a debate that will intrude on today’s elections.


Notes

[1] In the following sections, I draw heavily on M. J. Sandel, The Tyranny of Merit: What’s Become of the Common Good? (London, 2020). Other works on a similar theme include: D. Goodhart, Head, Heart, Hand: The Struggle for Dignity and Status in the 21st Century (London, 2020); J. Littler, Against Meritocracy: Culture, Power, and Myths of Mobility (Abingdon, 2018); M. A. P. Bovens & A. C. Wille, Diploma Democracy: The Rise of Political Meritocracy (Oxford, 2017).

[2] What is interesting is how elite children use meritocratic language to obscure their origins. LSE’s new study shows how our fetishisation of meritocracy makes privileged people frame their lives as an uphill struggle: (https://www.theguardian.com/commentisfree/2021/jan/18/why-professional-middle-class-brits-insist-working-class). However, as Friedman and Laurison point out, ‘People from working-class origins do sometimes make it into elite jobs, but it is rare; only about 10% of people from working-class backgrounds (3.3% of people overall) traverse the steepest upward mobility path…Origins, in other words, remain strongly associated with destinations in contemporary Britain.’: (S. Friedman & D. Laurison, The Class Ceiling: Why it Pays to be Privileged (Bristol, 2020), p. 13.)


Value and Values


Dr David Civil is a Research Fellow at the Jubilee Centre in the School of Education at the University of Birmingham. His PhD research explored the concept of meritocracy in post-war Britain’s intellectual politics.

Since their inauguration in 1948, the BBC Reith Lectures have provided historians with an annual window into the intellectual preoccupations of the post-war world: from the impact of quantum and atomic theory in 1953, with Robert Oppenheimer’s Science and Common Understanding, to questions of racial, gender and national identity in 2016, with Kwame Anthony Appiah’s Mistaken Identities. A potential Reith lecturer searching for a topic to explain the contemporary moment would be spoilt for choice: the Covid-19 pandemic, the climate crisis, rampant economic inequality and social injustice, the rise of populism, big tech, the existential challenge to democracy. The list, it seems, is endless. On the surface, then, the selection of Mark Carney, the Canadian central banker and former Governor of the Bank of England, feels like an odd choice. Carney has sat at the apex of a financial system assailed on all sides and held responsible, by a wide variety of politicians and commentators as well as large swathes of the public, for creating or exacerbating many of the problems listed above. The title of his series, however, ‘How We Get What We Value’, unites the vast majority of these crises and, in doing so, like all good Reith Lectures, touches on one of the fundamental issues of the post-Cold War age.

At the heart of Carney’s thesis is the idea that financial value has trumped human values as developed nations morph from market economies into societies where the market rules. The free market, Carney claims, has become the organising framework not just for economies, but for broader human relations as its reach extends further into civic spaces and family life. Across a variety of sectors ‘citizens’ have been replaced by ‘service users’, with perilous consequences for our civic sphere. Whether manifested in concerns about the outsourcing of public services to private providers or the growing privatisation of public spaces, the so-called ‘invisible hand’ has come to exercise a visible and forceful grip.

Within these market societies the idea of subjective value is now hegemonic. Whereas in the past thinkers as diverse as Aristotle, Karl Marx and Adam Smith held that the value of a product derived from how that product was produced, neoclassical economists in the early twentieth century shifted the axis of value theory away from labour and towards the consumer. A product or service was no longer deemed valuable because of the costs that had gone into making or providing it; instead, value was to be decided by whether individual consumers were willing to pay for it. Value was no longer thought to lie in the sweat of the labourer but in the eye of the beholder. In many ways this was a democratic shift: value was to accrue to those who could satisfy millions of individual preferences as reflected in the free market place.

It was not, however, without consequences, and Carney identifies a number of risks associated with this rise of subjective value. For example, individuals are not always the rational decision-makers assumed by neoclassical economic theory and often value the present more than the future. This ‘tragedy of the horizon’ has made solving issues like climate change more difficult. The catastrophic costs of a global issue like the climate crisis are felt beyond the traditional time horizons of most actors – imposing a cost on future generations that the current generation has no direct incentive to fix. As Carney has noted elsewhere, the ‘horizon for monetary policy extends out to two or three years.’ For ‘financial stability it is a bit longer, but typically only to the outer boundaries of the credit cycle – about a decade.’ In other words, once climate change becomes a defining issue for financial stability, it may already be too late. More worrying, Carney claims, is the ‘drift from moral to market sentiments.’ This ‘flattening of values’ corrodes those values which have tended to exist outside of the market (e.g. civic virtues) and in the process undercuts the social foundations upon which any economic activity fundamentally relies. In short, anything in our society that is not priced, not deemed financially valuable, is not valued. Nowhere is this more starkly visible than in the essential work of the care sector where, because the value of care is difficult to measure, pay remains low and conditions poor. Care workers, therefore, remain the victims of a damaging tautological spiral: because their labour has been historically undervalued, they are not paid a lot, and because they are not paid a lot, their labour is not seen as valuable.

The message and the messenger of the 2020 Reith Lectures are emblematic of the growing intellectual consensus in favour of a ‘social reset’. Whether embodied in Prime Minister Boris Johnson’s rather vague slogan of ‘Build Back Better’, the World Economic Forum’s ‘Great Reset’ or the progressive Left’s ‘Green New Deal’, the desire for a fundamental reappraisal of the global economy is shared, admittedly to differing degrees and to varying ends, across the ideological spectrum. While Carney’s lectures serve as a symbol of this particular conjuncture, his concerns are nothing new. The idea of a parasitic free market is a common theme of communist and socialist texts, while more Carney-like warnings are found in a variety of liberal and social democratic positions. Even more surprising, perhaps, is to find traces of Carney’s thesis amongst some of the neoliberal thinkers whose intellectual output in the early-to-mid twentieth century did so much to economically and philosophically support the rise of the market in the 1980s.

For example, Friedrich von Hayek, the Austrian economist, philosopher and author of the influential tract The Road to Serfdom, argued in 1960 that

A society in which it was generally presumed that a high income was proof of merit and a low income of the lack of it, in which it was universally believed that position and remuneration corresponded to merit, in which there was no other road to success than the approval of one’s conduct by the majority of one’s fellows, would probably be much more unbearable to the unsuccessful ones than one in which it was frankly recognised that there was no necessary connection between merit and success.[1]

For Hayek, in any free society income should reflect the value of an individual’s goods and services and have nothing to do with merit, virtue or the moral importance of their contribution. In a similar vein Frank Knight, the American anti-New Deal economist and later a colleague of Hayek’s at the University of Chicago, argued in the early 1920s that an individual’s income or market value should not be associated with their social contribution.[2] Serving demand in the market is simply a matter of satisfying the wide range of tastes and desires people happen to have at that particular moment in time. The ethical significance of satisfying them, however, depends on their moral worth.

Evaluating this worth involves making contested moral judgements which go beyond the discipline of economics. The philosopher Michael Sandel illuminates this distinction by considering the character of Walter White, the teacher, father and drug-dealing kingpin of the Emmy award-winning drama Breaking Bad. Most viewers would agree that White’s contribution as a teacher far exceeds his contribution as a drug dealer. ‘Even if meth were legal’, Sandel argues, ‘a talented chemist might still make more money producing meth than teaching students.’ But this does not mean that a ‘meth dealer’s contribution is more valuable than a teacher’s.’[3] In a similar vein, few would have argued that Captain Sir Tom Moore’s fundraising efforts, reaching £33 million in total, would have represented less of a contribution had he only met his initial target of £1,000. In this sense the value of his effort was recognised in the civic or moral character of his actions rather than in their monetary value.

Context is important here. Hayek, for example, had little influence in 1960, the start of a decade in which technocratic desires to rationally plan economic activity reached fever pitch and the free market remained a marginalised concept. Hayek’s primary concern in distinguishing between merit and value was to secure the legitimacy of free market inequalities. This legitimacy, he claimed, would be tarnished if those at the top were not only rich but also considered morally superior. As Carney’s lectures make clear, however, these warnings went unheeded as price and value increasingly became conflated. Those individuals with high incomes also came to possess greater status, power and, perhaps most damagingly, moral superiority. It does not therefore require much of an intellectual leap to consider how Hayek’s concerns have played out in the last decade of political destabilisation. Amidst the Covid-19 pandemic, however, it is clear that these market-generated inequalities are suffering a legitimation crisis. In this sense Carney’s intervention has fired the starting pistol on what Mariana Mazzucato has described as a great contested debate about value.[4]

It is clear that this debate cannot be overly reliant on the discipline of economics, a discipline in which the moral questions highlighted by Knight have been subsumed beneath technical exercises in applied rationality. The market appealed to politicians and policymakers precisely because it eschewed these contested judgements, pushing questions of ‘who gets what?’ onto an abstract, impersonal force. There was no longer a contested debate about the morally right or wrong course of action but a mechanistic discussion about the economic costs or benefits of a particular policy choice. In its place, the debate will be heated. As David Robinson has outlined in this journal, in its worst form it will descend into an irrelevant culture war. Yet it appears that a tentative consensus is forming as those across the political spectrum recognise that key workers deserve pay, status and conditions beyond those assigned by the market.

A great debate about values entails radical consequences for the shape of higher education in Britain, a sector which has too often, in the words of David Manning, disengaged ‘from the virtues of scholarship to perform research for market value.’ This is particularly true of disciplines like the Humanities, where value is difficult to measure and demonstrate. Shifting away from crude metrics, however, should not be used as an opportunity to completely dismantle mechanisms designed to deliver accountability and ensure fairness. Instead it represents an opportunity for all of us in the Humanities to illuminate the issues, challenge long-standing assumptions and help to construct a new social contract which places human, rather than merely financial, values firmly at the centre.


Bibliography

Hayek, F. A., The Constitution of Liberty (London, 1960).

Knight, F. H., The Ethics of Competition (New Brunswick, NJ., [1923] 1997).

Mazzucato, M., The Value of Everything: Making and Taking in the Global Economy (London, 2017).

Sandel, M. J., The Tyranny of Merit: What’s Become of the Common Good? (London, 2020).


Notes

[1] F.A. Hayek, The Constitution of Liberty (London, 1960), p. 98.

[2] F.H. Knight, The Ethics of Competition (New Brunswick, NJ., [1923] 1997), p. 46.

[3] M.J. Sandel, The Tyranny of Merit: What’s Become of the Common Good? (London, 2020), pp. 138-39.

[4] M. Mazzucato, The Value of Everything: Making and Taking in the Global Economy (London, 2017).


The “Woke” Bite Back!


David Robinson takes a light-hearted look at the shifting reputation of the Humanities academic but concludes there are serious reasons for their voices to be heard…

In a pre-pandemic life I was asked at a social event the obligatory question about what I do. I replied that, having been made redundant after 25 years in the business world, I went to university to study undergraduate history. Subsequently, having discovered a love for research and writing, I completed my MA and, recently, received my PhD. “And what will you do with that?”, asked my interlocutor. “Well”, I replied, “I’ve put it in a frame on the wall.”

A flippant response, but I was heading off the inevitable: a request to defend spending seven years achieving something that could not be directly monetised; not, anyway, in the sense that my previous life as a Commercial Director could be leveraged for financial gain. Of course, I could secure a position as a university academic, but in a climate where even entry-level positions are given to module convenors with several years’ experience and a slew of books and articles, that is unlikely.

Hold your sympathy, though. I have my dream job! I was recently offered the opportunity to edit a journal I co-founded three years ago: the one you are presently reading, The MHR.

“Ah!”, exclaimed my dinner-party host, “I expect that pays quite well.” “Precisely nothing!”, I replied. “Oh! Well, that’s the problem”, they responded, “there’s not much one can do with those humanities degrees.” We both shook our heads knowingly and they wandered off to find someone with a proper job to talk to.

This was an experience to which I have become accustomed. Perhaps, you have too? Why study History? Or even worse, Art History or Philosophy? American Studies? What can you really do with these degrees? What do they even mean?

When my teenage daughter suggested she, too, might be interested in studying history, that she had enjoyed our visit to the National Gallery in London and our subsequent lunch discussion of the ways in which gender roles had been assigned in art, someone quietly reminded her that “what your Dad does isn’t really history.” Quite. She might be advised to “do a proper subject.”

Such disdain does not seem justified. In fact, the reverse. In the twentieth century, thirteen of the nineteen British Prime Ministers awarded a degree were humanities graduates. More recently, P.P.E. (Philosophy, Politics, and Economics) is the humanities degree of choice for an astonishing proportion of those who rule over us and those who explain how we are ruled, whether senior politicians or prominent political journalists. And what about those who implement policy, the civil service? Andrew Greenway, a former senior civil servant who writes regularly for Civil Service World, argues that P.P.E. is ‘not necessarily the golden ticket to the top of the political and administrative elite’. In his list of post-war Prime Ministers and Cabinet Secretaries, only three studied P.P.E. Still, between the P.M. who, as Greenway puts it, ‘chooses the route’ and the Cabinet Secretary who ‘drives the car’, of the eighteen he lists with a degree, thirteen studied history or a mix of classics and philosophy. The outliers were a few lawyers, economists, and a chemist.

It seems, then, that we might complicate the debate on the relevance of a humanities degree. An education providing little apparent value for the likes of thee and me appears to be an almost ubiquitous preparation for a career at the highest levels of public life.

Let’s unpack this a little further. Back in the nineteenth century, studying classics at Balliol College, Oxford, under the college’s Master, Benjamin Jowett, was de rigueur for a career as a senior administrator in British India. Young men were trained, Jowett claimed, ‘by cold baths, cricket, and the history of Greece and Rome.’[1] The British did not simply take their management of India from classical Greek and Roman precedent; they were the new Greeks and Romans. A classical education was not merely a useful preparation for colonial administration, it was central to justifying ‘the historical experience of overseas domination.’[2]

Generations of British school children have been taught British history as a discrete list of the actions of, mostly, white male elites, often described by gentlemen amateurs and retired statesmen who regularly wrote about the very policies they themselves designed and implemented. As Churchill noted, ‘history will be kind to me, for I intend to write it.’

The most influential historian of his generation, Thomas Babington Macaulay, wrote his seminal 1848 work almost entirely as a justification of the cultural, economic and political authority of the English middle classes. This was ‘the smug message of Macaulay’s History of England.’[3] Nor was he informing only his own generation. As he wrote to a friend, ‘I have tried to do something that will be remembered; I have had the year 2000, and even the year 3000, often in my mind.’[4] Macaulay would probably have been reasonably satisfied with his efforts. These are histories of a glorious nation that are, at best, incomplete and de-contextualised and, at worst, a carefully crafted narrative of British (English?) exceptionalism to justify and lionise tyrannical imperialism and global domination.

Is this a fair and balanced assessment? Of course not. For a start, Churchill never made such a statement, although he did say ‘for my part, I consider that it will be found much better by all Parties to leave the past to history, especially as I propose to write that history myself.’[5] A damning indictment? Well, that’s the problem with woke lefty historians: no sense of humour. Churchill was probably just joshing. A bit.

The more serious point is that, as David Ludden puts it, ‘the veracity of statements about reality is not at issue so much as their epistemological authority, their power to organize understandings of the world.’[6] More simply, the study of human affairs is not so much about what happened and when, although events and chronology are important, but how and why the past is and has been interpreted differently.

So, back to my central question. Why are the exponents of academic humanities, once respected for their knowledge and trusted to pass on their understanding of Britain’s and, more broadly, the ‘West’s’ contribution to concepts of ‘progress’ and ‘civilisation’, now castigated as ‘typical of the open-toed, sandal-wearing, beardy geography teachers at the heart of all the problems in modern society’?[7]

For two key reasons: on the one hand, because they present an existential challenge to many Britons’ understanding of themselves and their nation’s place in the world, past, present and future; on the other, because they potentially strike down a central appeal of politicians to the voting public, namely their right to rule based on defending that same understanding.

Most academics today contend that there is a strong prima facie case that the interpretation of the humanities for educational and public consumption has been selective, aimed at presenting a particular view of the past that tends towards a certain British exceptionalism and national superiority. In recent decades, scholars of the humanities have taken to analysing and deconstructing the comfortable and self-congratulatory picture of the past taught for more than a century.

Are they right? Can we have that discussion? Not a very sensible one when government ministers trivialise the issues with populist headlines such as ‘We will save Britain’s statues from the woke militants who want to censor our past’.

Let me close with an example of just how misleading such headlines can be: the 2020 removal of the statue of Edward Colston in Bristol, the paradigmatic example of ‘cancel culture’, imposed by woke militants intent on erasing our history.

The reality, I contend, is almost exactly the opposite of what has been popularly proposed. The argument has been made that Colston, an acknowledged beneficiary of colonial oppression and slave trading, may have profited from activities considered unacceptable today, but that when his statue was erected in 1895, such practices were less proscribed. To remove his statue is to impose a modern moral standard not subscribed to at the time, and is thus a distortion of history, the classic example of ‘cancel culture’.

In fact, the removal of Colston’s statue had little to do with historical debate and more to do with the frustrations of local protesters. Their legitimately gained democratic mandate, to have a plaque attached offering more context on Colston’s slave-trading activities, had been continually blocked. It is also demonstrably the case that slave trading was, largely, as unacceptable in 1895 as it is today.

That, however, is not my point. The proposal to erect Colston’s statue back in 1895 was a local political response to the growing protests of Bristolian workers objecting to poor rates of pay and working conditions. Political activists, influenced by the political tracts of Karl Marx, argued in public speeches that those workers were as much victims of their merchant masters as the slaves and the colonised who had been such a source of enrichment for the commercial elites of Bristol. These were arguments that gained some traction with the voting public of the city. Such concerns, and the potential for unrest common to many British cities, prompted a local businessman, James Arrowsmith, to try to raise a statue to Colston, a well-known Bristolian philanthropist, by public subscription. Arrowsmith’s strategy was to counter criticism of Bristol’s colonial merchants through a demonstration of public support for the civic benefits brought to the city by those engaged in colonial trade. In this, he largely failed. Although some public funds were raised, the statue was eventually erected mostly at his own cost.

The background, then, to the Colston statue is not one of ubiquitous popular support for a Bristolian merchant philanthropist, but a fascinating insight into nineteenth-century ‘open class warfare’ and public support for ‘the formation of a “labour party” to represent working people’. Which aspects of history are being erased by substituting a mature debate on this subject with trivialising accusations of woke cancel culture? In many ways, Arrowsmith’s nineteenth-century tactics are being replicated by twenty-first-century politicians.

The past will always be contested. If you are reading this, you are probably engaged in that process to some extent or other. Practitioners of academic humanities are, perhaps, not naturally suited to confrontation. But our voices are not just important; they are key. As Orwell pointed out, critically thinking about the past is an essential part of the present and, by extension, the future.

The MHR aims to be a part of that debate, through the voices of our contributors. We look forward to hearing from you.


Notes

[1] P. Woodruff, [= P. Mason], The Men who Ruled India: The Founders (London, 1953), p. 15.

[2] E. W. Said, Culture and Imperialism (New York, 1993), p. 114.

[3] F. Bédarida, A Social History of England, 1851-1975 (Paris, 1976), p. 49.

[4] T. Pinney, The Letters of Thomas Babington Macaulay in four volumes (Cambridge, 2008), p. 216.

[5] Speech in the House of Commons, Hansard, Volume 446 (23 January 1948), Column 557. https://hansard.parliament.uk/Commons/1948-01-23/debates/b9704861-e9ed-40d9-ab92-4477a26e25f5/CommonsChamber.

[6] D. Ludden, ‘Orientalist Empiricism: Transformations of Colonial Knowledge’, in C. A. Breckenridge & P. van de Veer (eds.), Orientalism and the Postcolonial Predicament: Perspectives on South Asia (Philadelphia, PA., 1993), p. 250.

[7] As said to me at the same dinner party described above. OK, it’s a great line!