The Uncertain State of Mortality: Reconsidering the Intricacies of Death in the Context of the First Human-to-Human Heart Transplant Surgery

Ashley Clark


This article covers the legacies of the first human-to-human heart transplant, performed by Christiaan Barnard and his team on 3rd December 1967 at the Groote Schuur Hospital in Cape Town, South Africa. The investigation emphasises the procedure's significance from a socio-medical stance: its success not only allowed medical professionals to perform a previously unthinkable surgery like a heart transplant, but also encouraged an innovative discipline which interprets the process of death in line with advances in medical technology and understanding. By examining a series of ideological standpoints which theorise the multi-faceted concepts of brain death, this scholarship highlights how the brain has replaced the heart as the centre-point of ascertaining death. In summary, this article gives agency to a topic which is largely unexplored but remains imperative to understanding how the concept of brain death has become a medically and socially accepted ideology; one that is fundamental to concluding whether an individual is dead or alive.

Keywords: Electroencephalogram, Human heart transplants, Brain death, Cape Town Symposium, Donor, Christiaan Barnard, The Harvard Committee, The President's Commission

Author Biography

Ashley Clark is a graduate of the University of Derby. He received a First Class Honours in History, which sparked a passion for medical history. He is currently completing a Post-Graduate Diploma in Secondary Education History at the University of Birmingham before beginning his teaching career as a Secondary School History teacher at Birmingham Ormiston Academy. His interest in this field stems from a desire to produce valuable work in a significant yet unexplored area of history.


The concept of death carries various interpretations shaped by cultural and spiritual connotations. The first successful human heart transplant, performed on recipient Louis Washkansky on December 3rd, 1967, remains one of the most ground-breaking medical procedures of the twentieth century. As Martin S. Pernick suggests, it led to fraught considerations about the meaning of death because of the possibilities now posed by successful heart transplant surgery; Pernick highlights how this surgical breakthrough was the latest stage in a long history of controversies about death and its interpretations.[1] The technological innovations in medicine that appeared in the post-Second World War period, particularly in the 1960s, together with the tremendous success of the first heart transplant in 1967, created a new discourse in the field of death; one that shifted the focus from the heart as the central system of the body to the brain as the organ of utmost importance when determining an individual's state. Whilst there are many different formulations of 'brain death', such as those explored in Ben Sarbey's article on death and its definitions, this latest development allowed contemporaries and historians to approach the matter of death from a completely different angle.[2] This new perspective is displayed by the medical ethicist Robert Veatch in his compelling summary that 'death is the irreversible loss of that which is essentially significant to the nature of man.'[3] Despite the many ambivalent stances on mortality, one undeniable factor is the emergence of medical technologies and their ability to reconstruct the traditional paradigm. Advances in medical science and equipment are responsible for brain death becoming a legally acknowledged alternative to the traditional cardiopulmonary definition,[4] which remained unchallenged until the success of the heart transplant in 1967.
This article will analyse the effects of the first human-to-human heart transplant and how the procedure acted as a driving force in the redefinition of death. It will draw on observations from contemporary surgeons and specialists in the cardiac field such as Dr Denton Cooley, Professor Christiaan Barnard, Dr Adrian Kantrowitz, and Dr Pierre Grondin, all of whom were present at the Cape Town Symposium in 1968 to discuss the legacies of, and their experiences with, human heart transplantation. Observations made by these professionals, amongst others, will support the argument that agreeing upon a valid and widely accepted meaning of death was both necessary and effective.[5] Addressing these features will reinforce the overall theme of the article: the significance of the heart transplant and its ability to encourage evolved socio-medical doctrines about death and its occurrence.

The Heart Transplant and the Foundations of Death in a Modern Context

The success of Professor Christiaan Barnard and his surgical team on December 3rd, 1967, remains one of the most ground-breaking innovations in medicine.[6] A striking feature of Barnard's success is not the technical skill required to achieve it but rather the location where the operation was conducted. Barnard had spent years observing pioneers in the surgical world. Witnessing the legal setbacks that were apparent in the US, Barnard remarked to a pump technician working there that 'you have too many prohibitions to negotiate before you can find a donor. We have no such obstacles in South Africa.'[7] Barnard knew that the agreement of only two doctors was required to declare death through irreversible brain damage, and he sensed an opportunity in South Africa: one that exploited the country's less restrictive legislation around death, since the regulations in the US were more established and more firmly enforced.[8]

Unorthodox requirements for completing a heart transplant, such as finding an appropriate number of willing donors and removing a beating heart from a patient in a persistent vegetative state who would still traditionally be labelled 'alive', created an ethical grey area. The realisation that donors were needed to make the procedure practicable generated uncertainty in both the public and professional spheres, mainly because giving an individual donor status is essentially synonymous with declaring death. Despite the proliferation of successful heart transplants after Christiaan Barnard's success in 1967, the field remained largely unexplored and left contemporary professionals and members of the public anxious about the long-term effects of heart replacement. To tackle these social and ethical issues, transplant surgeons advocated a revised definition of death compatible with the revolutionary surgical innovations of the period. Medical technologies such as ventilators ensured that key organs such as the heart and lungs could now be maintained artificially. Transplant surgeons believed that the traditional criteria of death were now invalid and that a definition centred on the brain was more appropriate for the evolving medical climate. Thus, the growth of human heart transplants became pivotal to debates on death and how it is defined.

The development of hospital equipment throughout the twentieth century was critical to the success of heart transplants. The ventilator gives patients with serious brain injuries a chance of recovery by supplying the body with enough oxygen for the heart to continue beating and circulating oxygenated blood.[9] In the context of heart transplants, this machine sparked mass debate, as death by cardiopulmonary criteria was now reversible; as David Rodriguez Arias has suggested, the ventilator's presence contributed to a readjusted approach to determining death.[10] With heart transplants now in the public domain, Martin Pernick highlights the emergence of two key concerns in both the public and professional spheres: the fear of being pronounced dead prematurely and therefore over-hastily designated an organ donor, and the fear of being kept alive too long as a 'vegetable' with severe, irreversible brain damage.[11] Transplant advocates used concerns like these to highlight the ambiguities of death. Because the heart transplant treated heart failure, the biggest killer of the period, Barnard justified the extraction of organs from brain-dead patients on life support by suggesting that surgeons were taking organs from people who existed in a no-man's-land between life and death.[12] The feasibility and availability of donors became a priority, as this would protect the longevity of heart transplants, and the ventilator proved as much an inadvertent obstacle as a benefit to transplantation: by sustaining the heart and allowing it to function normally, it created a difficult conundrum between holding out for the unlikely but possible recovery of a patient and extracting a perfectly healthy heart from someone in an irreversible state of brain death to help others with heart failure.

The heart transplant was an interesting development in surgical advances which provided the groundwork for future medical innovations. However, reflecting on unconventional yet innovatory procedures such as heart transplants, modern observers suggest that for an organ transplant to be considered ethical, policies on organ procurement should ignore neither the vital needs of the recipients nor the dignity and interests of the donors. This acknowledgement was also recognised before and immediately after the heart transplant's debut in 1967.[13] Concerned with the ethics of donor/recipient safeguarding and the intricacies of the procedure itself, contemporary physicians and transplant surgeons realised that they also had to dispel any perception the public might hold of surgeons as self-indulgent figures ready to ravenously snatch organs from helpless victims.[14] This perceived selfishness is highlighted in contemporary newspapers such as The Times, which implied that heart transplants had become a point of national rivalry, one that defied the Hippocratic approach[15] upon which Western medicine is built.[16] Furthermore, the heart transplant was regarded as an unworkable procedure because of its uncertain prospects. The operation still carried the label 'palliative'[17] and, because of the huge risks that shadowed it, controversies lingered over whether such operations should be proliferated and normalised.[18] A strong argument supporting the anti-heart-transplant position was that, despite the procedure's initial success, criticism began to surface when the public realised that transplant surgeons could not always control organ rejection.[19] This was a valid argument, as rejection was the biggest killer of heart transplant recipients.
However, Lord Morris, writing in The Times in 1969, believed that withdrawing from heart transplants because of the fear of rejection and other issues reflected a defeatist attitude, one that did not complement the revolutionary medical understanding stemming from Barnard and his team.[20] Morris believed that the success of Barnard's second heart transplant patient, Dr Blaiberg, who lived for eighteen months post-operation, served as a triumphant leap into a complex field, and that it would be disreputable to halt this approach indefinitely due to fears of the unknown and other prejudices. This contemporary proposal is supported by experts David Cooper and Denton Cooley,[21] who argue that, considering the unexplored area of immunosuppressive therapy and surgical teams' inexperience in diagnosing and treating tissue rejection, Barnard's second transplant served as a framework to support further exploration of heart transplantation and to preserve its legitimacy.[22]

Revolutionary advances in medicine in the late 1960s added pressure to create a new definition of death, one that reflected medical innovations and understandings. The traditional criteria of cardiopulmonary death were no longer compatible with situations created by medical equipment that could now reverse conventional means of death; as Hershenov has implied, individuals who would have been considered dead in another era [were] now sometimes 'returnable.'[23] The ground-breaking impact of Barnard's procedure, partnered with advances in medical technology, created a desire to protect the procedure's long-term prospects. As heart transplants were now recognised as a practical option for treating patients with heart failure, the need to redefine death was imperative to organ transplant advocates, since the heart and lungs could now be regulated artificially. Not only did this make the traditional concept of death obsolete, but it also created an urgency to reconsider existing medical terminology. Professor of Social Medicine David Rothman acknowledged that once surgeons had 'transplanted a beating heart and the feat was celebrated in the media – the need to redefine death was readily apparent.'[24] As the concept of death had now been thrown into question by transplant surgeons, waiting for heart stoppage was no longer surgically acceptable. Transplant surgeons believed that waiting for the cessation of the heart to pronounce death was now medically unethical, especially given the new technologies and understandings in the field.[25]

However, whilst heart transplant supporters made convincing suggestions about what constitutes death, obtaining an appropriate number of donors for recipients remained challenging. This is why the meaning of death had to be carefully reassessed: to gain support for transplant surgery and to increase the number of people willing to donate their organs should they enter a state from which they would be unable to recover. Desmond Smith argued that 'as the need for donors grow larger, the definition of death must be carefully redefined. When are you dead enough to be deprived of your heart?'[26] These socio-medical and ethical questions about heart transplants were the primary motivation behind the Symposium in 1968. At this event, leading surgeons from across the globe came together to discuss how their interpretations of this medical discipline could enhance progress and provide a new method of treating those with serious heart conditions, and to extinguish any critical declarations which might halt heart transplantation altogether.

Whilst heart transplants proliferated dramatically in the late 1960s, it was the success of Barnard's patients that proved the most ground-breaking. Barnard's first patient died eighteen days post-surgery, but his second patient lived for a staggering eighteen months post-operation; it is the latter case which is regarded as the cornerstone of securing the future of heart transplants.[27] It was this type of success which helped to re-conceptualise the idea of death, making the topic a sensitive delineation between defining death as the cessation of brain functions or of circulatory/respiratory functions.[28] However, whilst the surgery's success did encourage the idea that the practicality of heart transplants should be taken seriously, a popular reason to reject the operation was the lack of long-term solutions it offered. Opinions and feelings in the post-Barnard era are captured in examples such as John Roper's article, which argues that:

occasional dramatized success should not oblige the health service to try to meet the disproportionate demands of surgical enthusiasts for scarce medical, technical, and nursing resources implicit in a premature attempt to establish cardiac transplantation as a practicable form of treatment before the basic scientific problems concerned have been brought nearer to solution.[29]

The experiments in this field were unorthodox, and although contemporary professionals and non-professionals alike were concerned about the feasibility of the transplant and the self-interest of transplant surgeons, these types of claims inspired the Symposium of 1968. Arguably, to think rationally during this period would have been to discourage heart transplants temporarily because of the social and medical complexities they entailed. By contrast, transplant surgeons had already entered the realm of cardiac transplantation and, as Christiaan Barnard suggested, the 'first step' into this field had already been taken and it would, in his view, have been wrong for the patient, humanity, and the common purpose to turn back.[30]

The definition of death remained contested because medicine was developing at a rate that many non-professionals, and even professionals, could not fully comprehend. In the context of heart transplants, however, the 'dead-donor rule' played a central role in justifying the procedure in a legitimately ethical manner.[31] As certain biological functions could now be maintained in brain-dead patients, surgical advocates in this field thought it necessary to amend the meaning of death in light of the medical innovations now present.[32] This desire to reinvigorate the definition of death was acknowledged by professionals at the Symposium. Dr A. P. Rose-Innes declared that 'if we use the conventional point of death only, time is impossibly short for proper preparation for the surgery.' This acknowledgement is mirrored by Dr de Villiers,[33] who applied the revisited definition of death to the selection of donors by stating that 'it must be reasonable to state that only those patients with severe, irrecoverable brain damage who cannot maintain their respiratory function independently, should be regarded as suitable candidates for providing tissues for transplantation.'[34] These observations show the importance of a re-evaluated definition of death and its instrumentality in securing the future of heart transplantation. The change of criteria is fundamental to the procedure, as it ensures that a beating heart can be extracted from a brain-dead patient without controversy surrounding the state of the donor. According to Professor Adrian Kantrowitz,[35] 'experimentally, it can be shown if one takes a beating canine heart, that the chances of immediate success are much higher. If one waits until the heart has stopped, one can resuscitate such a heart, but it does not perform as well as the heart removed still beating.' He concludes this observation by stating that 'the point at which the potential donor becomes a donor is essentially the point of irreversible brain death, as diagnosed by the experts.'[36] Whilst the steps taken to perform a heart transplant were widely considered unethical and unorthodox during this period, declarations like Kantrowitz's imply that, given the wide range of medical innovations and capabilities present by the end of the 1960s, it was now inhumane and unethical not to change the traditional criteria, since these conventional methods were outdated. A similar point was made by contemporary surgeon Professor Norman Shumway, who believed that the heart must be in perfect condition before it is extracted from the donor and placed into the recipient.[37] Artificial interventions such as the ventilator would help preserve the heart before extraction for transplantation, ensuring that the organ is not weakened by any deprivation of its functions. The ventilator salvaged life while also identifying who had now become unsalvageable by medicine.[38]

Cardiac transplantation was now gaining a profile that helped the procedure become 'normalised' within everyday medical practice and discipline. The framing of death as a process rather than an event was a popular initiative that helped professionals reconsider their ideological stance. As emphasised by Professor Kantrowitz, 'death is a process just as life is, and there are organs and tissues which die very quickly. If you deprive the brain of oxygenated blood for 3, 4 or 5 minutes, it is irreversibly dead.'[39] Presenting death as a phenomenon that develops over time reflects an idea stressed by philosopher Baruch A. Brody. Brody emphasises that the search for a single definition/criterion made sense when these points were always close in time to each other, because medicine could not protect some functions once others had stopped. This was no longer the case, however, as medicine could now maintain key bodily functions for longer periods.[40] The socio-medical attitudes towards death in the 1960s and 1970s reflect a 'renaissance-like' development in the medical discipline. Similar ideologies can be witnessed as early as 1915 in the account of a surgeon from Chicago who justified his euthanasia practices through a brain-based concept of life, asserting that 'we live through our brains… Those who have no brains, their blank and awful existence cannot be called life.'[41] This example portrays an early implementation of brain-based conceptions of death, and suggests a framework upon which surgeons, physicians, and philosophers of the following decades built their ideology.

The evolving theories of what constitutes death were evident in the 'President's Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioural Research' report on a 'total brain' standardised definition of death, published in the early 1980s. The report noted that 'the traditional "vital signs" of heartbeat and respiration were merely surrogate signs with no significance in themselves. On this view, the heart and lungs are not important as basic prerequisites to continued life but rather because the irreversible cessation of their functions shows that the brain had ceased functioning.'[42] This acknowledgement shows how the traditional approaches to death were no longer applicable, and how innovative medical machinery accelerated the adoption of a modernised standard that reflected significant breakthroughs in surgery.

The heart transplant and its public impact on medicine were single-handedly responsible for an immediate desire to reconsider ideas about death. A surgery that required unorthodox steps to treat terminally ill patients with heart disease created the foundations for evaluating death in the context of contemporary medical advances.

The Concept of Brain Death and its Fundamental Ambiguity

The concept of death is a field of study with many grey areas, meaning that a concrete definition has not yet been achieved. Death’s ambiguity creates subjective interpretations amongst physicians, transplant surgeons, scholars, and the public. However, scholars tend to analyse the concept of death by attempting to agree on its definition, and then formulate a measurable, appropriate criterion to show that the definition has been fulfilled, resulting in a series of tests to prove that the criterion for death has been reached.[43]

Before attempting to define death, it must first be acknowledged that a definition and a criterion are two separate entities. A definition sets out the conditions that must be met for a thing or event to answer a certain description, and these conditions must be precise before one can decide which criterion best reflects that they are being achieved.[44] In the pursuit of applying brain death as a suitable way of defining death, Robert Veatch implies that death should be understood in the relevant context when defining brain death. He suggests that the question of using the brain-death criterion as a valid measure of death seeks to specify the signs that indicate the relevant definition of death has been met.[45] Adopting this acknowledgement is important, as it is a useful method when attempting to resolve a difficult subject like defining death. It does not offer an impervious definition, but it does offer a method that will provide a meaning of death responsive to evolving medical developments. Whilst no definition is concrete, it is widely acknowledged that when we use a definition of death referenced by a criterion, this is an a priori matter for which empirical evidence is not directly needed or exactly relevant when determining one's demise.[46]

In August 1968, the 'Ad Hoc Committee of Harvard Medical School to Examine the Definition of Brain Death', led by Henry Beecher, produced a report in the Journal of the American Medical Association under the title 'A Definition of Irreversible Coma'. This publication put forth a criterion of a permanently non-functioning brain, for the purpose of establishing 'irreversible coma' as a precise definition of death.[47] It was generally agreed that, as heart transplants increased their social presence, it was essential to the longevity of the operation that the medical community be given some agency in revisiting death's definition.[48] Whilst the concept of brain death was being integrated into conscious thought amongst professionals and non-professionals, and eventually into legislation, this only created more ambiguity around brain-orientated definitions. Although the original focus was on 'total' brain death, referring to the death of the entire brain, now that neurology served as the centre-point of determining death, questions were raised about which areas of the brain were important and whether an individual is only dead when the last remaining function in the brain ceases. This opened a different avenue of brain-orientated death, one relating to higher brain functions.
As the cerebrum is the area of the brain that performs higher-order functions such as interpreting touch, vision, hearing, speech, reasoning, emotions, learning, and the control of movement, the cessation of this area was referred to as the determinant of death under a higher standard of brain death.[49] As Robert Veatch explained, 'higher brain death' holds that the key functions of the brain, such as memory, consciousness, and personality, are what make us a person, and since those functions stem from the cerebral areas of the brain, it is the cessation of these portions of the brain that should be considered equal to the death of a person.[50] These important features of human function were echoed in 1971 when the neurologist J. B. Brierley urged that brain death should be associated solely with the permanent cessation of higher functions rather than the complete loss of all brain functions. He stated that death should relate to the cessation of 'those functions of the nervous system that demarcate man from the lower primates.'[51] These innovations within the brain-oriented framework did not make the matter of defining death any clearer than it had been under the cardiopulmonary standard. Nevertheless, they are important developments in the field of death, as new theories about human cessation were evolving simultaneously with the expansion of medical capabilities and understanding.

The Committee was very influential during this period and reiterated many contemporary beliefs of those who supported brain death as a discipline. It was an important presence at this time because it gave legitimacy to the abstract concept of brain death, which before Barnard's success would have been unimaginable. Henry Beecher favoured brain-based criteria, but his priorities were not set on resolving the meaning of death. Rather, he advocated a neurological stance to solve the practical problems he attributed to new technologies, particularly organ transplantation, heart-lung machines, and ventilators.[52] He hoped that the criteria laid out by the Harvard Committee would not only increase the number of donors but also defend the entire medical profession against public prosecution and accusations that would label medical professionals 'organ thieves' or 'killers.' Although many contemporary ideas about securing the future of heart transplants and obtaining large numbers of donors were regarded as unethical, Beecher was trying to evolve the 'normal' boundaries of death in parallel with accelerating medical change.
It can be suggested that, as time and investment were placed into the heart transplant and the concept of brain death, the principles and procedures that encapsulate these disciplines could eventually come to be considered ethical and morally acceptable.[53] William Curran, a member of the Harvard Committee, echoed some of Beecher's ideas, explaining that if the whole-brain criteria became the standard of medical practice, then the law would protect physicians who followed them from malpractice charges, particularly in the context of organ donation.[54] The Committee's commitment to total-brain-death criteria is explored by historian Karen Gervais, who suggests that Beecher [and his team] may have held this belief because no higher-brain criteria had been suggested at this point, and not simply because he held a whole-brain definition of life.[55] Whilst brain death provided a popular alternative to the traditional definition of death as the stoppage of the cardiopulmonary system, the concept, which had filtered into contemporary professional discussion, would continue to evolve as further attention focused on understanding the cessation of particular brain functions and their overall importance in the process of death.

Before theories of brain death began to dominate both public and professional discourse, it was socially and legally accepted that death consisted of a 'total stoppage of the circulation of the blood, and a cessation of the animal and vital functions consequent thereupon, such as respiration, pulsation, etc.'[56] The brain-based theories that followed all sought to establish which areas of the brain are necessary for determining irreversibility. However, the term 'irreversibility', used by the Harvard Committee in 1968 to identify the death of the brain, is perceived as problematic by neurologist James L. Bernat. To reconcile medical theory and practice with the law, he believed that irreversibility, meaning a function that has stopped and cannot be restarted, should be replaced with permanence, because permanence, in the relevant context of brain death, specifies that a function has stopped, will not restart on its own, and that no intervention will be undertaken to restart it.[57] This replacement has been criticised by theorists such as R. D. Truog and F. G. Miller, who suggest that substituting the permanence standard for irreversibility is just 'gerrymandering the definition of death.'[58] Even if these criticisms hold to a degree, the example shows just how intricate the study of neurology is in the context of death, and how unexplored and subjective the field of death remains, despite multiple attempts by different theorists to provide a definitive measure of mortality.

Although the Harvard Committee had put forth the idea of whole-brain criteria for defining death consisting of the permanent loss of all brain functions from consciousness to primitive brainstem reflexes[59], the President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioural Research presented the ‘total brain’ standard to incorporate this theory into state legislation. This is because it addressed the advances in medicine and technology which could now perform necessary bodily functions. Although the President’s Commission did not have the legal authority to change legislation and purely acted as an advisory group to the President on bioethical issues, the presence of this organisation in 1981 shows how the policy of brain death had infiltrated its way into legal matters; originating from the period which witnessed Barnard’s first heart transplant in 1967, and the Harvard Committee’s report in 1968.[60] However, another theory of brain death that became popular during this attempt to inaugurate the total-brain standard by the President’s Commission is the ‘higher-brain standard’, and while competing with the total standard, the death of the entire brain is sufficient enough for labelling brain death but not entirely necessary under the policies of this new discipline.[61] While total brain death may not be entirely necessary for death, it is applicable in the sense that all cases of total brain death are by default cases of death and in practical purposes, cases of total brain death can be referred to as cases of death by anyone holding a brain-based standard of mortality. 
This is because every case of total brain death would automatically be a case of higher-brain death, meaning that both standards have been met.[62] This may be one of the many reasons why the total brain standard, rather than one based on the cessation of higher functions, was legally accepted: the total standard offered a more politically pragmatic way of separating the states of being alive and dead. The case for total brain death was strengthened by the President’s Commission, which sought to refute higher-brain-oriented definitions of death. The Commission argued that definitions focused on the higher-brain functions were too radical, too subjective, and impossible to carry out in the current climate of medical development, as no operational criteria existed to support them.[63]

In response to these proposed definitions of brain death, arguments have arisen that use scientific findings and logic either to support or to reject brain death theories. Two popular theories that attempt to explain brain death are the Thermodynamics Theory and the Holistic Integration Argument. The first is employed by Jules Korein, who argues that when the brain is destroyed, the organism’s critical system is destroyed with it; with the destruction of this system, spontaneous fluctuations will irreversibly cause the organism to deteriorate and systemic entropy to increase until no function can operate. He further holds that it is the loss of certain brain functions at the beginning of this process that destroys the critical functions of the organism.[64] A similar theory supporting brain death is the Holistic Integration Theory, which declares that in an individual with complete, irreversible, and irrecoverable loss of brain functions, the organism has lost its central control organ, the centre from which all the different parts and functions of the body are monitored and controlled. However, Dieter Birnbacher has analysed this argument and suggests that if it is understood to mean that the brain is strictly responsible for the control of every bodily function, then it is inaccurate, given the capacity of bodily functions to remain intact with artificial support, such as a ventilator, for up to several months.
However, if the theory is interpreted in the spiritual sense that someone deprived of vital brain functions is no longer a ‘complete’ individual, then it adds substance to medical debates, for it is true that a human without key functions of the brain can hardly be considered complete in the context of the ordinary and necessary elements of life.[65] This interpretation is compatible with both the total and the higher standard of brain death, as both imply that the brain is what matters in the determination of death, whilst attempting to delineate the areas of the brain responsible for functions considered both natural and essential to one’s existence. The acknowledgement is consistent with the conclusions of specialists in this field such as Charles Culver, Bernard Gert, and James Bernat, who all conclude that whilst removing life support from someone in a persistent vegetative state (PVS) is controversial, the answer should not be reached by deliberately ignoring the important distinction between death and loss of personhood.[66] Whilst this position acknowledges the distinction between death and loss of personhood, it does not dismiss the loss of personhood as irrelevant; although personhood is a psychological and spiritual concept that can only die in a metaphorical sense, it is this realisation that gives weight to the idea of a person irreversibly losing their individual identity. This observation serves as another paradigm reflecting the ambiguity of death: despite scholarly attempts to reach some form of clarity and coherent conclusion, it ultimately complicates matters further, emphasising that the realities of death rely on subjectivity and interpretation rather than objective information.

Deciding on an appropriate definition of death is difficult, and the available tests for these brain-based theories share the same problems. As the field of neurology is so vast and largely unexplored, selecting an adequate test that best reflects the proposed theories of brain-oriented death remains controversial, with professionals all struggling to create a perfect solution. The Quality Standards Subcommittee of the American Academy of Neurology suggested a criterion compatible with the evolution of medical capabilities for identifying the occurrence of brain death. It stressed that if a patient was not being maintained by a ventilator, then the individual’s death should be identified by the prolonged loss of circulatory and respiratory functions; if, however, the patient was maintained on a ventilator, then it would be appropriate to measure the prolonged cessation of the measurable clinical functions of the whole brain.[67] These adaptations recall the views Barnard expressed at the Symposium on the extraction of beating hearts. He asked, in effect: if a respirator is removed from a patient to see whether they can breathe independently, as part of the tests to determine death based on the inability to breathe autonomously, is it then necessary to wait for the beating heart to stop before it can be extracted?[68] Drawing parallels between Barnard’s observation and the methods set out by the Subcommittee, we can see a continuity of medical discipline, first proposed by Barnard in 1968 and restated by the Subcommittee in 1995.
This is apparent in the realisation that the use of artificial machinery has become obligatory for someone in a critical state; because of this, the criteria for brain death have become the closest available measure of someone’s mortal state, in a scenario where irreversible cessation has occurred and recovery is not an option due to the irreparable damage to the brain and the inability of artificial means to reverse it.

Although specialists debate which definition best reflects the state of death, most medical experts suggest that the diagnosis of brain death should be confirmed by tests showing that none of the vital centres in the brain stem is still functioning, with the tests repeated to enhance their validity. Most also imply that an Electroencephalogram (EEG) is not required: although it is used to monitor brain activity, the machine can register even the minutest trace of activity, which could interfere with the determination of death under brain death criteria. Brain-oriented advocates, whether of the higher, total, or brainstem standards, would therefore not want an EEG present, as it could confound the results.[69] Despite these complications in testing for brain death, Christopher Pallis has observed that bedside tests for brain death performed by physicians rarely assess the functions of large portions of the brain such as the occipital lobes, basal ganglia, and thalamus. He notes that the determination of whole brain death focuses on assessing brainstem functions, presumably because they are much easier to test for than ‘higher brain’ functions, whose exact location within the brain, and whose complete cessation, are harder to establish.[70] Veatch, meanwhile, implies that while a criterion resting solely on higher-brain notions of death is unlikely in the immediate future, the ‘old-fashioned’ total brain standard is becoming less and less popular as more people begin to realise that not all brain functions must be permanently lost to classify an individual as dead.[71] The desire to reach a completely coherent conclusion about brain death is a natural by-product of ever-evolving ideas and innovations. Its primary aim is to win the acceptance of the public, which plays a key part in the continuation and preservation of both brain death legislation and heart transplants.
These issues were reported contemporaneously at the Symposium by professionals such as Dr Pierre Grondin,[72] who stated that ‘even though it is very difficult to define the point of no return, or to find some definition of brain death, I think it is important because the public is worried about one thing and that is that we are going to use donors who could be restored.’ The approach taken here shares many similarities with theorists such as Veatch who seek to preserve their ideology in practice. Grondin echoed much contemporary understanding about the preservation of heart transplants and the importance of brain death by concluding that experts in the field must attempt to define death to the best of their abilities, as this is essential to the preservation of human heart transplants; otherwise, the public would always remain critical of the procedure and of those performing it.[73] Tests must be carried out with clarity and authenticity, because if a test fails to present any of the key features consistent with brain death, then this standard cannot be diagnosed and certified, the correct criteria having not been met.[74]

For as long as humans have questioned the concept of death, the heart has acted as the spiritual and biological headquarters of the individual. However, with the proliferation of surgical developments and advanced medical knowledge, the focus on the heart has given way to a growing concern with the brain and its functions. Innovation in surgical equipment and the use of artificial machinery showed that the heart, once thought to possess almost mystical qualities, could now be maintained through artificial intervention. Leading professionals such as the Boston neurologist Robert Schwab challenged the traditional narrative of the heart, suggesting that ‘the human spirit is the product of a man’s brain, not his heart.’[75] Whilst many professionals and scholars in the field regard this transition as dramatic, Alex Capron, the executive director of the President’s Commission, suggested that a move from a definition based on cardiopulmonary cessation to a brain-based standard was not radical at all; rather, it reflected the recognition and acceptance of new diagnostic measures and equipment that had not been available before and were no longer compatible with traditional criteria.[76] According to Capron and the Commission’s ‘Defining Death’ report, the evolving disciplines and attitudes toward brain death are merely a surrogate reflecting the advanced capabilities of both surgeons and medical equipment: they do not mark a drastic departure from the traditional narrative of death but instead offer an innovative alternative approach.[77]

However, whilst these modern ideas and developments have led many to question what constitutes death in a medical context, they have also created an inquisitive field in which professionals and non-professionals alike have asked what makes us individuals and which components are imperative to natural and ordinary life. These acknowledgements favour the importance of spiritual components, though without disregarding the biological structure and significance of the individual.[78] The two main theories reflecting these ideas about the spiritual significance of the individual are the Identity Theory and the Functionalist Theory. Firstly, the Identity Theory suggests that ‘states and processes of the mind are identical to states and processes of the brain’.[79] If we take this claim and apply Jeff McMahan’s idea that ‘each of us is essentially a mind’,[80] and our minds are indistinguishable from our brains, then the death of the brain results in the death of the mind, and therefore the death of the whole person. This theory has been used to support brain-oriented standards such as the total and higher disciplines of brain death, since the cessation of the cerebral hemispheres, the parts believed to constitute the mind and its function, would be sufficient for the death of the individual.[81]

Secondly, the Functionalist Theory shifts away from the strict identity of mind and brain and holds that the mind has functions, and that it is these functions which matter, rather than the particular means by which they are carried out. On the Functionalist theory of the mind, ‘what makes something a thought, desire, pain (or any other type of mental state) depends not on its internal constitution, but solely on its function’, and it is these functions that control an individual’s key features.[82] This theory likewise supports the higher-oriented definitions of brain death, as it suggests that only the parts of the brain which sustain the mind are important when determining an individual’s mortal state, so that total brain death is not necessarily required. These two theories of identity prove appealing because they reflect development within a modernised concept such as brain death. They mirror the higher brain standard’s emphasis on the components of an individual that most people would care about, such as memory, consciousness, and personality, seemingly the key elements of humanity and of living a fulfilling life.[83] However, these theories that personhood and personal identity play a key role in the determination of death have been critiqued by scholars such as Veatch, who suggests it is wrong to claim that the higher-brain criteria are based solely on theories of either personal identity or personhood. He acknowledged the possibility that there are living human beings who do not satisfy the various features of personhood and who lack the ability to perform the ‘vital’ functions considered ‘essential’ to one’s existence, in order to highlight the invalidity of prioritising these theories of identity when determining death.
He concluded that if the law is concerned only with whether an individual is alive or dead, then personhood is irrelevant.[84]

Contrastingly, contemporary giants in the field of brain death such as Henry Beecher stated, on the topic of personal identity, that the key functions of an individual are ‘the individual’s personality, his conscious life, his uniqueness, his capacity for remembering, judging, reasoning, acting, enjoying, worrying, and so on.’[85] On Veatch’s argument, we cannot be certain that all these functions originate in either the cerebrum or the cerebral cortex; however, if we apply ideas similar to Beecher’s in the context of heart transplants, the use of a higher-brain criterion for death would result in a proliferation of organs available for donation, because a less confining standard would now legally define death.[86] The key emphasis of the identity theories is that the mind is what gives the bodily functions regulated by the brain their purpose, meaning that spontaneous outbreaks of physical movement in a brain-dead person are irrelevant. This idea could reflect a developing attitude toward higher brain death: while contemporary medical equipment cannot entirely support these standards, it does help to dispel any idea of ‘hope’ for an individual who is irrecoverable, leaving little room for controversy to linger.

As medical capacity grows, the number of theories claiming something ‘new’ or ‘innovative’ increases. Here we can see the rise in popularity of the higher brain standards, whose supporters debate internally which definition best suits the discipline. Some claim that the essence of an individual is the integration of both body and mind, making these two factors the determiners of life and death. Other advocates hold a stricter version, believing that only the mind is important.[87]

Countries across the globe have introduced amended legislation on brain death that reflects their cultural and legal traditions. These differing notions all try to balance continuously developing medical understandings, the consideration of a patient’s loved ones, and the patient themselves. As defining death is a grey area, Veatch has implied that ‘when there is a doubt about which of the definitions to adopt, we should take the safer policy course, especially in matters that are literally life and death.’[88] The safer option is the one we can see being implemented by state officials ever since the emergence of human-to-human heart transplants. Also, while the concept of brain death was initially considered unorthodox in theory and practice, this soon changed when the efficiency and survival rates of heart transplants improved and rejection was no longer an insuperable obstacle. The heart transplant became regarded as a suitable and ethical procedure, one which could effectively fight heart disease. A change in national legislation on the definition of death was therefore required to match the ongoing enhancement of medical proficiency. In the early 1980s, the US government, drawing on research conducted by the President’s Commission, passed the ‘Uniform Determination of Death Act’ (UDDA), under which an individual was dead if they fulfilled one of two criteria: sustained irreversible cessation of circulatory and respiratory functions, or irreversible cessation of all functions of the entire brain, including the brain stem.[89]

However, such legislation does not apply uniformly across all states of the USA. For example, New Jersey’s Declaration of Death Act (1991) states:

the death of an individual shall not be declared upon the basis of neurological criteria … when the licensed physician authorised to declare death, has reason to believe, on the basis of information in the individual’s available medical records, or information provided by a member of the individual’s family or any other person knowledgeable about the individual’s personal religious beliefs that such a declaration would violate the personal religious beliefs of the individual. In these cases, death shall be declared, and the time of death fixed, solely upon the basis of cardio-respiratory criteria.[90]

These religious and cultural exemptions reflect the position stated in Bagheri’s article that nobody should be declared dead on the basis of any brain death standard (often total brain death) if that patient, while competent, has asked to be pronounced dead on the conventional cardiopulmonary standard.[91] The New Jersey Declaration therefore serves as a clear example of the issues that arise when we define death: although countries have attempted to introduce legal guidelines that firmly state the difference between life and death, subjective belief must be accommodated when applying these legal requirements. New Jersey, in the eyes of the law, has now created a situation where ‘there can be living people with dead brains.’[92] Similarly, the Japanese government passed the Japanese Organ Law of 1997, under which individuals can choose, for the pronouncement of their own death, either the definition based on cardio-respiratory cessation or that based on the loss of entire brain function; an enactment that has been labelled ‘pluralism on brain death definition.’[93] As Kimura suggests that death is not an individual event but a family event in Japanese culture, the law gives the family power to confirm or reject the choice made by the patient under certain circumstances, particularly in the context of organ donation.[94]

The act passed in Japan suggests that any transplant-related legislation which ultimately results in the death of the donor should include the opinion of the family in decision-making. These considerations mirror the legislation passed in New Jersey, but in a nation-specific manner that accommodates Japanese culture and tradition. Developments in understanding are reflected elsewhere too, where countries have used their own ideologies and perceptions to formulate their governing laws. In countries such as Canada, the UK, and Switzerland, death is determined under a single brain-based criterion. In Switzerland, the Swiss Federal Act of 2004 states that ‘a person is dead when all cerebral functions, including the brain stem, have irreversibly ceased.’[95]

These legislative procedures reflect differing perspectives around the globe on the controversial matter of establishing death. Bagheri has suggested that everyone should be permitted a conscience clause, stating clearly which definition they would like applied in the event of irreversible, ongoing brain damage; yet this offers only a temporary solution within the current climate and understanding of death.[96] It must be acknowledged that, with ever-evolving medical understanding, surgical disciplines and ideological doctrines must remain open to change as curative breakthroughs occur. Thus, whilst Barnard’s surgical masterpiece did influence the direction of global attitudes toward death, it does not follow that brain-oriented definitions of death will serve as permanent solutions. Legislation must continue to find the middle ground between applying innovative ideas about death and acknowledging the considerations and emotions of a permanently comatose patient’s loved ones, for prioritising one over the other can lead to great social and ethical problems.


Barnard’s first successful heart transplant in December 1967 prompted a shift in medical discourse, one that stressed the importance of re-defining death. As the conventional definitions of death no longer reflected advances in surgical equipment and understanding, a revised definition was required that would prove compatible with medical developments. Since heart transplants offered a remedy for those with heart disease, the preservation of the procedure required a re-evaluated definition of death: extracting a heart under conventional understandings of death would mean that the surgeons performing the procedure were essentially ending one life to preserve another. So, to protect surgeons from legal prosecution and to gain acceptance of heart transplantation as a normal ethical event, a reassessed definition of death focused on the cessation of brain functions, rather than on the traditional cardiopulmonary system, would ensure the longevity of the procedure and of the transplant surgeons’ careers. However, whilst this new way of determining death better reflected medical advances and capabilities, it did not mean that an uncontroversial, objective answer had been reached. As the brain has become the central focus in the determination of death, different theories have surfaced about which areas of the brain are vital to human existence, making the matter of death just as complex as it was before brain death notions were accepted as legitimate disciplines.

Nevertheless, because of heart transplant surgeries, brain death has become the most widely and legally accepted definition of death. Whilst this notion has been recognised as the most suitable way of ensuring the preservation of this procedure, the concept of brain death continues to be shaped by different perspectives and interpretations. Ultimately, these neurological viewpoints served as the perfect replacement for the conventional cardiopulmonary definition; but as theories and advances in surgical equipment evolve, this innovative idea of brain death may continue to be contested by other measures of determining death. In the meantime, however, the first heart transplant procedure and the Symposium in 1968 served as pivotal moments in the seemingly never-ending objective of defining death. Together, they established the need for a revised definition to ensure that the interests of the donor, the donor’s family, and the recipient were given due regard and protection; simultaneously, they also provided the necessary legal protection for transplant surgeons whose reputations depended on the procedure’s success.



Primary Sources

Edited Transcript

Shapiro, H. A. (Ed.) Experience with Human Heart Transplantation: Proceedings of the Cape Town Symposium 13-16 July 1968 (Durban: Butterworth & Co. 1969)

Newspapers and Journals

Atkins, H., ‘Problems in Transplant Surgery’, The Times, Issue: 57281, 19th June 1968

Barnard, C., ‘Patients Moral Courage’ in ‘Surgeon explains first heart transplant’, The Daily Telegraph, Issue: 35031, 11th December 1967

Barnard, C. N., ‘Surgeon Explains First Heart Transplant’, The Daily Telegraph, Issue: 35031, 11th December 1967

By Our Health Services Correspondent, ‘Doctors back brain death concept’, The Daily Telegraph, Issue: 37240, 15th February 1975

Daily Telegraph Reporter, ‘Surgeon attacks “prestige heart transplants”’, The Daily Telegraph, Issue: 35746, 3rd April 1970

Faulkner, A., ‘Brain Death Ruling in U.S.’, The Daily Telegraph, Issue: 35233, 6th August 1968

Hughes, B., ‘Defining death’, The Times, Issue: 60251, 8th March 1978

Kennedy, I., ‘A legal definition of death’, The Times, Issue: 60137, 18th October 1977

Loshak, D., ‘Brain Death is key’, The Daily Telegraph, Issue: 37228, 1st February 1975

Morris, L., ‘Heart Transplants’, The Times, Issue: 57643, 20th August 1969

Mahoney, John, ‘Transplants: the definition of death’, The Times, Issue: 59310, 3rd February 1975

Our Cape Town Correspondent, ‘Barnard’s plans’, The Daily Telegraph, Issue: 35054, 9th January 1968

Our Medical Correspondent, ‘Blaiberg’s death a signal for surgeons to take stock’, The Times, Issue: 57641, 18th August 1969

Our Medical Correspondent, ‘British heart transplant may be too early’, The Times, Issue: 57243, 4th May 1968

Our Medical Correspondent, ‘Cardiology: Transplants assessed’, The Times, Issue: 59861, 15th November 1976

Our Medical Correspondent, ‘Doctors differ on the point of death’, The Times, Issue: 57922, 20th July 1970

Our Medical Correspondent, ‘Heart transplants back in favour, surgeon says’, The Times, Issue: 60722, 12th September 1980

Our Own Correspondent, ‘Doctors’ new definition of death’, The Times, Issue: 57322, 6th August 1968

Our Own Correspondent, ‘Heart Surgeon Hits Back’, The Times, Issue: 57149, 15th January 1968

Our Own Correspondent, ‘Moral difficulties of defining death’, The Times, Issue: 57122, 12th December 1967

Our Own Correspondent, ‘Patient Will Die’, The Daily Telegraph, Issue: 35055, 10th January 1968

Pallis, C., ‘The Criteria for diagnosing death’, The Times, Issue: 59568, 3rd December 1975

Playfair, G., ‘Transplant Rights and Wrongs’, The Sunday Telegraph, Issue: 396, 15th September 1968

Prince, J., ‘Call for brake on heart transplants’, The Daily Telegraph, Issue: 35199, 27th June 1968

Roper, J., ‘Heart surgeons will take stock’, The Times, Issue: 57292, 2nd July 1968

Roper, J., ‘Heart transplants “need more research”’, The Times, Issue: 57455, 10th January 1969

Roper, J., ‘I give extra life Barnard says’, The Times, Issue: 57687, 10th October 1969

Roper, J., ‘Individuals’ rights in transplant surgery’, The Times, Issue: 57352, 10th September 1968

Roper, J., ‘Transplants “should be allowed while heart still beating”’, The Times, Issue: 59308, 31st January 1975

Roy, A., ‘Death of the brain marks end of life, doctors agree’, The Daily Telegraph, Issue: 38457, 26th January 1979

Calne, R. Y. & Skegg, P. D. G., ‘Transplants: a register of objectors’, The Times, Issue: 59313, 6th February 1975

Science Report, ‘Cardiology: Survival after transplants’, The Times, Issue: 59057, 5th April 1974

Smith, T., ‘Whose hand on the life switch?’, The Times, Issue: 59860, 13th November 1976

Taylor, F., ‘Concern for “New Heart” patient’, The Daily Telegraph, Issue: 35038, 19th December 1967

Unknown, ‘Doctor criticises heart transplant “vultures”’, The Times, Issue: 57353, 11th September 1968

Unknown, ‘Indian Heart transplant patient dies’, Aberdeen Evening Express, 20th February 1968, p.3

Warman, C., ‘Heart transplants must go on surgeon says’, The Times, Issue: 57969, 12th September 1970

Welbourn, R. B. ‘Heart Transplants’, The Times, Issue: 57170, 8th February 1968

Wolff, M., ‘Did the Surgeon go too far?’, The Sunday Telegraph, Issue: 358, 24th December 1967

Official Government Papers

‘Defining Death: A Report on the Medical, Legal and Ethical Issues in the Determination of Death’, Published by President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioural Research (1981)

‘Federal Act on the Transplantation of Organs, Tissues and Cells’, The Federal Assembly of the Swiss Confederation (2004)

Secondary Sources


Belkin, G., Death Before Dying: History, Medicine, and Brain Death (Oxford University Press 2014)

Cooper, D. K. C., Christiaan Barnard: The Surgeon Who Dared (UK & US: Fonthill Media 2017)

Jonsen, A. R., A Short History of Medical Ethics (Oxford: Oxford University Press 2000)

Kerrigan, M., The History of Death (London: Amber Books Ltd 2017)

Lock, M., Twice Dead: Organ Transplants and the Reinvention of Death (London: University of California Press 2002)

Edited Books

Lena Hansen, S., Schicktanz, S. (Eds.), Ethical Challenges of Organ Transplantation: Current Debates and International Perspectives (Bielefeld: Transcript, Bioethics/Medical Ethics 2021)

Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999)

Online Resources

Mayo Clinic Staff, ‘Heart Transplant’, Available online. Accessed on: 14/07/22

Practo, ‘The Hippocratic Oath: The Original and Revised Version’, Available online. Accessed on: 08/11/22

Journal Articles and Book Chapters


Alivizatos, P. A., ‘Fiftieth anniversary of the heart transplant: The progress of American medical research, the ethical dilemmas, and Christiaan Barnard’, Baylor University Medical Center, (4) October 2017.

Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation after Circulatory Determination of Death’, in Lena Hansen, S., Schicktanz, S. (Eds.), Ethical Challenges of Organ Transplantation: Current Debates and International Perspectives (Bielefeld: Transcript, Bioethics/Medical Ethics 2021)

Bagheri, A., ‘Individual choice in the definition of death’, J Med Ethics, 33(3) (2007)

Bernat, J. L., ‘Point: Are Donors After Circulatory Death Really Dead, and Does It Matter? Yes and Yes’, Chest Journal, vol.138, Issue 1. (2010) p.13-16

Bernat, J. L., ‘Refinements in the Definition of and Criterion of Death’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999) p.83-93

Birnbacher, D., ‘Determining Brain Death: Controversies and Pragmatic Solutions’, in Lena Hansen, S., Schicktanz, S., (Eds.)  Ethical Challenges of Organ Transplantation: Current Debates and International Perspectives (Bielefeld: Transcript, Bioethics/Medical Ethics 2021)

Brody, B. A., ‘How much of the Brain must be dead?’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999)

Cooley, D. A., Frazier, O. H., ‘The Past 50 years of cardiovascular surgery’, Circulation, Vol.102, No.4 (2000) p.87-93

Cooper, D. K. C., Cooley, D. A., ‘Christiaan Neethling Barnard, 1922-2001’, Circulation, vol.104, No.23. (2001) p.2756-2757

Goila, A. K., Pawar, M. P., ‘The Diagnosis of brain death’, Indian J Crit Care Med, 13(1) (2009)

Hoffenberg, R., ‘Christiaan Barnard: His First Transplants and Their Impact On The Concepts Of Death’, British Medical Journal, vol.323, No.7327 (2001) p.1478-1480

Smart, J. J. C., ‘The Mind/Brain Identity Theory’, Stanford Encyclopedia of Philosophy (2022)

Jensen, A. M. B., Hoeyer, K., ‘Making Sense of Donation: Altruism, Duty, and Incentives’, in Hansen, S. L., Schicktanz, S. (Eds.), Ethical Challenges of Organ Transplantation: Current Debates and International Perspectives (Bielefeld: Transcript, Bioethics/Medical Ethics, 2021)

Lynn, J., Cranford, R., ‘The Persisting Perplexities in the Determination of Death’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999) p.101-115

Macdonald, H., ‘Considering Death: The Third British Heart Transplant, 1969’, Bulletin of the History of Medicine, Vol. 88, No.3 (2014) p.493-525

Stolf, N. A. G., ‘History of Heart Transplantation: A Hard and Glorious Journey’, Brazilian Journal of Cardiovascular Surgery, 32(5) (2017)

Pallis, C. D. M., ‘On the Brainstem Criterion of Death’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999) p.83-93

Parent, B. & Turi, A., ‘Death’s troubled relationship with the law’, Illuminating the Art of Medicine (December 2020)

Pernick, M. S., ‘Brain Death in a Cultural Context: The Reconstruction of Death, 1967-1981’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.), The Definition of Death: Contemporary Controversies (Baltimore and London: Johns Hopkins University Press, 1999)

Sarbey, B., ‘Definitions of death: brain death and what matters in a person’, Journal of Law and the Biosciences, vol.3, issue 3 (2016) p.743-752

Veatch, R. M., ‘The Impending Collapse of the Whole-Brain Definition of Death’, The Hastings Center Report, Vol.23, No.4 (1993) p.18-24


Holt, N. (dir.), Between Life and Death, BBC Production, 13th July 2010



[1] Pernick, M. S., ‘Brain Death in a Cultural Context: The Reconstruction of Death, 1967-1981’ in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death: Contemporary Controversies, (Baltimore and London: Johns Hopkins University Press, 1999). See also: Macdonald, H., ‘Considering Death: The Third British Heart Transplant, 1969’, Bulletin of the History of Medicine, Vol. 88, No.3 (2014) p.495.

[2] For the different types of brain death see: Sarbey, B., ‘Definitions of death: brain death and what matters in a person’, Journal of Law and the Biosciences, vol.3, issue 3 (2016): p.743-752.

[3] Veatch, R., The Whole-Brain-Oriented Concept of Death: An Outmoded Philosophical Formulation, 3 J. THANATOL. 13, 23 (1975). Cited in: Sarbey, B., ‘Definitions of death: brain death and what matters in a person’ p.747

[4] The cessation of adequate heart and respiratory functions which results in death without reversal. This definition became permeable due to the invention of medical instruments which could keep humans alive through artificial means.

[5] Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death: Contemporary Controversies, (Baltimore and London: Johns Hopkins University Press, 1999) p.37-67.

[6] For context see: Cooper, D. K. C., Christiaan Barnard: The Surgeon Who Dared (UK & US: Fonthill Media 2017) p.187.

[7] Quote from: McRae, D., Every Second Counts: The Race to Transplant the First Human Heart (New York: G. P. Putnam’s Sons, 2006). Cited in: Alivizatos, P. A., ‘Fiftieth anniversary of the heart transplant: The progress of American medical research, the ethical dilemmas, and Christiaan Barnard’, Baylor University Medical Center Proceedings, 30(4) (October 2017).

[8] Cooper, D. K. C., Cooley, D. A., ‘Christiaan Neethling Barnard, 1922-2001’, Circulation, vol.104, No.23, p.2756-2757.

[9] Macdonald, H., ‘Considering Death: The Third British Heart Transplant, 1969’, Bulletin of the History of Medicine, Vol. 88, No.3 (2014) p.500.

[10] Rodriguez-Arias, D., ‘Together and Scrambled. Brain Death was conceived in Order to Facilitate Organ Donation’, Dilemata, 23 (2017) p.57-87. Cited in: Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation after Circulatory Determination of Death’, in Hansen, S. L., Schicktanz, S. (Eds.), Ethical Challenges of Organ Transplantation: Current Debates and International Perspectives (Bielefeld: Transcript, Bioethics/Medical Ethics, 2021) p.117.

[11] Pernick, M. S., ‘Brain Death in a Cultural Context: The Reconstruction of Death, 1967-1981’ in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death: Contemporary Controversies, (Baltimore and London: Johns Hopkins University Press, 1999). See also: Lock, M., Twice Dead: Organ Transplants and the Reinvention of Death, University of California Press, London 2002, p.78.

[12] Macdonald, H., ‘Considering Death’, p.501.

[13] Bagheri, A., ‘Individual choice in the definition of death’, J Med Ethics, 33(3) (2007).

[14] Lock, M., Twice Dead, p.79.

[15] The Hippocratic Oath is an oath taken by physicians. It ensures that their actions meet the standard ethics and that doctors conduct their work properly to ensure that patients receive the correct treatment.

[16] Our Medical Correspondent, ‘British heart transplant may be too early’, The Times, Issue: 57243, May 4th, 1968.

[17] In this context, a palliative relieves suffering without treating the cause of the suffering. The heart transplant was initially regarded as a palliative because its success was not ensured.

[18] Prince, J., ‘Call for brake on heart transplants’, The Daily Telegraph, Issue: 35199, 27th June 1968.

[19] Daily Telegraph Reporter, ‘Surgeon attacks “prestige heart transplants”’, The Daily Telegraph, Issue: 35746, 3rd April 1970.

[20] Morris, L., ‘Heart Transplants’, The Times, Issue: 57643, 20th August 1969.

[21] David Cooper is an expert in surgical transplantation who has served throughout his career as a surgeon-scientist and is now predominantly involved in research into xenotransplantation (cross-species transplantation). Denton Cooley was a contemporary American surgeon who, along with Barnard, led some of the ground-breaking medical innovations in transplantation. He was also present at the Cape Town Symposium of 1968, along with other contemporary experts in the medical field.

[22] Morris, L., ‘Heart Transplants’, The Times, Issue: 57643, 20th August 1969. See also: Cooper, D. K. C., Cooley, D. A., ‘Christiaan Neethling Barnard, 1922-2001’, Circulation, vol.104, No.23.

[23] Hershenov D. The problematic role of “irreversibility” in the definition of death. Bioethics. 2003;17(1):89-100. Cited in: Parent, B. & Turi, A., ‘Death’s troubled relationship with the law’, Illuminating the Art of Medicine (December 2020) p.1055. See also: Kerrigan, M., The History of Death (London: Amber Books Ltd 2017) p.12.

[24] Rothman, Strangers at the Bedside (n.3) p.160. Cited in, Macdonald, H., ‘Considering Death’, p.503.

[25] Idea from: Roper, J., ‘Transplants “should be allowed while heart still beating”’, The Times, Issue: 59308, 31st January 1975. See similar ideas in: Our Medical Correspondent, ‘Cardiology: Transplants assessed’, The Times, Issue: 59861, 15th November 1976.

[26] Smith, D., ‘The Heart Market: Someone Playing God’, Nation, 1968, p.721. Cited in: Lock, M., Twice Dead, p.85.

[27] Hoffenberg, R., ‘Christiaan Barnard’, p.1478.

[28] Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation’, p.117.

[29] Roper, J., ‘Heart transplants “need more research”’, The Times, Issue: 57455, 10th January 1969.

[30] Roper, J., ‘I give extra life Barnard says’, The Times, Issue: 57687, 10th October 1969.

[31] See: Sarbey, B., ‘Definitions of death: brain death and what matters in a person’, p.751.

[32] Birnbacher, D., ‘Determining Brain Death: Controversies and Pragmatic Solutions’, in Lena Hansen, S., Schicktanz, S. (Eds.) Ethical Challenges of Organ Transplantation, p.103-105.

[33] Jacques Charl (Kay) de Villiers was a leading South African neurosurgeon and emeritus professor of neurosurgery at the University of Cape Town. He was also present at the Cape Town Symposium in 1968.

[34] Shapiro, H. A. (Ed.) Experience with Human Heart Transplantation: Proceedings of the Cape Town Symposium 13-16 July 1968, (Durban: Butterworth & Co. 1969) p.38-39.

[35] Adrian Kantrowitz was an American cardiac surgeon who performed the first human heart transplant in the United States, three days after Christiaan Barnard performed the world’s first such operation in December 1967.

[36] Shapiro, H. A. (Ed.) Experience with Human Heart, p.41. See also: Loshak, D., ‘Brain Death is key’, The Daily Telegraph, Issue: 37228, February 1, 1975. See also: Lock, M., Twice Dead, p.89.

[37] Science Report, ‘Cardiology: Survival after transplants’, The Times, Issue: 59057, 5th April 1974.

[38] Belkin, G., Death Before Dying: History, Medicine, and Brain Death, (Oxford University Press 2014) p.53.

[39] Shapiro, H. A. (Ed.) Experience with Human Heart, p.49.

[40] Brody, B. A., ‘How much of the Brain must be dead?’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death: Contemporary Controversies, (Baltimore and London: Johns Hopkins University Press, 1999) p.79.

[41] Pernick, M., The Black Stork (New York: Oxford University Press, 1996) Cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death, p.7.

[42] Sarbey, B., ‘Definitions of death’, p.746.

[43] Bernat, J. L., Culver, C. M., Gert, B., ‘On the definition and criterion of death’, Ann Intern Med 1981; 94: 389-394. Cited in: Bernat, J. L., ‘Refinements in the Definition of and Criterion of Death’, in Youngner, S. J., Arnold, R. M., Schapiro, R. (Eds.) The Definition of Death: Contemporary Controversies, (Baltimore and London: Johns Hopkins University Press, 1999) p.83.

[44] Idea from: Birnbacher, D., ‘Determining Brain Death’, p.104.

[45] Veatch, R., Death, Dying and the Biological Revolution (New Haven/London: Yale University Press 1989) Cited in Birnbacher, D., ‘Determining Brain Death’, p.104.

[46] Birnbacher, D., ‘Determining Brain Death’, p.104-105.

[47] Sarbey, B., ‘Definitions of death’, p.744. See also: Hoffenberg, R., ‘Christiaan Barnard’, p.1480. Also, Faulkner, A., ‘Brain Death Ruling in U.S.’, The Daily Telegraph, Issue: 35233, August 6, 1968. See Also, Lock, M., Twice Dead, p.90.

[48] Idea from Faulkner, A., ‘Brain Death Ruling in U.S.’, The Daily Telegraph, Issue: 35233, August 6, 1968.

[49] Sarbey, B., ‘Definitions of death’, p.745.

[50] Veatch, R., The Whole-Brain-Oriented Concept of Death: An Outmoded Philosophical Formulation. Cited in: Sarbey, B., ‘Definitions of death’, p. 747.

[51] Pernick, M. S., ‘Brain Death in a Cultural Context’, p.12.

[52] Pernick, M. S., ‘Brain Death in a Cultural Context’, p.9.

[53] Idea from: Beecher, H. K., ‘Ethical Problems Created by the Hopeless Unconscious Patient’, New England Journal of Medicine, 278 (June 27th, 1968) p.1427, cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, p.9.

[54] Journal of the Medical Association, June 27th 1968, p.1426-29. Cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, p.13.

[55] Grandstrand Gervais, K., Redefining Death (New Haven: Yale University Press, 1986) p.13 cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, p.12.

[56] Macdonald, H., ‘Considering Death’, p.501.

[57] Idea from Bernat, J. L., ‘Point: Are Donors After Circulatory Death Really Dead, and Does It Matter? Yes and Yes’, Chest Journal, vol.138, Issue 1. (2010) p.13-16. See also: Parent, B., Turi, A., ‘Death’s troubled relationship with the law’, Illuminating the Art of Medicine (December 2020) Available Online: Accessed on: 03/10/22. Similar ideas mentioned in: Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation after Circulatory Determination of Death’, in (eds.) Lena Hansen, S., Schicktanz, S., Ethical Challenges of Organ Transplantation, p. 119.

[58] Truog, R. D., Miller, F. G., ‘The dead donor rule and organ transplantation’, N Engl J Med. 2008;359(7):674-675. Cited in: Parent, B. & Turi, A., ‘Death’s troubled relationship with the law’.

[59] ‘A Definition of Irreversible Coma’, Journal of the American Medical Association 205 (1968) p.337-340. Cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, p.8.

[60] Idea from: The President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioural Research, Defining Death: Medical, Legal, and Ethical Issues in the Determination of Death (1981). Cited in: Sarbey, B., ‘Definitions of death’. See also: Hoffenberg, R., ‘Christiaan Barnard’, p.1480.

[61] Sarbey, B., ‘Definitions of death’, p.748.

[62] Sarbey, B., ‘Definitions of death’, p.745.

[63] President’s Commission, Defining Death, 38-41, cited in: Pernick, M. S., ‘Brain Death in a Cultural Context’, p.19. See also: Sarbey, B., ‘Definitions of death’, p.745.

[64] Bernat, J. L., ‘Refinements in the Definition of and Criterion of Death’, p.86-87.

[65] Birnbacher, D., ‘Determining Brain Death’, p.107.

[66] Bernat, J. L., Culver C. M., Gert B., ‘On the definition and criterion of death’, Ann Intern Med 1981; 94: 391. Cited in: Bernat, J. L., ‘Refinements in the Definition of and Criterion of Death’, p.89.

[67] The Quality Standards Subcommittee of the American Academy of Neurology. Practice Parameters for determining brain death in adults. Neurology 1995;45: 1012-1014. Cited in: Bernat, J. L., ‘Refinements in the Definition of and Criterion of Death’, p.85-86.

[68] Shapiro, H. A. (Ed.) Experience with Human Heart Transplantation, p.51.

[69] Idea from: Smith, T., ‘Whose hand on the life switch?’, The Times, Issue: 59860, 13th November 1976. See also: By Our Health Services Correspondent, ‘Doctors back brain death concept’, The Daily Telegraph, Issue: 37240, 15th February 1975.

[70] Pallis C., ABC of Brainstem Death (London: British Medical Journal, 1983) Cited in: Bernat, J. L., ‘Refinements in the Definition’, p.87. See similar ideas in: Pallis, C., ‘The Criteria for diagnosing death’, The Times, Issue: 59568, 3rd December 1975.

[71] Veatch, R. M., ‘The Impending Collapse of the Whole-Brain Definition of Death’, The Hastings Center Report, Vol.23, No.4 (1993) p.24.

[72] Dr. Pierre René Grondin was a Canadian cardiac surgeon remembered for his contributions to the cardiac field. He was also among the first surgeons to perform a successful human-to-human heart transplant.

[73] Shapiro, H. A. (Ed.), Experience with Human Heart Transplantation, p.48-49.

[74] Idea from: Goila, A. K. & Pawar, M. P., ‘The Diagnosis of brain death’, Indian J Crit Care Med, 13(1) (2009) Available online: Accessed on: 06/10/22.

[75] ‘Thanatology’, Time, 27 May 1966. Cited in: Lock, M., Twice Dead, p.92.

[76] Veatch, R. M., ‘The Impending Collapse of the Whole-Brain’, p.19.

[77] President’s Commission (1981) p.59, cited in: Sarbey, B., ‘Definitions of Death’, p.750.

[78] Idea from: Sarbey, B., ‘Definitions of death’.

[79] Smart, J. J. C., ‘The Mind/Brain Identity Theory’, Stanford Encyclopedia of Philosophy Archive (2014) Available online: Accessed on: 10/01/23. See also: Sarbey, B., ‘Definitions of Death’, p.748.

[80] McMahan, J., The Metaphysics of Brain Death, 9 Bioethics 91, 102 (1995). Cited in: Sarbey, B., ‘Definitions of Death’, p.748.

[81] Sarbey, B., ‘Definitions of Death’, p.748.

[82] See Putnam, H., ‘Psychological Predicates’, in Art, Mind, and Religion, 37–48 (William Capitan & Daniel Merrill eds., 1967). Cited in: Sarbey, B., ‘Definitions of Death’, p.749.

[83] Sarbey, B., ‘Definitions of Death’, p.750.

[84] Veatch, R. M., ‘The Impending Collapse’, p.20.

[85] Veatch, R. M., ‘The Impending Collapse’, p.19.

[86] Sarbey, B., ‘Definitions of Death’, p.752.

[87] Veatch, R. M., ‘The Impending Collapse’, p.21-22.

[88] Veatch, R. M., Transplantation Ethics (Washington, DC: Georgetown University Press, 2000) p.69-72. Cited in: Bagheri, A., ‘Individual choice’.

[89] President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioral Research, Defining Death: A Report on the Medical, Legal and Ethical Issues in the Determination of Death (Washington, DC: US Government Printing Office, 1981). Available online: Accessed 12th December 2022. See also: Parent, B. & Turi, A., ‘Death’s troubled relationship with the law’, p.1056. See also: Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation’, p.119. See also: Sarbey, B., ‘Definitions of Death’, p.745-746.

[90] New Jersey Declaration of Death Act, N.J. Sess. Law Serv. Ch. 90, 26:6A–5 (1991). Cited in: Sarbey, B., ‘Definitions of Death’, p.746-747. For further mention of New Jersey Act see: Parent, B. & Turi, A., ‘Death’s troubled relationship with the law’, p.1058.

[91] Bagheri, A., ‘Individual choice in the definition of death’, p.146-149.

[92] Complaint, McMath v. Rosen (Cal. Super Ct. Dec. 09, 2015) (No. RG15796121). Cited in: Sarbey, B., ‘Definitions of Death’, p.747.

[93] Morioka, M., ‘Reconsidering brain death: a lesson from Japan’s fifteen years of experience’, Hastings Center Report, 31(4) (2001) p.41-46. Cited in: Bagheri, A., ‘Individual choice in the definition of death’.

[94] Idea from: Kimura, R., ‘Death, dying and advance directives in Japan: socio‐cultural and legal points of view’, in Sass, H. M., Veatch, R. M., Rihito, K. (Eds.), Advance Directives and Surrogate Decision-Making in Health Care (Baltimore: Johns Hopkins University Press, 1998) p.187-208. Cited in: Bagheri, A., ‘Individual choice in the definition of death’.

[95] Swiss Federal Act on Transplantation of Organs, Tissues and Cells, 8th October 2004 (SR 810.21). Cited in: Ave, A. D., Shaw, D., Bernat, J., ‘Defining Death in Donation’, p.119.

[96] Bagheri, A., ‘Individual choice in the definition of death’, p.146-149.

Cow 133 at the Central Veterinary Laboratory: Recognising a Novel Zoonosis

Isobel Newby



Known colloquially as Mad Cow Disease, Bovine Spongiform Encephalopathy (BSE) is one of a group of fatal, progressive neurodegenerative diseases called Transmissible Spongiform Encephalopathies. Other diseases in this group include scrapie, a contagious disease affecting sheep and goats first recorded in Britain in the early eighteenth century, and Creutzfeldt-Jakob Disease (CJD), a rapidly progressing dementia in humans. Farmers were critical in BSE’s early detection: they noticed that cattle suffered a loss of control over their limbs, trouble rising, anxiety and eventually paralysis before euthanasia. BSE was found to be a zoonosis: a disease that can be transmitted from animals to humans. The BSE crisis is so-called because it led to the destruction of nearly five million cattle, international bans on the import of British beef and the death of 178 people from variant CJD (vCJD). Today, it is accepted that BSE emerged sporadically and spread through the reprocessing of infected material in cattle feed, enabled by intensive farming practices following the Second World War.[1]

Under the auspices of the Ministry of Agriculture, Fisheries and Food (MAFF), Central Veterinary Laboratory (CVL) scientists were responsible for infectious livestock disease diagnosis and research. Despite this, when the first case of BSE was identified at the CVL in 1985, the government ministers in charge of animal disease control were not alerted to its existence until 1987. The Chief Veterinary Officer (CVO), the country’s most senior veterinarian, took a minimal role in advising the government on animal health and disease control policy. Focusing on diagnostic and epidemiological research, CVL pathologists did not have an advisory role, though their outputs certainly would have informed the CVO’s recommendations. Existing accounts of BSE have focused on the responsibility of MAFF officials in delaying making it a notifiable disease, out of fear of causing panic amongst the public and the meat trade industry.[2] They do not discuss the influence of CVL scientists on the political response to BSE before 1987.

This paper will demonstrate that, upon receipt of the first case of BSE – Cow 133 – the CVL had a pivotal, if indirect, role in dictating how the novel disease was approached by policymakers for a large part of the epidemic. This enables historians of science to rethink their approaches to the BSE episode: the ways in which the CVL understood BSE show that MAFF did not suppress knowledge but followed the CVL’s relaxed lead. Evidence from the materials of the Phillips Inquiry – a public inquiry into the government’s handling of BSE, published in 2000 – shows that the actions of senior CVL pathologists were largely motivated by a desire to avoid causing panic in the public and industry. The current literature is interdisciplinary, with few accounts from the perspective of historians of science. These accounts often employ the narrative of a government ‘cover-up’, focusing on the period after 1987, once MAFF had already been alerted to the new disease. The role of the CVL is often neglected in explorations of the British government’s failure to respond to the crisis in a timely manner.[3] Other, more general accounts of the crisis deploy an entertaining telling of the BSE story.[4] This is useful for a sophisticated lay audience but not as an academic analysis of the origins of the disease’s emergence. As a case study for disciplines such as politics, sociology, animal diseases, risk management or science and technology studies, all reflections on the BSE episode require historical analysis.

In late 1984, a cow from Pitsham farm in Sussex presented with the aforementioned behaviours and died in February 1985. This animal was labelled Cow 133 in the veterinarian’s records.[5] By September 1985, seven more cows from Pitsham farm had died after suffering similar nervous symptoms. Their peculiarity led to referrals to a Veterinary Investigation Centre for post-mortem, but no definitive diagnosis could be reached. As a result, Cow 133 was passed to the CVL for brain and spinal cord examination.

Assigned to Cow 133 was pathologist Carol Richardson, who found small vacuoles in the brain, causing it to resemble a sponge. These holes were consistent with scrapie-infected sheep brains, but this was the first time she had discovered them in the brain of a cow.[6] Consequently, she sought a second opinion. Her conclusion was dismissed by her colleague Gerald Wells, who believed the holes were caused by a bacterial toxin, despite Richardson never having observed a bacterial toxin producing this effect on the brain.[7] Wells’ modified version of the report was submitted back to the local veterinary surgery. Whilst this has been recognised by historians of science, the reasons why two experienced pathologists came to such conflicting interpretations of the same specimen have not been clarified. For example, Kiheung Kim has revealed scrapie’s enduring prevalence in both national and international contexts, but a discussion of BSE research falls outside the scope of his work.[8] Evidence gathered by the Phillips Inquiry indicates that there are two key and connected reasons: organisational changes within the CVL and the scrapie analogy.

Organisational changes

In 1983, Ray Bradley was appointed as the CVL’s new Head of Pathology. Prior to Bradley’s arrival, CVL scientists would specialise in one particular species, including all of its organ systems. He enforced a refocus towards expertise in one organ system across all species. In her testimony to the Phillips Inquiry, Richardson attributed a general lowering of CVL expertise in the 1980s to this change.[9] She contended that emphasis on the overall knowledge of an individual species improved the pathologist’s understanding of the animal’s condition as described by farmers. Furthermore, Bradley implemented a change of CVL priorities, from research to diagnosis. As a result, six incomplete research projects were terminated and four dissatisfied staff members left their roles at the CVL. In summary, Bradley’s alterations were intended to prioritise diagnosis, but they were not conducive to field practice: they did not reflect the intuition inherent in the farmer-animal relationship.

The use of analogy

Epidemiological models of BSE were initially based on a line of reasoning known as the scrapie analogy. This stipulated that if BSE was bovine scrapie, and scrapie was of no risk to human health, then BSE was unlikely to be a risk to human health. Maya Ponte demonstrates how constant reference to scrapie in discussions of BSE legitimated the response of the government, allowed the prediction of its behaviour and enabled policymakers to presume that BSE would not be a risk to human health.[10] Ponte generalises that everyone with influence over BSE policy relied on established knowledge of scrapie in discussions about the novel BSE. However, I will demonstrate that the CVO stood as a solitary but influential dissenting voice.

Carol Richardson was not inclined to analogise: she was trained to make diagnoses based on the morphological structure of specimens, without using clinical terminology. As a result, she could only diagnose Cow 133 based on what she observed post-mortem, without assumptions based on knowledge of similar conditions. Wells, on the other hand, saw it as within his professional remit to diagnose based on an animal and its herd’s history. Wells and other CVL pathologists did not decide that the diagnosis of spongiform encephalopathy was appropriate until similar cases entered the CVL in 1986. Wells believed the brain pathologies of the affected cows closely resembled that of scrapie in sheep, leading the CVL to conclude that they had identified the first case of scrapie in a cow.[11] This assumption set a precedent for the government response.

Though they are both spongiform encephalopathies, BSE and scrapie are two distinct diseases. Scrapie has been endemic in British flocks for over two centuries and has never crossed the species barrier, whilst this novel cattle disease emerged only in 1984. Still, it was more palatable for CVL pathologists to assume, and reassure the ministry, that they were dealing with a familiar disease than to consider the possibility that it was an emerging one. Their assumption that the disease was bovine scrapie enabled them to predict its epidemiology, transmissibility and public health implications. These presumptive and analogical strategies formed the basis for future animal and public health policy. As MAFF was led to believe BSE was scrapie, the resulting rationale was that there was only a remote chance of it posing a risk to human health via the consumption of infected beef. In reality, as would only later become clear, BSE-infected meat products were putting the British public at risk of the fatal variant CJD.

Led by Wells, the CVL were to publish an authoritative account of what they called ‘bovine scrapie’ and present the paper to a Joint Meeting of the Medical and Veterinary Research Clubs in May 1987.[12] Knowledge about the novel disease was poised to spread beyond the small circle of state veterinary scientists. In a turning point in this early stage of the BSE episode, CVO William Howard Rees supported publication on the condition that the name of the disease be corrected to BSE. This demonstrates that not all state veterinary scientists shared the belief that BSE was scrapie, with the country’s leading veterinary authority warning against the assumption that the two diseases would behave in the same way.[13] Rees promptly advised MAFF of BSE’s emergence but not on the correct course of action, out of fear that the CVL’s lack of knowledge about BSE would be exposed and cause public speculation and panic. The most accurate method of investigating the risk to humans would have been through transmission experiments with primates. However, primates were expensive to acquire, and studies would take several years to yield results.[14] By the late 1980s, funding and staffing shortages meant the CVL struggled with its existing caseload. Still, any potential risk to humans required immediate mitigation. Accordingly, the CVL could only rely on what was already known about scrapie. Despite their efforts to contain knowledge of BSE, two years after Cow 133 was examined by Carol Richardson, BSE began to receive attention from the wider professional veterinary community.[15] The CVL were forced to reverse Bradley’s diagnostic priority to address essential research into the novel disease as the epidemic began.


Before the establishment of independent consultative committees in 1989, MAFF relied solely upon the expertise of CVL scientists. However, a lack of understanding of the disease meant the CVL were unable to carry out their advisory function. Delays and inaction ensued, setting a precedent for the next decade of animal and public health policymaking. The justification of the scrapie analogy was that if BSE was actually scrapie, and scrapie was of no risk to humans, then BSE would not be a risk to humans.[16] Unfortunately, the British government dismissed any risk to human health, with reiterated assurances that ‘beef is safe to eat’. The future course of BSE policy was based on this faulty logic, until the first case of vCJD was confirmed in 1996 after the death of a 19-year-old.

A deeper exploration of correspondence between CVL scientists and MAFF officials has shown that the Ministry were not lone decision-makers at the start of the BSE crisis. The actions of the CVL were critical determinants of the pace of government response and risk assessment. This demonstrates, for the first time, the agency and influence of state veterinary scientists over how the emergence of BSE was handled in its formative stages. The CVL had significant control over the initial response to BSE through its identity construction, research dissemination and public relations. Structural organisation and individual personalities within the CVL had a lasting influence on animal and human health policy until the start of the twenty-first century.



Primary Sources

Lord Phillips of Worth Matravers, Bridgeman, J., and Ferguson-Smith, M., The BSE Inquiry: The Report: The Inquiry Into BSE and Variant CJD in the United Kingdom, published October 2000. Archived on 25 May 2006. Retrieved from the UK Government Web Archives:

Holt, T., and Phillips, J., ‘Bovine Spongiform Encephalopathy’, British Medical Journal (Clin Res Ed), 296 (1988), pp. 1581-1582.

Wells, G., Scott, A. C., and others, ‘A Novel Progressive Spongiform Encephalopathy in Cattle’, Veterinary Record, 121 (1987), pp. 419-420.


Secondary Sources

Bartlett, D., ‘Mad Cows and Democratic Governance: BSE and the Construction of a “Free Market” in the UK’, Crime, Law and Social Change, 30 (1999), pp. 237-257.

Beck, M., and others, BSE Crisis and Food Safety Regulation: A Comparison of the UK and Germany, Working Paper (University of York: Department of Management Studies, 2007).

Hardy, A., Salmonella Infections, Networks of Knowledge, and Public Health in Britain, 1880-1975 (Oxford, 2014).

Hueston, W., ‘BSE and Variant CJD: Emerging Science, Public Pressure and the Vagaries of Policymaking’, Preventive Veterinary Medicine, 109 (2013), pp. 179-184.

Jasanoff, S., ‘Civilization and Madness: The Great BSE Scare of 1996’, Public Understanding of Science, 6 (1997), pp. 221–32.

Kim, K., The Social Construction of Disease: From Scrapie to Prion (Oxon, 2006).

Ponte, M., Managing Risk and Uncertainty During a Novel Epidemic (San Francisco, 2005).

Schwartz, M., How the Cows Turned Mad (Berkeley, 2003).

Woods, A., ‘A Historical Synopsis of Farm Animal Disease and Public Policy in Twentieth Century Britain’, Philosophical Transactions: Biological Sciences, 366 (2011), pp. 1943-1954.



[1] Abigail Woods explores the development of post-war agricultural policy in: A. Woods, ‘A Historical Synopsis of Farm Animal Disease and Public Policy in Twentieth Century Britain’, Philosophical Transactions: Biological Sciences, 366 (2011), pp. 1943-1954.

[2] A notifiable disease is an animal disease, either endemic or exotic, which must be reported to MAFF even if an animal is only suspected of being affected. Failure to report these was and remains a criminal offence.

[3] See: David M. C. Bartlett, ‘Mad Cows and Democratic Governance: BSE and the Construction of a “Free Market” in the UK’, Crime, Law and Social Change, 30 (1999), pp. 237-257; Matthias Beck, and others, BSE Crisis and Food Safety Regulation: A Comparison of the UK and Germany, Working Paper (University of York: Department of Management Studies, 2007); William D. Hueston, ‘BSE and Variant CJD: Emerging Science, Public Pressure and the Vagaries of Policymaking’, Preventive Veterinary Medicine, 109 (2013), pp. 179-184; Sheila Jasanoff, ‘Civilization and Madness: The Great BSE Scare of 1996’, Public Understanding of Science, 6 (1997), pp. 221–32.

[4] Maxime Schwartz, How the Cows Turned Mad (Berkeley, 2003).

[5] M. L. Teale & Partners, ‘Invoice’, 84\12.28\1.1, in Lord Phillips of Worth Matravers, and others, The BSE Inquiry Report: The Report: The Inquiry into BSE and Variant CJD in the United Kingdom (hereinafter The BSE Inquiry Report), published October 2000, retrieved from the UK Government Web Archives: <>, accessed 04.09.2020.

[6] C. Richardson, ‘Form’, 85\09.10\1.1, in Lord Phillips of Worth Matravers, The BSE Inquiry Report, <>, accessed 04.09.2020.

[7] C. Richardson, ‘Report’, 85\09.19\2.1, in Lord Phillips of Worth Matravers, The BSE Inquiry Report, <>, accessed 04.09.2020.

[8] K. Kim, The Social Construction of Disease: From Scrapie to Prion (Oxon, 2006).

[9] C. Richardson, ‘Evidence given by C. Richardson’, Hearing Day 28, in Lord Phillips of Worth Matravers, The BSE Inquiry Report, <>, accessed 04.09.2020.

[10] M. Ponte, Managing Risk and Uncertainty During a Novel Epidemic (San Francisco, 2005).

[11] C. Richardson, ‘Evidence given by C. Richardson’.

[12] G. Wells, A. C. Scott, and others, ‘A Novel Progressive Spongiform Encephalopathy in Cattle’, Veterinary Record, 121 (1987), pp. 419-420.

[13] W. Rees, ‘Witness Statement no. 126’, in Lord Phillips of Worth Matravers, The BSE Inquiry Report, <>, accessed 05.10.2020.

[14] W. Rees, ‘Minute’, 87\07.29\3.1-3.6, in Lord Phillips of Worth Matravers, The BSE Inquiry Report, <>, accessed 09.11.2020.

[15] T. Holt, and J. Phillips, ‘Bovine Spongiform Encephalopathy’, British Medical Journal (Clin Res Ed), 296 (1988), pp. 1581-1582.

[16] Ponte, Managing Risk and Uncertainty During a Novel Epidemic, p. 39.

‘The introduction into English public life of the educated workman’: The rise of Labour in the Edwardian Mass Press


This paper explores how the emergent Labour Party was represented by two of Britain’s leading popular daily newspapers: the Daily Mail and the Daily Express. Focusing on the coverage afforded the party during its first two general elections — 1900 and 1906 — it will be argued that the response of the Conservative popular press to the rise of Labour was complex. While often hostile, these newspapers also showed considerable interest in the party’s rise and were broadly positive toward both individual Labour MPs and the movement’s desire to better represent working-class interests. Adding to past work on pre–Great War political culture, this paper interrogates the complexity of Labour’s emergent place within a mass political culture that, while broadly hostile to left–wing politics, primarily catered to an imagined ‘everyman’ who was very similar to Labour’s assumed electoral supporter.

Keywords: Labour Party, popular press, newspaper language, political identity, pre–1914 British culture

Author Biography

Dr Chris Shoop-Worrall is Lecturer in Media & Journalism at UCFB, having completed his PhD at the University of Sheffield’s Centre for the Study of Journalism and History in 2019. His work explores the intersections between politics, mass media, and consumer culture within nineteenth– and twentieth–century Britain. His first book, an adaptation of his doctoral work, is forthcoming with Routledge Focus.


‘The introduction into English public life of the educated workman’: The rise of Labour in the Edwardian Mass Press

Download PDF


The mass election–time political culture of Edwardian Britain, into which the Labour Party[1] first entered in 1900, was framed primarily around the perceived wants and interests of an imagined ‘man in the street’, whose significance had grown particularly after the various reform acts of the 1880s.[2] This ‘everyman’ was the person to whom the proposed political policies of both the Liberals and the Conservatives were increasingly pitched, on issues including tariff reform, religious education and alcohol consumption.[3] This increasingly mass and masculinised election sphere was part of a wider consumer culture within which the everyman also held significance.[4] A key component of these interconnected cultures of politics, urban consumerism, and entertainment was the daily mass press: the ‘new dailies’ Mail and Express which laid the groundwork for the dominant tabloid culture of the twentieth century.[5] These newspapers, and newspapers in general, were key conduits of political communication in late–nineteenth and early–twentieth century Britain.[6] Their content sensationalised and personalised election news in ways that effectively spoke to their mass readerships, many of whom were the same ‘man in the street’ sought by politicians across the political spectrum.[7] Their communicative potential was noteworthy: Stephen Koss’s chapter on these newspapers shows Joseph Chamberlain’s intense interest in courting their support[8], while recent scholarship by David Vessey has noted how the Women’s Social and Political Union (WSPU) similarly saw the merits of their suffrage campaigns capturing the attention of these particular newspapers.[9]

Labour were perhaps uniquely interested in the political significance of the new dailies. Their appeal to the man in the street — an individual from whom Labour particularly sought the vote — made the daily mass press a hugely significant force. Indeed, Labour would eventually launch their own newspaper, the short–lived Daily Citizen, such was the perceived political importance of having a Labour–friendly mass daily newspaper.[10] This knowledge of the mass press’s appeal to the man in the street came with a parallel hostility from across the early Labour movement towards this ‘capitalist’ press. The fact that the Citizen’s birth was a decade in the making spoke significantly of the agonising across the pre–war British left about what constituted appropriate mass political communication: an issue with which the party would continue to struggle for decades to follow.[11]

While some scholarship has explored aspects of Labour’s relationship towards and with both the popular press and popular culture pre–1914[12], little exists on the ways in which Labour manifested within the pages of the mass daily press. This paper interrogates the ways in which the two founding publications of Bingham and Conboy’s ‘tabloid century’, the Mail and Express, represented the emergence of Labour during their first two general election campaigns. Using these two periods of newspaper coverage, spanning the weeks of the elections both in 1900 and 1906[13], this paper explores the complex place that Labour held within the pages of these mass–selling newspapers and, by extension, within a significant component of the political culture in which they sought success.

On the one hand, it would seem that the hostility shown across the British left towards the new dailies, and the wider culture to which they contributed, was somewhat mutual. Both the Mail and Express featured articles critical of the party’s politics, especially after their true ‘arrival’ onto the national political scene in 1906. Much of this criticism was couched in a language of chaos and destabilisation; the emergence of this new, left–wing political movement clashed considerably with the broadly conservative outlook of both the new dailies and the consumer political culture to which they sold so well. However, this criticism was not uniform. In fact, both newspapers dedicated coverage that was receptive to much of this emergent party. Central to this positivity was the idea that Parliament was becoming increasingly representative: the entry of ‘working men’ into the Commons, for example, was seen as a welcome and overdue reality. This, and an appreciation of some of the societal inequalities that Labour were struggling to overcome, underlines the complicated place which Labour occupied within this massified, masculine election culture to which the new dailies contributed so significantly.


Early Indifference

The 1900 election was the Labour Party’s first ever election, as well as the first time that Britain had a socialist party competing at a national election. Their initial success was modest, having had two MPs elected to the House of Commons and amassing just under 63,000 votes.[14] That said, it marked a significant change in the British political landscape; in their first election, Labour won a larger share of the popular vote than John Redmond’s Irish Parliamentary Party. Considering the significance that can be (and has been) so easily placed on a party’s first election, one would assume that there was a noticeable response at the time to Labour’s electoral debut, including from two of the country’s most popular newspapers.

The reality of the response, from both the Daily Mail and the Daily Express at least, was considerably underwhelming. Admittedly, the 1900 election was defined by the central issue of the Second Boer War; a pro–imperial national spirit borne out of the war was widely credited with helping the Conservatives sweep to victory, and the election coverage of both new dailies was heavily focused on the electoral importance of the ongoing conflict in the Transvaal.[15] However, even considering the weight of coverage afforded the war, the Labour Party was given almost no coverage at all. Far from being a watershed moment which saw a conservative press react with intensity, the rise of Labour prompted Britain’s two leading right-of-centre dailies to do little more than shrug.

The sparse mentions that were given to Labour by the two newspapers during their first election represented the party as a curious, inoffensive new oddity. Most of the attention in these newspapers focused not on the party itself, but on some of its high–profile individual members. Of particular interest was Keir Hardie, the party’s founder, leader, and first elected MP. One report noted that he had earned the support of renowned businessman, philanthropist and ‘Quaker cocoa manufacturer’ George Cadbury, who had sent Hardie £500 to help the party support ‘the expenses of Labour candidates’ in Blackburn, Manchester, and Glasgow.[16] Beyond noting Cadbury’s support, Hardie’s brief appearances portray him as a curious eccentric, assigning him the nickname ‘Queer Hardie’ and noting how his personality differed from that of traditional members of Parliament; ‘(he is) the most erratic of Labour members… his outward oddities only faintly disguise a strong, simple, resolute character’.[17]

Similarly, the other mentions of Labour parliamentary candidates focus on curious aspects of their personalities, rather than on controversial or original aspects of their political leanings. For example, a candidate in Derby called ‘Mr. R. Bell’ was portrayed in much the same terms as Liberal or Conservative candidates, with one report stating that he ‘loves conciliation more than controversy’.[18] Another, Thomas Burt of Morpeth, was described as ‘no friend of socialism’ and given a biography that remarked on the originality of his political journey; ‘he still bears on him the marks of his early life of toil at the pit mouth… teetotalism and trade unionism made him a speaker… his mates elected him secretary (of his trade union) nine years later they sent him to Parliament’.[19] Far from being portrayed as revolutionaries, Labour’s new and prospective parliamentary candidates were represented as relatively unremarkable new additions to the British political landscape. The above examples of the language used to portray them focus more on personality quirks than political leanings. Any reference to personal or party ideology seems to deliberately play down any radical or controversial tendencies. Their emergence is noted, but as little more than a minor footnote to the wider issues of the election.

One potential reason why the Mail’s and Express’s coverage of the party’s emergence seems to have been so underwhelming can be seen in how the broader idea of a worker-propelled political movement was discussed. Again, references to a wider Labour movement are scarce, but they suggest a shared understanding that a future of worker–driven politics was a long way off. For instance, a front page in the Express features a speech from the leading Liberal Unionist MP Joseph Chamberlain, in which he espouses the view that any new, ‘Labour’ members of Parliament — ones elected directly from a working–class community to represent their interests — would be like ‘fish out of water’ in the Commons.[20] Another article, published later in the election, speculates light–heartedly on a future where Britain has a ‘worker–controlled future electorate’. It argues that a time should come when the only barrier to voting should be an age limit of 21, and concludes with an interested look forward to what types of legislation might be passed if ‘the working man controlled the voting’.[21] Interestingly, while it takes a more positive view than the quoted speech by Chamberlain, this article shares the assumption that a worker–driven politics was not yet a present concern.

Overall, the Labour Party’s emergence and first presence at a British general election was met with a muted response from the daily popular national press. On the one hand, there is some acknowledgement of the party’s arrival onto the British political scene and how a Labour–orientated working–class politics had the potential to lead to future change. However, this future theorising is an exception to an initial response which represents Labour and its members as odd new additions to the established political landscape. Labour’s members were presented as original and unconventional, but only in relation to aspects of their personalities or the manner of their upbringing. Indeed, their politics are barely discussed and any references to ideology are framed to downplay any radical aspect of Labour beliefs. The impression left by these newspapers is that Labour, while new, were little but an eccentric, minor addition to British politics. Their emergence may well have been a matter of concern or interest for an undetermined point in the future. However, Labour was represented as a party of little concern to the readers of these two newspapers during their first general election.


Second Coming

As has been discussed, the representations of the emerging Labour party in the popular new dailies during the 1900 election placed little significance on them. At the beginning of the next — and Labour’s second — general election in 1906, the initial coverage from both newspapers was similarly sparse. In the Daily Mail for example, the opening few days of the election contained very few articles on Labour, and these, similarly to those from 1900, characterised the party by the unconventional personalities of its members. In particular, a piece on the opening day of the campaign focuses on the sitting MP of Woolwich and his ‘quaint sayings’ and ‘his insistence on his absolute ignorance of Latin’.[22] On the same day, the Daily Express’s sole representation of Labour concerned a speech by the ‘Socialist Countess’ Lady Warwick, and how local workers in the West Ham area of London ‘go and look at the lovely Countess while she is making one of her Socialistic speeches’.[23] While covering very different stories, both newspapers were again constructing Labour, its members, and socialism in general as a quirky, yet separate, addition to the British political tradition.

This approach changed dramatically after Labour began winning more MPs, with the first news breaking on January 15th 1906 that Labour had already gained seven seats in Parliament. The Daily Mail noted these ‘Labour successes’ and named the new members elected for Labour.[24] The Express meanwhile represented the new significance of Labour’s election successes by including them on their front–page ‘Election Race by Motor Car’: a daily cartoon which would track a political party’s progress to the ‘finish line’ at the end of the election.[25] Labour, missing entirely from the Express’s equivalent cartoon in 1900, now merited a place in the race.

This initial appreciation by both newspapers would change into a dramatic reaction in the subsequent days after Labour’s ‘arrival’ onto the main political stage. The day after the announcements, both newspapers published editorials focused on the electoral triumphs of Labour. The Express noted the party’s ‘astounding victories’ and how their success now posed a threat to the paper’s favoured Unionists.[26] This editorial echoed their front page of the same day which marvelled at the ‘astounding succession’ of Labour victories, while noting that it may well be a watershed historical moment; ‘nothing like it [Labour’s victories] has ever occurred in the history of British politics’.[27] This same sentiment was shared in the Mail’s editorial ‘Outlook’, headlined ‘The Rise of Labour’. Like the Express, it marked a decisive shift in the paper’s coverage of Labour which now represented the party as a ‘hurricane’ that was fundamentally changing the face of British politics;


Enormous Labour polls are, indeed, the great feature at the election, and even where Labour has not won it has voted in a manner that is beginning to cause nervousness to its Liberal ally . . . Socialism, by its very essence, means the abolition of all competition . . . equal rewards for fit and unfit.[28]


After the relative indifference shown during the 1900 campaign, both the Mail and the Express increasingly represented Labour as both the defining aspect of the 1906 election, as well as a landmark shift in the history of British politics. This shift in both papers’ interpretation of the party led to a multitude of articles and editorials across the rest of the election dedicated to the party and its new MPs. Some of this new content was, perhaps unsurprisingly, fiercely hostile.


Chaotic Threat

It is interesting to note that, in the same early articles detailing Labour’s historic election successes, the new dailies quickly represented Labour as a potentially damaging and dangerous new political entity. For example, the Mail editorial cited above appears to associate Labour with forces of chaos, from the metaphorical ‘hurricane’ to its later outlining of socialism’s radical stance against competition. The final line of the quotation communicates the potentially ruinous damage of Labour’s anti–competitive nature; ‘if the British worker cannot compete, so much the worse for them!’[29]

The clear conclusion, that Labour’s position would restrict the competitiveness of British labour both at home and abroad, represents the party as potentially ruinous both for wider British society and for the very class of people it claimed to represent. This association between Labour and chaos was echoed in the Express on the same day as the Mail’s ‘hurricane’ editorial. Its own ‘Matters of Moment’ associated the victories of the Labour party with ‘wreckage’ wrought upon the status quo, with the party’s policies labelled as both ‘fairytales’ and ‘insidious poison’.[30] Again, the choice of language used in these editorials associates Labour with chaos, and with a negative impact on both the political system and those who may have voted, or may in future vote, for them.

These ideas of Labour–driven chaos would continue to be referenced throughout the rest of the election campaign, although the first days marked a high–point for both newspapers’ sense of panic. Their successes were frequently labelled as part of a ‘revolution’ or ‘upheaval’, which repeatedly suggested a link between the party and potential political unrest. This potentially damaging impact of the party was also applied to Labour itself, with the Mail speculating on a future Labour split between the small pro–Liberal section of new MPs and the majority who ‘do not trust Liberals’ and whose ideological extremism threatened an irreparable split between the two factions; ‘[Labour radicals think] it better that every Labour candidate [loses] than that the cause should be degraded or obscured by weak MPs’.[31] While no other article considered the self–divisive nature of Labour’s emergence in 1906, this added to a broader representation from both new dailies that presented Labour as an unstable party, both within the wider climate of Westminster and, potentially, within its own ranks.

Another persistent representation of Labour’s chaotic nature came from both papers’ repeated association between Labour and the Liberal Party. Returning to the initial responses of both dailies, the ‘hurricanes’ and ‘wreckage’ wrought on the election were attributed to both Labour and the Liberals. The Mail’s editorial on the sixteenth asserted the link between the two anti–Unionist parties by claiming that some Liberal candidates ‘are indistinguishable from Communists or extreme Socialists’,[32] while the Express also drew an immediate link between Labour and the Liberals, first saying the latter were ‘aided and abetted’ by the former, and that together they were a threat to the Unionists.[33] These initial links drawn between the two parties are particularly fierce compared to the rest of the coverage, but were the first of several instances where Labour is represented directly, and negatively, in relation to its union with the Liberals.

Throughout the rest of the two newspapers’ election coverage, representations of Labour’s association with the Liberals seemed to be primarily focused on the former’s potentially damaging impact on the latter. For example, accusations of Liberalism’s pandering to Labour interests implied that the Liberals could end up regretting their partnership with the new socialists. The Mail, for instance, alluded to the idea that Labour were the real power, and that elected Liberals were ‘merely delegates’ of Labour and their trade union allies.[34] The fear of a trojan–horse socialist incursion into the Liberals was continued later in the election as both Labour and Liberal victories kept growing, with a prophetic editorial stating that the upcoming Parliament’s true struggle would be ‘between Socialism and Protection’,[35] thus presenting Labour as the real force in any future non–Unionist government.

The Express shared a similar opinion of the two parties’ relationship, arguing that Labour, not Liberalism, would play the greater role in a future government and that a ‘solid phalanx’ of Labour members had ‘forced their way into the Liberal ranks’.[36] From evocative portrayals of a militarized Labour infiltrating Liberal ranks to the neo–criminal language of ‘aided and abetted’, the representations in both newspapers showed Labour to be just as damaging to their Liberal allies as to their Unionist opponents. This idea would continue to be explored throughout the election in both newspapers, with the ‘menace’ of Labour and their socialist policies frequently being associated with the eventual election–winning Liberals. Examples include a particularly dismissive note in the Mail declaring that ‘oil and vinegar would readily mix than the ideals of [Labour MP] Philip Snowden’ and the Liberals[37], as well as updated summaries of the new Commons numbers, with Liberal and Labour MPs combined (along with the Irish Nationalists) into a ‘Parliamentary’ column against the Unionists.

Perhaps unsurprisingly, the initial shock shown in the new dailies’ representations of the emergent Labour successes in 1906 quickly developed an antagonistic element. Given that both were leading press supporters of the Conservatives, it is to be expected that aspects of their coverage represented Labour in variously negative ways. What was remarkable was the speed of the transition from coverage of Labour’s minor oddities to its newfound revolutionary, negative impact on British politics, its supporters and its Liberal allies.

Both the Mail and the Express were undeniably hostile towards Labour after their growth in influence during the 1906 election, and in this regard Labour were justified in the hostility they would, in turn, show to these particularly popular daily newspapers. However, the hostile representations were one of several ways in which these newspapers represented the party after its surge in the polls during the 1906 election. The hostility was noticeable, but generally subsided to reveal a more complex portrayal of the party which showed an interest in, and indeed levels of appreciation for, their membership and parts of their political message.


‘A most salutary influence’

Labour’s surge in popularity in its second–ever contested election was met with some hostile words from both the Express and the Mail. Interestingly, however, the majority of the negative representations of the party focused on its potentially damaging impact within the narrow confines of the House of Commons: on Parliament itself, on its Liberal allies, or on the Labour party’s own unity. Very little coverage in either newspaper focused on the potentially negative impact of Labour on the everyday British public, besides the initial fear over the party’s position ‘against competition’ and a brief mention in the Mail’s early editorial of the party’s attitude towards public houses and a supposed plan to ban betting news inside pubs.[38] Conversely, the representations of Labour and its impact on British life outside of Westminster were broadly positive.

After the early outrage shown in both of the newspapers’ early editorials, the Mail and the Express shifted to positively representing an aspect of Labour’s emergence: the increased representation of the working classes. The day after their ‘insidious poison’ editorial, the Express ran another editorial dedicated to Labour, appreciating that ‘it is right and proper’ that the working classes had direct representation in Parliament and that Labour were well–placed to best voice their interests:


every class of the community should be represented in Parliament . . . we have more in the Labour men than to believe that they would permit themselves to degenerate into mere money–making politicians.[39]


The appreciation of working–class representation in the Commons was twinned with a portrayal of the new Labour members as people who would honestly work for them in Parliament, undistracted by other potential perks of the role in the House of Commons. A very similar sentiment was shown in the Mail’s Outlook the next day. While the newspaper’s opinion on Labour’s future plans (‘whether for good or evil remains to be seen’) created a certain degree of doubt, it agreed with the Express on matters of representation and the honesty of the new members;


It cannot be suggested that labour will be unduly represented . . . [many elected] have been bona–fide working-men… frankly, we much prefer these workers to a good many, who [hitherto] used the House of Commons as a road to money–making.[40]


Across both newspapers, Labour was represented as a positive influence both for the wider electorate and for the moral fabric of the Commons. While occasionally appearing alongside sentiments expressing mistrust or outright antagonism to the party, there was a shared understanding of Labour as a collection of politicians who would represent the British lower classes better, and less corruptly, than any other political group striving for their support. Admittedly, this more positive aspect of the party’s portrayals in the new daily press did not ever become a full endorsement, as high levels of mistrust were also associated with the party’s wider plans for the future of Parliament’s stability and the industrial way of life. It was, however, an undoubted acceptance, or possibly even a degree of admiration, of some of the party’s potential positives.


‘Gone is the Club’

As a collective party, Labour was represented in complex ways to the readers of the new dailies. Praise of their honesty and of the overdue and deserved arrival of working-class representatives in Parliament was counter-balanced by persistent descriptions of the party as a disruptive force to their parliamentary colleagues and the British political tradition. Interestingly, however, the majority of the coverage of the Labour Party in the Mail and the Express was not dedicated to the party itself. The most frequently occurring representations of Labour in the 1906 election focused on individual members: the MPs, old and new, whose collective integrity both newspapers positively represented.

The most noticeable focus in the new dailies was an interest in the employment backgrounds of Labour MPs. This manifested itself in sections in both newspapers that detailed members of the House: short descriptions of sitting MPs, challengers and the newly–elected. To understand the curious uniformity of the two papers’ profiles of Labour politicians, it is important to note the diversity of terms through which both Liberal and Conservative politicians were discussed in the same articles. For example, on January the seventeenth, the Mail ran a ‘Who’s Who’ column, providing brief details of a host of new faces in Parliament. The ways in which Liberal or Unionist politicians were described varied considerably; ‘forty-two years of age’, ‘an architect’, ‘a Londoner by birth and education’, ‘a Tariff Reformer’, ‘was born in 1845’, ‘a Fellow and lecturer of Merton College, Oxford’.[41]

The key words or phrases used to define Liberal or Unionist candidates differed from person to person: age, education, upbringing, employment and particular political beliefs were all used to describe them. In stark contrast, Labour candidates or returned MPs were principally defined by their engagement in hard physical labour, very often with reference to their early beginnings in those trades. The Mail’s summaries from mid–January contained, among others, the following Labour returns;


Mr. Enoch Edwards, after a defeat at last election, has gained Hanley for the Labour Party. He is fifty-four years of age. He entered a colliery aged nine . . .

Mr. George Wardle, Labour member for Stockport, worked in a factory from the age of eight and became a clerk on the Midland Railway when fifteen.

Mr. Charles Duncan, the new Labour representative for Barrow-in-Furness, is an engineer and trade-union organizer

Mr. W. C. Steadman (Central Finsbury) is a Labour member . . . a barge builder by trade

Mr. Thomas Glover, St Helens Labour representative . . . At nine years of age he was working in the mines.[42]


Where Liberals or Unionists were defined as much by education and politics as by their employment history, Labour politicians were primarily represented as politicians defined by their connections to industrial labour. The Express, on the same day, compounded this representation of the same Labour members as people defined by their pasts in hard employment in its ‘Who’s Who’ equivalent, ‘The Polling’;


Finsbury Central, W. C. Steadman . . . apprenticed in the barge-building trade

Barrow-in-Furness: Charles Duncan . . . apprenticed to the engineering trade

Birkenhead: Henry Vivian . . . a carpenter and joiner by trade

Hanley, E. Edwards . . . at nine entered colliery.[43]


This attention to the manual employment backgrounds of Labour politicians was repeated throughout the election;


Summertail: son of a miner, started work as grocer.[44]

N. Barnes: apprenticed as an engineer.[45]

R. Clynes: cotton-factory boy.[46]

Crooks (Woolwich): has been a workhouse lad.[47]

Seddon (Newton): apprenticed to the grocer trade.[48]


The difference between Labour and non–Labour members is starkest where the briefest of summaries were printed side by side after a double election in Sunderland returned a Liberal and a Labour candidate, describing the former as a Fellow of Trinity College and the latter as having ‘started work at seven’.[49]

The potential reasons behind the consistent identification of Labour candidates by their industrial backgrounds are varied. On the one hand, there was the reality that the vast majority of Labour politicians did not have the same lavish educational or professional backgrounds often cited in descriptions of Liberal or Unionist candidates. This reality, however, cannot adequately explain the curious consistency with which both newspapers categorized Labour politicians by their labouring pasts, as non–Labour candidates sharing significant traits (for example, an excellent university education) were not treated with the same uniformity. It is possible that the new dailies’ fixation on the pasts of Labour members was an extension of the representations of individuals from 1900, which highlighted curious eccentricities of the likes of Keir Hardie. In place of ‘Queer Hardie’, there was a consistent interest in MPs with pasts in manual labour. Edwardian Britain’s Parliament was populated largely with members of the higher classes: peers, newspaper proprietors, industrialists, and lawyers.[50] Therefore, an influx of men who had worked in coal mines as children represented a curious break from the norm, a quirk of tradition that made these new members stand out from the rest. By consistently highlighting working pasts, the new dailies were partly continuing this image of Labour as a curious new phenomenon, potentially intended to provoke a wry, almost amused response from readers.

Another potential interpretation of the new dailies’ representations of Labour members as people defined by their pasts is that it shows considerable admiration of their emergence onto the political scene. These men, some of whom had to go to work from as young as seven, had now entered into the elite of British political life against considerable personal odds. Their individual stories represented triumphs over adversity; proverbial rags–to–riches narratives that correlated with the new dailies’ broader interest in emotive, human-interest news content that appealed to their mass, lower–class audiences. Rather than, or as well as, being a representation of curious backgrounds for British parliamentarians, these newspapers’ focus on employment pasts presented Labour members as everyday success stories to be respected and admired.

This latter interpretation is further supported by the fact that both newspapers dedicated longer profile articles to particular Labour politicians, which explicitly championed their rise from difficult upbringings. In the Mail, the article ‘A New Style Labour Member’ focused on the new West Ham MP Will Thorne. Much was made of his journey from relative poverty to the Commons, and he was positively shown to have worked his way from the bottom to the top;


Seventeen years ago . . . a day labourer. Today, he is a member of Parliament.

Proved himself a born captain . . .

Born to misery . . . (parents) brickfield workers . . . endured the burden of toil.[51]


His transformation from the ‘urban slums’ to a ‘representative of starvation’ is shown to be something to be admired, despite the article’s explanation that his life had led to him becoming ‘a Socialist of the most extreme type’. Indeed, in this context, the Labour man’s radical politics are presented as an understandable, if not agreeable, response to his personal history.[52] His past is a story of respectable, positive success, even though his politics stood wholly against those of the two newspapers.

The Express shared this positive depiction of Labour members and their industrial pasts with its ‘Romance of Labour’, a story about J. T. Macpherson who, having ‘served as a boy at sea’, had become an MP after his union had helped him pay his way through a degree at Ruskin College, Oxford.[53] Again, the ‘romance’ comes from an individual who had reached Parliament, via one of the world’s best universities, having started life as a child labourer. He, like other Labour MPs, was represented as a personal success story. His journey was chronicled quite succinctly in the same newspaper a few days later;


At twelve, cabin boy.

At eighteen, Middlesbrough steel smelter

At twenty-one, founder of Steel Smelters Society

At thirty-two, Oxford Graduate and MP.[54]


When discussed in the new dailies as a collective, Labour politicians were categorised as honest and potentially simple characters who would do their best to represent working people. When discussed as individuals, Labour was represented as a group deserving of respect and interest due to their shared pasts of overcoming hardship to enter Parliament. Often with reference to their histories working as children, Labour politicians were represented most strikingly as successes of hard work against personal adversity, to the point where disagreeable politics were contextualised and possibly even appreciated. Labour, both as a party and as a group of people, was shown by the Mail and the Express to be a fresh addition to political life that carried with it an emotive, positive story of triumphing against difficult beginnings.


‘What Labour Wants’

In contrast to their treatment of its broad political aims, the new dailies represented Labour’s politicians as broadly positive additions to the British political system. On occasion, the emphasis on personal triumphs over difficult starts in life was used as understandable context for any radical politics they might fight for in a future Parliament. This appreciation of the potential roots of socialism was not unique to profiles of individual MPs. Indeed, both the Express and the Mail dedicated significant coverage during the 1906 election to representing Labour, and socialism more broadly, as a cause driven by righteous discontent with existing realities of British life.

The most notable example of this came in the Daily Mail and its long two–part article ‘What Labour Wants’, written by a Mr. Bart Kennedy. Published on the seventeenth and eighteenth of January, its stated wish was to explore what the working man wanted, drawing on a series of interviews with ‘hard, strong–faced men of labour’ who, after everything, wanted nothing but ‘to live’. In its retelling of their stories, it paints an evocative picture of a horrific, lower–class existence;


[these men] did the dread work in the blackness of the earth… starving with their wives and family on a few shillings strike pay. Wives suckle their babies from their almost dry breasts.

Treated worse than the beasts in the fields.

Their wrongs cry out, no voice, no pen can fully put their case.[55]


In addition to these dramatic representations of suffering workers, Bart Kennedy portrays the owners of these businesses as nothing less than villains;


The people who own the mines have gradually pressed them [the labourers] down below the bare living point.[56]

. . . making the worker produce more wealth than it ever did before, and at the same time it is giving him less in proportion for his labour

You (the owner) are going on in a way that will bring England down about our ears.[57]


This extraordinary account of striking workers and profit–driven owners vividly represents an unsustainable divide between the richer and poorer elements of British society. Taken in the context of the broader coverage of the party and its members, it articulates the cause of the Labour party as one entirely justified by the current conditions facing workers. One of the party’s principal aims — to fight for better conditions for workers — is one that would directly tackle the ‘evil’ shown so evocatively in this article.

Interestingly, however, the second part of this article concludes that ‘evil though the present system, it is better than it would be under Socialism’. This conclusion is sound, asserts the writer, because the current evil lies in the haplessness of authority, which would only increase under a socialist government. This conclusion, while strikingly brief in the context of the longer two–part article, correlates with the broader attitudes shown across the two newspapers towards Labour’s political ambitions. Labour and socialism are never shown positively; they are frequently associated with instability and neo–revolutionary disorder. What is interesting, though, is that these two newspapers, which clearly and consistently represented Unionist politics as the best course of action, also represented the conditions that Labour’s politics sought to address as a significant concern to their readers. The newspapers did not represent Labour’s motivations negatively and at times actively agreed with them on issues that politics needed to address. The party’s solution was not represented positively; their intentions often were.

This balance between the rejection and appreciation of Labour’s political aims was particularly pronounced in the Mail. For example, the twenty–third of January saw a column in the Mail written by recently–elected Labour MP Philip Snowden, in which he focuses on the party’s aim to ‘transfer large profits from private pockets to public utility… (and) enable better conditions to be given to the workers’.[58] On the one hand, sub–headings stating that Labour is a party that will ‘Tax the Very Rich’ and instigate ‘The Overthrow of Capitalism’ suggest the potentially revolutionary intentions of Labour, but this is countered by Snowden’s assertions that any future policy would be ‘not quite so blood–curdling as it sounds’. It is interesting that the input of the newspaper — the sub–headings — often contrasts with the actual content of Snowden’s writing; it is the heading, and not the Labour MP, that mentions anything tangibly resembling an attempt to overthrow the existing capitalist system. This article, like the Kennedy article, touches upon the struggle between wealthy owners and poor workers, and represents Labour as a party fighting against an undisputed wrong. At the same time, particularly through the sub–headings, the more positive representation of Labour’s motivations is countered with language portraying the party as a force of revolutionary harm.

The Express also echoed these same sentiments, though less frequently than its rival. Most notably, on the nineteenth of January, an editorial discussed ‘Labour on its Trial’ and the ‘colossal experiment’ of a socialist party in Britain. It, in contrast to the evocative longer reads in the Mail, represents the duality of Labour’s politics very concisely;


we say, give Labour its chance. If it succeeds, well, good.

If it fails, ________![59]


That brief editorial summary gets to the crux of the curious complexity at the heart of the representations of Labour’s politics. The party had won its place in the Commons. Now, it was time to see how it planned to solve issues that were of undeniable concern to British society. If its solutions proved a success, then it would be of benefit to all: in particular, to the many people who identified with the imagined ‘man in the street’ sought by political parties, the mass press, and the surrounding popular culture of the period. However, as demonstrated by the concluding pause, it was clear that any Labour success, according to these newspapers, was both undesirable and rather unlikely.

This dichotomy teases out the fascinating and often contradictory place of Labour within the new dailies: two fundamental and widely consumed components of the election culture of early twentieth–century Britain. This new political party was, for many, a hostile and radical entity that clashed with much of the political and popular cultures into which it entered. However, its perceived connections to the everyman who was such a dominant part of those same two overlapping cultures meant that, alongside the hostility, there was also considerable admiration and support shown by the new dailies toward this ‘chaotic’ new addition to the electoral landscape of Long Edwardian Britain. While it would take until 1912 for Labour to have a mass daily newspaper of its own, the party had already provoked a diverse and contested presence within Britain’s most popular daily newspapers during its emergent years as a political party.




Primary Sources:

Daily Mail: 26th September – 24th October 1900; 12th January – 8th February 1906

Daily Express: 26th September – 24th October 1900; 12th January – 8th February 1906


Secondary Reading:

Beers, L. Your Britain: Media and the Making of the Labour Party (Cambridge, MA, 2010).

Bingham, A. and Conboy, M. Tabloid Century: The Popular Press in Britain, 1896 to the Present (Oxford, 2015).

Blaxill, L. ‘Joseph Chamberlain and the Third Reform Act: A Reassessment of the “Unauthorized Programme” of 1885’. Journal of British Studies 54/1 (2015), pp. 88–117.

______. The War of Words: The Language of British Elections, 1880-1914 (Woodbridge, 2020).

______. ‘Electioneering, the Third Reform Act, and Political Change in the 1880s*’. Parliamentary History 30/3 (2011), pp. 343–73.

Brodie, M. The Politics of the Poor: The East End of London, 1885-1914 (Oxford, 2004).

Butler, D. and Butler, G. British Political Facts, 10th ed. (Basingstoke, 2010).

Conboy, M. The Press and Popular Culture (London, 2002).

Hopkins, D. ‘The socialist press in Britain, 1890-1910’ in Curran, J., Boyce, G. and Wingate, P. (eds.), Newspaper History from the seventeenth century to the present day (London, 1978), pp. 265-280.

Koss, S. The Rise and Fall of the Political Press in Britain V. 2 (London, 1984).

Lawrence, J. Electing Our Masters: The Hustings in British Politics from Hogarth to Blair (Oxford, 2009).

Rix, K. ‘“The Elimination of Corrupt Practices in British Elections”? Reassessing the Impact of the 1883 Corrupt Practices Act’. The English Historical Review CXXIII/500 (2008), pp. 65–97.

Shannon, R. The Age of Salisbury, 1881-1902: Unionism and Empire (London, 1996).

Shoop-Worrall, C. ‘Politics and the Mass Press in Long Edwardian Britain 1896-1914’. (unpublished PhD thesis, University of Sheffield, 2019).

Thomas, J. A. The House of Commons 1906-1911 (Cardiff, 1958).

Thompson, J. British Political Culture and the Idea of ‘Public Opinion’, 1867-1914 (Cambridge, 2013).

Vessey, D. ‘Words as Well as Deeds: The Popular Press and Suffragette Hunger Strikes in Edwardian Britain’, Twentieth Century British History, 32/1 (2021), pp. 68–92.

Waller, P. J. and Thompson, A. F. Politics and Social Change in Modern Britain: Essays Presented to A.F. Thompson (Brighton, 1987).

Waters, C. British Socialists and the Politics of Popular Culture, 1884-1914 (Manchester, 1990).

Windscheffel, A. Popular Conservatism in Imperial London, 1868-1906 (London, 2007).



[1] Throughout this paper, the word ‘Labour’ will be used to refer both to the party and, at times, to the wider movement to which the party remained connected. It is noted by the author, however, that they existed as the Labour Representation Committee (LRC) during the general election of 1900.

[2] L. Blaxill, ‘Joseph Chamberlain and the Third Reform Act: A Reassessment of the “Unauthorized Programme” of 1885’, Journal of British Studies 54/01 (2015), pp. 88–117; L. Blaxill, ‘Electioneering, the Third Reform Act, and Political Change in the 1880s’, Parliamentary History 30/3 (2011), pp. 343–73; M. Brodie, The Politics of the Poor: The East End of London, 1885-1914 (Oxford, 2004); P. J. Waller and A. F. Thompson, Politics and Social Change in Modern Britain: Essays Presented to A.F. Thompson (Brighton, 1987), p. 36; K. Rix, ‘“The Elimination of Corrupt Practices in British Elections”? Reassessing the Impact of the 1883 Corrupt Practices Act’, The English Historical Review CXXIII/500 (2008), pp. 65–97; R. Shannon, The Age of Salisbury, 1881-1902: Unionism and Empire (London, 1996).

[3] L. Blaxill, The War of Words: The Language of British Elections, 1880-1914 (Woodbridge, 2020); A. Windscheffel, Popular Conservatism in Imperial London, 1868-1906 (London, 2007).

[4] M. Conboy, The Press and Popular Culture (London, 2002), p. 95.

[5] A. Bingham and M. Conboy, Tabloid Century: The Popular Press in Britain, 1896 to the Present (Oxford, 2015), pp. 7–9.

[6] For more on the broader importance of newspapers, see J. Lawrence, Electing Our Masters: The Hustings in British Politics from Hogarth to Blair (Oxford, 2009), p. 78; J. Thompson, British Political Culture and the Idea of ‘Public Opinion’, 1867-1914 (Cambridge, 2013), p. 25; Windscheffel, Popular Conservatism in Imperial London, 1868-1906, pp. 26-7.

[7] C. Shoop-Worrall, ‘Politics and the Mass Press in Long Edwardian Britain 1896-1914’ (unpublished PhD thesis, University of Sheffield, 2019).

[8] S. Koss, The Rise and Fall of the Political Press in Britain, vol. 2 (London, 1984), pp. 15–53.

[9] D. Vessey, ‘Words as Well as Deeds: The Popular Press and Suffragette Hunger Strikes in Edwardian Britain’, Twentieth Century British History 32/1 (2021), pp. 68–92.

[10] Shoop-Worrall, ‘Politics and the Mass Press in Long Edwardian Britain 1896-1914’, pp. 180–200.

[11] See L. Beers, Your Britain: Media and the Making of the Labour Party (Cambridge, MA, 2010).

[12] D. Hopkins, “The socialist press in Britain, 1890-1910” in J. Curran, G. Boyce and P. Wingate (eds.), Newspaper History from the seventeenth century to the present day (London, 1978), pp. 265-280; C. Waters, British Socialists and the Politics of Popular Culture, 1884-1914 (Manchester, 1990).

[13] See Bibliography

[14] D. Butler and G. Butler, British Political Facts, 10th ed. (Basingstoke, 2010).

[15] Bingham and Conboy, Tabloid Century, p. 26.

[16] ‘Campaign Items’, Daily Mail 27/09/1900.

[17] ‘Who’s Who in the Election’, Daily Mail 5 October 1900, p. 3.

[18] ‘Who’s Who in the Election’.

[19] ‘Who’s Who in the Election’.

[20] ‘Labour Members and Mr. Chamberlain’, Daily Express 1 October 1900, p. 1

[21] ‘The Working Man’s Vote’, Daily Express 11 October 1900, p. 6.

[22] ‘Woolwich’, Daily Mail 12 January 1906, p. 3.

[23] ‘The Socialist Countess’, Daily Express 12 January 1906, p. 5.

[24] ‘Labour Successes’, Daily Mail 15 January 1906, p. 7.

[25] ‘Election Race by Motor-Car’, Daily Express 15 January 1906, p. 1.

[26] Daily Express 16 January 1906, p. 4.

[27] Ibid, p. 1.

[28] ‘The Outlook: The Rise of Labour’, Daily Mail 16 January 1906, p. 6.

[29] Ibid.

[30] Express, 16 January, p. 4.

[31] ‘The Coming Troubles of the Labour Party’, Daily Mail 31 January 1906, p. 6.

[32] ‘Rise of Labour’, Mail, p. 6.

[33] Express, 16 January, p. 4.

[34] ‘The Outlook: Revolution of 1906’, Daily Mail 18 January 1906, p.6.

[35] ‘The Outlook’, Daily Mail 22 January 1906, p. 6.

[36] ‘Solid Labour Phalanx’, Daily Express 18 January 1906, p. 5.

[37] ‘The Outlook: Hushing it up’, Daily Mail 23 January 1906, p. 6.

[38] ‘The Outlook’, Daily Mail 7 February 1906, p. 6.

[39] ‘Matters of Moment: Labour and Liberalism’, Daily Express 17 January 1906, p. 4.

[40] ‘Revolution of 1906’, Mail, p. 6.

[41] ‘Who’s Who in the New House’, Daily Mail 17 January 1906, p. 7.

[42] Ibid.

[43] ‘The Polling’, Daily Express 17 January 1906, p. 1.

[44] ‘Who’s Who’, Daily Mail 19 January 1906, p. 7.

[45] ‘The Polling’, Daily Express 19 January 1906, p. 1.

[46] ‘Labour Successes’, Daily Mail 15 January 1906, p. 7.

[47] ‘The Polling’, Daily Express 18 January 1906, p. 1.

[48] ‘Who’s Who’, Daily Mail 25 January 1906, p. 4.

[49] ‘The Polling’, Daily Express 19 January 1906, p. 1.

[50] J. A. Thomas, The House of Commons 1906-1911 (Cardiff, 1958).

[51] ‘A New Style Labour Member’, Daily Mail 19 January 1906, p. 6.

[52] This would not be unique to the two papers’ coverage of Labour, as the broader issue of British socialism was discussed in dedicated articles elsewhere in the election coverage (See ‘What Labour Wants’).

[53] ‘Romance of Labour’, Daily Express 20 January 1906, p. 1.

[54] ‘Labour MP’s Romance’, Daily Express 23 January 1906, p. 5.

[55] ‘What Labour Wants’, Daily Mail 17 January 1906, p. 6.

[56] Ibid.

[57] ‘What Labour Wants (Part II)’, Daily Mail 18 January 1906, p. 6.

[58] ‘The People’s Party: Which Will Tax the Very Rich’, Daily Mail 23 January 1906, p. 6.

[59] ‘Matters of Moment: Labour on its Trial’, Daily Express 19 January 1906, p. 4.

Iranian Cinema and the New Woman: The Islamic Revolution’s Impact on Female Agency in Film


This article examines how the Iranian regime, politics, and religion shaped the presence and roles of women in film. In the monarchical Pahlavi era, film followed early twentieth-century Western archetypes, marginalizing women into the binary role of virgin or whore. Despite the misogynistic undertones of the Islamic Revolution, the “New Woman”, created in the image of Fatima, gave birth to honorific and deep roles for women on screen and within the industry, creating more agency for women in culture. In a complex balance between censorship and release valves, the Iranian government has allowed the film industry to deviate from its prescribed state stance on women’s rights, patriarchal authority, and female involvement. This article identifies a new genre of Iranian film, feminist realism, which is characterized by strong female performances and plotlines involving discussions of contemporary women’s issues. Feminist realism has made film an important outlet for cultural commentary and debate in Iran and has attracted international acclaim, particularly for the works of directors Asghar Farhadi and Dariush Mehrjui.

Keywords: Cinema, Cultural history, Feminism, Film, Iran, Islamic Revolution, Middle East.

Author Biography

Sophia Hernandez Tragesser is an undergraduate at the University of St. Thomas in Minnesota, USA. She studies history and theology with particular interest in the modern Middle East, nineteenth-century African American history, and Latin America. She would like to thank the Luann Dummer Center for Women for generously funding her research and Dr. Shaherzad Ahmadi for her guidance and support, without whom this paper would not be possible.

Iranian Cinema and the New Woman: The Islamic Revolution’s Impact on Female Agency in Film

Download PDF

One of the core concepts at the heart of the intellectual politics of the 1979 Iranian Islamic Revolution was a rejection of the nation’s recent past under the ruling Pahlavi Dynasty and a hostility to cultural and social embodiments of ‘the West’. The Islamists’ rejection of Westernisation condemned not only the idea of Western morals and systems in situ, but also many aspects of urban and elite Iranian culture which followed American trends in fashion, beauty, and entertainment.[1] This tension between Western sociocultural trends and the Islamist ideals espoused by Ayatollah Khomeini and the revolutionaries culminated in a war over the ‘woman question’: who is the Iranian woman and what is her place in a theocratic Iran?[2] This crisis of state-women relations in the Islamic Republic was rooted in modern Pahlavi Iran’s cultural and political struggle to adequately address the same question in the preceding decades. Following the example of Turkey earlier in the twentieth century, Pahlavi monarch Reza Shah sought to reform gender relations in 1930s Iran along Western lines.[3] These measures included banning the veil, encouraging co-educational public schooling, and promoting women’s suffrage.[4] Encouraging a Western understanding of gender and public politics served several ends. The first, as embodied in the Shah’s White Revolution of 1963, was to ‘modernize social and economic relations in order to build the nation state.’[5] By normalising male-female relations in social and political spheres and integrating women into the workforce, the Shah hoped to mirror the commercial success of the West.

The increased presence of women in the public sphere prompted the development of political, religious, and social women’s organisations in the 1930s.[6] In the 1950s, however, state-led repression of these organisations resulted in the dissolution of many of these groups and any remaining organisations for women were taken under central control by the government, often under the direct jurisdiction of the Shah’s sister, Ashraf Pahlavi.[7] The integration of the women’s movement into the state allowed the Shah to stifle discontent while making token gestures of progressive reform. This process represented a release valve of political activism for large numbers of women while also allowing the government to maintain bureaucratic control over the activities of many, potentially subversive, organisations.

After the Revolution, the Islamic Republic seized control over film content, production, and development. Just as the Pahlavi control over women’s socio-political activities enabled activism without substantially threatening the state, controls over film enabled the Islamic government to dictate film content while leaving room for subtle, contained dissent on political issues. As the state could easily cease production on a particular film, those which challenged Islam or the Islamic Republic could quickly be shut down without causing significant damage. Consequently, film had the latitude to examine what womanhood meant in Iran and to diverge from the official state policy on women’s rights and patriarchal authority. This relationship between censorship, the state, and the film industry has enabled twenty-first-century Iranian cinema to become a significant battlefield where debates over women’s roles in Iranian society are fought out. After the Islamic Revolution of 1979, Iranian cinema provided a forum where a ‘new woman’ could be debated, constructed, and represented.

The popular rejection of the Western woman of the Pahlavi era, known critically as a ‘painted doll’, in the aftermath of the Islamic Revolution led to the construction of a new female character-type in Iranian cinema. This redefinition along Islamist lines created an ideal characterised by piety, intelligence, and motherhood which then permeated wider society and culture. Central roles in Iranian films were now occupied by women with greater emotional depth than their Pahlavi predecessors, and plotlines often centred on routine lifestyles and relationships. On top of this, censorship instituted by the Islamic Republic served to phase out previously male-centric content and plots, especially those with extreme violence and sex. Out of necessity, content shifted focus to relationships, daily life, and cultural identity, which naturally revolved around women. The focus on women’s stories and female characters created new agency for women both in film and in the broader film industry. This agency is visible in films from the 1990s to the 2010s which exhibit strong female roles and criticism of patriarchal and misogynistic aspects of Iranian society, and frequently involve female actors and directors.

These films, it is argued here, constitute a new genre of Iranian cinema: feminist realism. Feminist realism is characterised by strong female performances and plotlines involving discussions of contemporary women’s issues. Feminist realism diverges in significant ways from Western feminism. Rather than blatantly pushing the envelope of gender and modesty norms, Iranian feminist realism addresses questions of female identity and agency through mundane domestic plots driven by female action (and at times inaction) and consequently reveals important truths about the nature of womanhood in everyday Iran. As a result of the prominence of feminist realism, cinema has become a place for critical commentary and resistance against aspects of the Islamic Republic which restrict women. Despite the state’s active silencing of social criticism and women’s organisations, this genre of Iranian cinema has reached an international audience to great acclaim. Contrary to popular opinion, therefore, a more complex and nuanced portrayal of women in Iranian cinema did not accompany the modernisation of the Pahlavi period but only emerged after the Islamic Revolution of 1979. This article will begin by discussing the relationship between cinema and modernisation in Iran. From there, it will investigate the impact of twentieth-century Islamic philosophy and the Islamic Revolution on the role of women on screen and in the film industry. Lastly, this article will discuss cinema, particularly post-revolutionary cinema, as a space for feminist criticism of Iranian governance and society.


Iran’s Constitutional Revolution

The nature of modernity loomed large in the cultural and intellectual politics of early-twentieth-century Iran. The Constitutional Revolution of 1905–1909, albeit short-lived, created a parliament and restricted the power of the monarch. At the forefront of this modernising project was an attempt to create an Iranian national identity largely based on the idea of a shared Persian history.[8] Similarly, notions of gender during the Constitutional Revolution were underpinned by distinctly Persian interpretations of the role of men and women in society. The dominant discourse of gender during this period has been extensively discussed by historian and gender theorist Afsaneh Najmabadi.[9] In particular, Najmabadi highlights the practice, common in rural Iran, of using women and girls as a form of tribute payment to neighbouring villages.[10] The political climate of the Constitutional Revolution, however, encouraged fierce debate over this practice and prompted larger political and cultural discussions over the government’s role in protecting women. Of particular importance was the case of the ‘Daughters of Quchan’, a group of about 250 girls from the district of Quchan who were kidnapped and sold by the local government in lieu of tax. During the Constitutional era, Najmabadi argues, the ‘Daughters of Quchan’ became symbols of the national homeland, and their loss of autonomy was considered a sin both against the girls as individuals and against the broader notion of an Iranian nation.[11] Those who sold and bought the girls were portrayed as savage tribes who compromised Iran’s borders with Russia and exposed the government’s inability to protect the nation.

This national issue popularised the idea that women and girls should be protected from sexual insult and objectification as tribute. In order to protect women from these unacceptable tribal traditions, a strong and centralised government was deemed necessary to create a modern, non-tribal authority and to standardise the social and political treatment of women. This shift in popular opinion, away from tribal organisation and practices and toward a centralised modern state, set Iran on a Western path of nation-state development. However, neither a complete rejection of tribalism nor a full acceptance of the nation-state as a superior political body came to fruition for several decades.


Reza Shah and the Modernisation of Iran

Although the Qajar dynasty and its brief experiment with a constitutional monarchy were brought to an end in 1925 by the ascent of the Pahlavi dynasty, the question of Iran’s modernity remained central. Pahlavi monarch Reza Shah launched aggressive modernisation efforts which not only encouraged the tentative development of a distinct ‘national’ identity, but also instituted technological leaps such as railways and radio broadcasting, which contributed to urbanisation and population growth.[12] This development primed the country to receive, and soon produce, cinema, which showcased and reinforced this nascent Iranian nationalism.

In 1924, Merian C. Cooper and Ernest B. Schoedsack, the American filmmakers who later produced King Kong, collaborated with journalist Marguerite Harrison on an ethnographic film following the migration of the Bakhtiari Tribe in Iran. The film, Grass: A Nation’s Battle for Life, captures the tribe’s seasonal trek from southern to central Iran, in addition to the filmmakers’ journey through Turkey to reach the ancient and unchanged ‘Forgotten People’.[13] In the tradition established by nineteenth-century Orientalist travellers, Grass is enamoured with the notion of an ‘ancient people’ at the heart of civilisation, struggling against nature to survive another migration. It presents the tribes as ‘noble savages’, living in a different historical time from that experienced in the West. The film won international acclaim for its cinematic beauty and its capture of the tribe’s passage across the Karun River and over the Zardeh Mountain.[14] The film reflected tribal life as it existed in Iran and demonstrated the central government’s inability to gain political dominance during the Constitutional Revolution. Grass portrayed the tribes in a dignified and valorised manner, a presentation which contradicted the narrative pushed by liberals in the Constitutional period and by Reza Shah. In Grass, the masculinity of the tribesmen is showcased through both physical feats and the life-and-death decision-making which the leaders must demonstrate throughout the migration. The women of the group are at the periphery and receive no specific attention. They are, however, presented as physically fit and capable, carrying large loads and contributing to the tribe’s migration. The incorporation of women in the tribe’s movements and their contribution to physical tasks sits in tension with the narrative of female vulnerability presented during the Constitutional Revolution and embodied in the ‘Daughters of Quchan’ incident.
Here women were identified as incapable of self-defence, vulnerable to the whims of men, and in need of government intervention for their protection. This story of tribal independence undermined the nationalist narrative that traditional ways of life threatened the national social fabric.

Grass, in its original form, was banned in Iran as it critically contradicted the Shah’s actions to unite the tribes and construct a modern Iranian identity. Opposing the film gave the Shah the opportunity to institute state controls over cinema and to assert his authority over cultural affairs. After the Shah was deposed in 1941, however, the film was edited with a Persian voice-over and became a point of national pride rather than of insult or alienation. Censorship during the Pahlavi period targeted scenes which challenged or mocked Islam as well as films with anti-state messages.[15] The government also implemented basic permit requirements for filming in public, specifically in religious or civic spaces.

Reza Shah used film to present his vision of a modern Iran and pushed back against the presentation of traditionalism in films such as Grass. A significant film in the early development of Iranian cinema and cultural modernisation was The Lor Girl (1933), also known as The Iran of Yesterday and the Iran of Today.[16] The Lor Girl was the first Persian talkie, produced by Ardeshir Irani and Abdolhossein Sepanta in Bombay.[17] In the film, Golnar (the Lor Girl), a young girl kidnapped by the Lor tribe of Western Iran, grows up and encounters a young man employed by the Iranian government, Jafar. The two intend to run away together when Gholi Khan, the leader of the bandit tribe, intercepts their plan and imprisons Jafar. Eventually they escape again, before an altercation with the remaining bandit gang members results in the death of several tribesmen. The two protagonists then flee to India and live there until they hear of Iran’s new government, which has restored law and order by crushing tribal power and supplanting it with a centralised state. The film explores themes such as modernity and gender, themes which remained prominent in Iranian cinema until the 1970s. In a sense, modernity, and by extension the idea of a central state, saved the Lor Girl and delivered Iran from the grips of backward tribes. The Lor Girl establishes the primacy of male agency and action in film. However, Jafar is not a masculine or capable figure until he reaches India. When in Iran, the male figures appear inept and aloof, whereas the Lor Girl is clever and competent. When the two arrive in India, however, Jafar becomes the leading figure, making decisions and taking action. The disordered gender roles in the first portion of the film are a consequence of tribalism and disappear when the setting changes to Zoroastrian India. The shift in gender relations based on setting speaks to the ‘correct’ cultural and political structures for social interaction.
Under orderly and structured governance, Jafar is able to become a man by taking up his responsibility to lead, while the Lor Girl relinquishes her more masculine qualities and assumes a more appropriate secondary role once in India. At the end of the film, after retreating to Bombay, the Lor Girl only returns to Iran when a new government has incapacitated the tribes and brought the nation into modernity. Given these themes, the film supported Reza Shah’s repression of tribalism and his attempts to unify the country into a modern, Persian ethno-state.

In the 1930s and ’40s, political tensions grew between the government and the clerical establishment. Reza Shah continued to embrace modern reforms which sought to further integrate women into industrial and social settings previously dominated by men. In addition to clerical resistance, rural and lower-class individuals resented the Shah’s mandate of Western dress and the forced integration of men and women in schools. Ultimately, however, sentiments of a strong and united Iran prospered. The narrative exemplified in The Lor Girl prevailed over that of Grass.


The Muhammad Reza Shah Era and Popular Cinema

Reza Shah was ousted by the British in 1941 and his son, Muhammad Reza Shah, ascended to power. A decade later, Iran’s Prime Minister, Mohammad Mossadeq, nationalised the oil industry, forcing the British out, and successfully defended the move when an international court declined to rule against Iran. Muhammad Reza continued his father’s modernisation efforts, bolstering educational opportunities and widening the civil service. In 1963, he launched the White Revolution, which forced land redistribution, deployed students in rural areas as educators, furthered centralised state power, and promoted women’s enfranchisement. These efforts disturbed the clerical establishment, from whom much of the redistributed land was taken, as well as rural farmers who disliked having modern, secular students appear in their villages to re-educate their children. Opposition to Muhammad Reza’s reforms manifested itself in organisations like the Freedom Movement, founded in 1961 to oppose the regime’s pursuit of Western values. Opposition political parties and actors were silenced and exiled, which gave rise to discontent throughout the nation.

Between 1936 and 1947 no films were produced in Iran. Economic hardship contributed to political unrest, notably the protests of 1935, which culminated in the massacre of several hundred people at the hands of government troops; together, these economic and political pressures halted production.[18] When commercial film production resumed in 1948, the Filmfarsi genre blossomed. Filmfarsi encompassed popular films which were typically melodramatic and involved Hollywood-style archetypes, often centred on a tough-guy trope. Filmfarsi actors quickly ascended to stardom and cinema began to dominate the national culture. Among film scholars, Filmfarsi marks a shift from film as a primarily artistic and artisanal medium to cinema as an industrialised and commercial product for popular consumption.[19] The masculinity espoused in Filmfarsi derived from the traditional Persian literary rogue figure: the luti.[20] In the nineteenth century this figure was portrayed as a gruff man living on the peripheries of society and operating under a traditional moralistic code, which at times required him to circumvent the law in the pursuit of vengeful justice. Representations of the luti were restricted during the Pahlavi period, in large part because the regime considered the figure to embody revolutionary tendencies.

Masud Kimiai’s 1969 film Qeysar confronts modernity and shifting gender roles in urban Tehran.[21] Title character Qeysar pursues the men who raped his sister (a crime which prompted her suicide) and killed his brother during a first revenge attempt, all while evading the inept police’s attempts to stop him. The film presents modernity as a war on women, only to be remedied by the return of masculinity in social and political structures. First, the virtuous women in the film, Qeysar’s sister Fatima and his mother, are weak characters with little agency, suffering under the modern state of gender relations. Fatima is raped while studying with a male classmate and her subsequent suicide triggers a sequence of events which results in the deaths of her first brother, Farman, and her mother, and in the potentially fatal stabbing of Qeysar. This plot is a clear attack on the Pahlavi desire to westernise women’s roles in Iranian society. While traditional gender roles would have kept Fatima at home safe with her mother, co-education, as established in the White Revolution, forced an already vulnerable young woman into an intimate position with an unrelated man who took advantage of her. Making matters worse, the modern police force is incapable both of protecting Fatima and of finding the perpetrators after the rape. The displacement of the traditional man’s role as avenger leaves Qeysar in the desperate position of having to avenge his sister in the urban landscape, under the radar of police or other witnesses.

Tough-guy films cast women in one of two ways: either as innocent, unwilling victims of modernity or as sinful and complicit products of a Westernised culture. The first group encompasses most women in Qeysar. The second category is occupied by the club singer and dancer Soheila, girlfriend of the rapist and murderer Mansour. Soheila’s first scene opens in a club where she sings in a revealing dress, full makeup, and pulled-up hair. In the almost seven-minute scene her highly suggestive dancing captivates the gaze of every man in the club, including the camera’s ‘male gaze’, reducing the character to her sexual attributes.

This virgin/whore or ‘pure/impure’ dynamic dominated Italian and Mediterranean film-making in this era and heavily influenced Iranian cinema. This binary forces female characters into two-dimensional, shallow stereotypes, fully defined by their virtue or their total abandonment of it.[22] The virgin and whore roles both lack agency: the pure characters were dependent on men for their livelihood, and the impure, though on the surface more independent, still relied on men’s willingness to pay for sexual services in order to survive. Within Iranian Pahlavi-era film, women’s roles conformed to these categories, leaving little agency for female characters within plots and stifling the careers of female actors. For women portraying virgins, the available roles tended to be brief and weak, depicting women as subservient to the tough-guy, powerless and pitiful when caught up in the film’s dramatic plot. Women taking on the whore role necessarily participated in degrading scenes in compromising clothing, captured with an extremely objectifying male gaze. This role encompasses the most liberal woman possible, with little regard for whom she exposes herself to or sleeps with. In later tough-guy films, which take a very critical view of Pahlavi society, the whore is used to depict the degradation of women under the influence of Western liberal modernity in Iran.


The Islamic Revolution: Islamology and Politics

Qeysar and other films of the late 1960s and 1970s stood on the front line of the cultural war between the Westernised Pahlavi elites and the clerical establishment, buttressed by large numbers of conservative Islamists in rural Iran. Across the Middle East the idea of Pan-Arabism as an alternative to the West dissipated following the Arab defeat in the 1967 Six Day War with Israel.

In the 1950s, the Egyptian intellectual Seyyid Qutb began publishing political philosophy grappling with the meaning of Islam in a Western-dominated world. Qutb highlighted the West’s moral bankruptcy but also identified corruption and decay within the modern Islamic community. Qutb invoked the notion of Jahiliyya – the age of ignorance before the Prophet’s earthly life – and sought to apply it to the present state of Islam.[23] At the centre of his proposals to reinvigorate Islam as an international force was the creation of an intellectual vanguard to repress clerical corruption and to democratise access to the Quran. Though Qutb’s solution to Islamic governance utilised conservative structures, he sought to propagate Islamism as a theocratic movement across the Middle East. He was to have particular success with this project in Iran.

The intellectual Ali Shariati was one of the most significant theoretical influences on the development of the Islamic Revolution. While Shariati stemmed from the same Islamist intellectual movement as Qutb, he took a more leftist, revolutionary approach to achieving Islamic governance. After teaching, Shariati pursued studies at Mashhad University and the Sorbonne in Paris where he studied Islam in conjunction with philosophy, economics, ethics, sociology, and politics.[24] Shariati participated in multiple protest movements against the Shah both at home and abroad, including the National Movement of Iran in Europe and the Second National Front/Freedom Movement of Iran, for which he was imprisoned on several occasions. Shariati made critical contributions to the discussion of the ‘woman question’ in the 1970s and helped to shape the Revolution’s construction of the ‘new’ Iranian woman. In Woman in the Heart of Muhammad, Shariati asserts that Islam ‘emphasises equity by assigning to both [sexes] their natural places within society’, though the respective rights and duties of each differs. Shariati examines the life of Muhammad, specifically his relationships with, and treatment of, women, to contradict the Western narrative that Islam treats women as inferior to men. He also chastises the Christian missionary and European orientalist treatment of women ‘as a deception of the devil’, and their interpretation of Muhammad as a ‘Don Juan figure of the East.’ In this piece he specifically defends the practices of polygamy and modest dress as inherently protective for women.[25] Shariati does not promote modest dress as a means of controlling women, nor does he identify it as inherently spiritual. He sees the immodest Western dress as a symptom of youthful idolatry, connected to the propagation of cultural figures like Miss Universe. This mental attachment to shallow, anti-religious icons, Shariati argued, manifested itself in modern dress. 
He recognised, however, that intolerantly telling the youth what to do would not solve the problem. Instead, he advocated for presenting Islamic values ‘which are higher than the values represented by Miss Universe’ so that young women associate with the former and will ‘endure and incorporate all of those values herself’ by choice and not through coercion.[26]

Shariati’s most influential work on the ‘woman question’ in Iran was Fatima is Fatima, a lecture given at the Husayniyah Irshad and later distributed throughout the country. This piece was intended to address the identity crisis facing modern Iranian women who had adopted the ‘new imported mould’ of a distinctly foreign identity.[27] Shariati sought to find a model for Muslim women and, by expanding his source base to include several Shi’ite schools, eventually constructed the ideal heroine in the form of Fatima. Modern Iranian culture, Shariati argued, forced women to identify with either ethnic heritage or an ‘artificially imposed, imitative mask’. Instead, women want to ‘make decisions through reason and choice and to relate them to a history, religion, and society which received its spirit and basis from Islam.’ The lack of pre-existing theological movements which provided this basis, Shariati argues, was the fault of religious scholars and symbolised the schism between Islamic intellectuals and the Iranian people. Instead of seeing women in Muslim societies as either ‘traditional’ or ‘European-like’, the true face of a Muslim woman, and the ‘new woman’ of Iran, is Fatima.[28] Identifying with Fatima places all women in relativity to the time of the Prophet, espousing an identical standard which sits above generational time and space.

Shariati situates the new Islamic approach to questions of women and sexuality as the middle ground between the rigid, idealistic family of the religious Christian West and the short-sighted, pleasure-seeking impulses of the secular West.[29] For Shariati, the Western notion of women – ‘toys of the Don Juans’ or ‘female slaves serving men’ – should be rejected and repressed.[30] Instead of seeking sexual freedom, which is fleeting, deceiving, and ultimately leads to dissatisfaction, Shariati argues that Muslim women should pursue womanhood as exemplified by Fatima and the Prophet, and that such womanhood would be best developed in a distinctly Islamic state. This authentic Islamic society would value women who are educated, virtuous, and free to choose a life in the household out of love for their families.[31] Muhammad loved Fatima and entrusted himself, his household, and his legacy to her. Shariati points to Fatima’s privileged place as beloved of the Prophet and as the perfect model of daughter, wife, and mother; she was ‘an outstanding example of someone to follow’, the model ‘for any woman who wishes to become herself […] through her own choice.’[32] Fatima’s personality, however, is more than a compilation of her roles in relation to Muhammad and others. Her identity can only be encompassed in herself: Fatima is Fatima.

Shariati’s assessment of Fatima enthrones her in inherent dignity while situating her in the lives of Islam’s most important figures. This analysis conveys the intrinsic value of women as understood by Shariati, as well as the dignity found in embracing Fatima as daughter, wife, and mother. This model of Fatima was rapidly embraced by Iranian women in the 1970s and underpinned the challenge to Westernised gender relations during the Islamic Revolution.[33] The identity of the ‘new woman’ did not rely on pure traditionalism or mimicry of the over-sexualised ‘painted doll’, but instead allowed Islam to serve as the basis of a chosen identity with intellect, agency, piety, and purpose. It was this new identity, forged in the Islamic Revolution, which challenged the role of women in Pahlavi film and provided the basis for a transformed, post-revolutionary Iranian cinema. This rejection of the Western-infused Pahlavi culture transformed the film industry and repealed many of the methodological and thematic tenets associated with Pahlavi-era films.

The Islamic Revolution’s redefinition of women’s role in society was of course part of a larger movement resisting the notion of Western modernity. The Revolution heightened religious and patriotic zealotry in Iran, priming the country for intensified conflict with Iraq. Tensions over the borderlands increased as Iraqi dictator Saddam Hussein openly attacked Iran’s revolutionary leader Ayatollah Khomeini and renounced the 1975 Algiers Accords, a critical agreement which previously kept the two from direct conflict over the Shatt al-Arab waterway.[34] On 22 September 1980, Hussein invaded Iran and embarked on a conflict which would come to embody an existential battle between the Shi’ite Islamists and the Sunni Pan-Arabs. The conflict presented the Iranian regime with the opportunity to consolidate power and Khomeini perpetuated the war despite Iraq’s willingness to cease hostilities after being pushed out of Iran in 1982.[35] The prolonged conflict, however, came at a high price. Iraq’s prolonged use of chemical weapons and Iran’s reliance on child soldiers made the war particularly ghastly, requiring heavy state propaganda to maintain a stream of volunteer fighters. The war offered women a new opportunity to take part in the defence of Shi’ism by both producing sons and allowing them to be martyred. This era solidified the ideals of femininity advocated by Shariati and other conservatives prior to the Revolution. The war carved out a special place for women in society, a place of honour in line with Islamic teaching.[36]


The New Woman in Revolutionary and War Cinema

Cinema during the war captured fighting on the front lines in a documentary style. These films featured minimalistic plots with little dialogue. Martyrdom became a central theme in war cinema, and the notion of individual sacrifice for a collective or religious good was emphasised in contrast to Western individuality. The sense of collective identity was intensified by the limited focus on setting or personnel; instead, scenes were accompanied by voice-over narration infused with religious rhetoric. Television specials and films covering the lives of war martyrs, notably a series entitled Chronicle of Victory, bolstered religious and patriotic devotion to the war.[37] In war cinema, the majority of stories centred on men in combat and were exclusively filmed and directed by men. Women appeared only as grief-stricken mothers and relatives of the fallen soldiers.[38]

In terms of both production and consumption, the Revolution and subsequent war significantly harmed the film industry financially. The state acquired movie houses and implemented film content standards, requiring films to support the Islamic values of the new regime.[39] Both domestic and foreign-imported films required purification, a task that could not be entrusted to many of the industry’s former, largely secular, personnel. The hijab became mandatory for all women in film, and Pahlavi or foreign films featuring unveiled women were censored. Ayat Film Studio, considered the first post-revolutionary studio, ascended to the forefront of Islamicate film because its personnel were deemed trustworthy to produce films with the desired Islamic values. The studio, whose creation was inspired in the late 1970s by Ali Shariati’s call for Muslim youth activism in the arts, began by filming documentaries of the marginalised.[40] Government film institutions quickly increased in number, alongside a small number of private and para-governmental studios.[41] In 1987, Ayatollah Khomeini relaxed the Islamic morality codes, which created more artistic and political freedom for cinema.[42]


Post-revolutionary Cinema

The increasing dominance of Islamic values following the Revolution of 1979 unravelled the ‘whore/virgin’ dichotomy at the heart of Pahlavi-era film and created space for new female characters to emerge in Iranian cinema. Film became more accessible to, and directed at, religious audiences, children, and families. Furthermore, the film industry became a viable career path which girls and women could pursue without fear of the moral and social backlash which had followed Pahlavi-era stars.[43] Consequently, more women directed movies in the 1980s than in all preceding decades combined. This increased visibility of women was also apparent in other social and cultural spheres, such as the previously male-dominated environments of journalism and higher education. Despite their greater prominence in the film industry, however, women remained second-class citizens due to Iran’s imposition of sharia law.[44]

The separation of women and men in the public sphere, and the Islamic Republic’s codified modesty for women, produced a three-phase women’s movement in post-revolutionary cinema according to film historian Hamid Naficy. The first, in the early 1980s, can be characterised by ‘women’s structured absence’. This was a period of purification where women disappeared as hosts and as subjects in television news, were heavily edited or entirely removed from films whenever unveiled or sexualised, and were temporarily suspended from contemporary filming until new standards of purity were adhered to in the industry.[45] The second phase, in the mid-1980s, saw women as a largely ‘background presence’. This coincided with the height of the Iran-Iraq War and featured minor roles for women who were often confined to the domestic sphere. In particular, women only appeared dressed conservatively and the camera would intentionally avoid displaying their bodies. These modesty requirements noticeably complicated the filming process as even intimate scenes between a husband and wife could not be captured without veiling, and only behaviour acceptable in public settings was permitted. Naficy characterises the third phase of post-revolutionary cinema, beginning in the late 1980s and continuing in contemporary Iranian cinema, as one in which women are a ‘foreground presence’.[46] This phase, under the influence of realist techniques and theories, successfully integrates women into the film’s main plotlines. Frequently entire films centre on the stories of women and their daily lives. Female characters in this phase are intricate, multi-layered individuals with strengths, weaknesses and mixed motives. The complexity of character and context in these films gives female characters new agency to respond to difficult situations and presents women as intelligent actors capable of understanding and responding to their environment.


Case Study One: Leila

The 1997 film Leila is a key example of the complexity and agency of women in post-revolutionary realist cinema.[47] The film follows Leila, a young woman who learns that she is infertile and comes under pressure from her mother-in-law to allow her husband, Reza, to take a second wife. Though Reza continually insists that he loves Leila and does not want a child, his female relatives pressure her throughout and Leila eventually decides to allow Reza to pursue other potential partners. Reluctantly he does so but insists that if Leila later objects to the idea, or to a particular woman he chooses, he will stop the pursuit. Despite Leila’s internal anguish, she does not resist the pressure and in turn actively contributes to the search for Reza’s new wife. After the wedding, Leila cannot handle the reality of having another woman in her home and flees to her parents’ home to live separately. Reza and his new wife have a child and shortly thereafter divorce. Despite Reza’s appeals to Leila to return to his home and restore their marriage, she declines. Reza and his daughter appear at a family gathering as Leila watches from a window. Leila sees the girl and says, ‘maybe one day, when someone tells Reza’s daughter Baran this story, she might laugh when she learns that if it hadn’t been for [Reza’s] mother’s persistence, she might never have been born.’[48]

Leila stirred up considerable debate among audiences and film critics over its feminist credentials. Director Dariush Mehrjui is often regarded as a feminist film-maker, though Western audiences tend to view Leila as displaying misogynistic tropes due to Leila’s lack of agency in the face of an antagonistic mother-in-law.[49] The film should be read, however, as neither misogynistic nor feminist, at least in so far as these terms are commonly understood in the West. All of the central action of the film relies on female characters. There is only one significant male character, Reza, who makes no independent decisions and defers to Leila and his mother on every issue. It is clear that all the women have the ability to navigate either alongside or around their husbands, and in many ways they have more influence over the situation than the men. In this respect, Leila affirms female agency and presents it as especially powerful in domestic politicking. The film does not, however, take a stereotypical feminist stance, as Leila is far from the archetypal heroine. She is passive, quiet, and indecisive, and allows her mother-in-law to intervene and dictate, despite numerous opportunities to stop her. The film pits Leila and her mother-in-law against each other, showing one as a powerful agent and the other as a passive onlooker on her own life. The contrast between these two women speaks to the contrast between conservatism and progressivism in Iran, and how the former is maintained despite shifts in popular opinion. The mother-in-law, representing tradition and conservatism, actively pursues a second wife for her son so that he may have a child, and she a grandchild. Conversely, Leila, who represents a progressive understanding of marriage as primarily for love and satisfaction between spouses rather than for childbearing, chooses to quietly watch as the conservative agents successfully promote their cause.

The film presents women as the enforcers of cultural standards, including practices considered patriarchal such as polygamy and divorce as a response to female infertility. It is the mother, not Reza, who insists that the marriage is unsatisfactory without children and that the remedy is to be found in a polygamous arrangement. The film also portrays Leila, a cipher for young progressives, as a reason why Iranian culture remains traditional: Leila needed only to speak and the entire situation would have been derailed. The film’s symbolic conversation between conservative and liberal women identifies women, not men, as significant perpetuators of patriarchal culture. This is an uncomfortable accusation. Leila highlights particular issues which dominate women’s lives in Iran (the pressures to have children, to permit divorce when infertile, and to consistently please in-laws) and identifies how these issues persist through the fault of multiple parties. Rather than deploying a conventional feminist argument, Leila asks how women, who are agents with choices, can change their circumstances or submit to contextual pressures.

The strong female roles, domestic plot, and direct examination of womanhood in Iran exhibited in Leila are largely representative of Iranian films from the late 1990s until the present day. By engaging directly with the core of Iranian culture, these films both identify issues faced by women in daily life and pose the question: what should, and could, life in Iran be like for women? The boldness of these films in addressing both traditional cultural standards and political actions which oppress women is striking, especially considering the Iranian state’s capability and willingness to censor and control the film industry.


Case Study Two: A Separation

The films of the internationally acclaimed director Asghar Farhadi serve as another excellent case study of feminist realism in contemporary Iranian cinema. Farhadi’s films are characterised by strong female characters in mundane yet complex situations which speak directly to the state of gender relations in modern Iran. His 2011 film A Separation directly confronts the gulf separating Western and Iranian understandings of female identity.[50] The film opens with a couple arguing before a judge; the woman (Simin) is seeking to flee to the West to raise her daughter (Termeh), and is requesting a divorce since her husband refuses to leave the country. Simin argues that, ‘as a mother, I’d rather she [Termeh] didn’t grow up in these circumstances.’[51] This dialogue characterises Iran as a country short on opportunity, a difficult place for girls to grow up, and ultimately inferior to the West. After the opening scene, Simin and her husband Nader return to their home, where Simin packs her clothes and leaves for her parents’ house while Nader nurses his father, who suffers from advanced Alzheimer’s. Termeh, from the beginning, is trapped between her parents. As Simin pulls the last things together before she leaves, she walks right past Termeh, asks her to do her laundry, and at no point addresses her departure.[52]

Once Simin leaves, Nader meets with a prospective caretaker (Razieh) and hires her to watch his father during work hours. Razieh is always pictured with her four-year-old daughter Somayeh and is clearly from a lower-class background. When Razieh returns the next day, it is revealed that she is pregnant and from an orthodox religious background. In these circumstances she faces the dilemma of caring for Nader’s father without making herself ritually impure. On a later day, Nader and Termeh return home early to find that Razieh and Somayeh are gone and his father is on the floor, tied to the bedpost. As Nader frantically aids his father, Razieh returns and apologises for leaving, but the conversation quickly escalates with incendiary language. Nader tries to get Razieh out of the house so he can help his father, but she resists and will not leave until he takes back some of his accusations. This results in Nader closing the door on Razieh as Somayeh and Termeh watch silently. Later, Simin and Nader hear that Razieh has been taken to the hospital for a miscarriage, and Nader insists he did not know she was pregnant. Razieh’s husband takes Nader to court, where the three explain the case before a judge, who eventually charges Nader with the murder of the unborn baby. Outside of the courtroom, Simin tries to settle with the family, and the class differences between the two families become evident. Nader’s mother-in-law tells Razieh: ‘you’re young […] you can try next year.’[53] At the centre of this dispute is a discrepancy between two families from different class backgrounds over the value of an unborn life. For the middle-class family, the miscarriage is no more severe than the harm done by Razieh to their grandfather. But for the poor family, the loss of a child entails earth-shattering material and spiritual consequences.

As the film progresses, Simin and Nader navigate their fraught relationship and, despite Termeh’s pleas, are unable to reconcile. Razieh has an equally troubled time with her husband, who dodges creditors and resents her for working behind his back. After more clashes in court, Razieh approaches Simin in private and reveals that she most likely lost the baby prior to the incident with Nader, when she was hit by a car while rescuing his father from a busy street.[54] This scene emphasises women’s ability to get to the truth outside of the legal system and without their husbands. Despite their mutual desire to settle the dispute, Razieh is unwilling to take the blood money for fear of spiritual implications and her husband lashes out at this refusal. The two young girls are caught between their warring parents. Throughout the film, similar shots of the two girls emphasise their innocence and express their mutual helplessness. The presence and connection of the two girls’ quiet stories speaks to the opening claim: Iran is not the optimal environment for young girls. However, the precarious situation of the girls is the result of their mothers’ actions, not just the socio-political situation of their country. The relationship between Simin and Termeh is strained from the start, and ultimately Termeh is a victim of her mother’s exercise of agency in disregard of the needs of others, including her family. Simin’s agency, exercised through leaving the family home, results in the appointment of Razieh and ultimately the conflict between the two families.

A Separation articulates bold critiques of class, divorce, and the position of women in contemporary Iran. Should A Separation, however, be classified as a feminist film? On one hand, the entire plot is propelled by the actions of women. On the other, the film also reveals how unbridled agency can disrupt family life, alienating children who do not have the agency to self-advocate. In a similar way to how Leila asserted female agency and strength, A Separation clearly affirms that Iranian women are capable, intelligent, and independent decision makers. However, the film does not overlook the consequences of strong, inward-looking women who fail to recognise the needs of others. A Separation, along with Leila and other contemporary Iranian films, exhibits a unique characterisation of women which this article has described as feminist realism. The film simultaneously portrays the damaging legal and social restrictions afflicting women in Iran while highlighting the profound consequences of challenging deeply embedded assumptions, traditions and systems. This feminist realism leaves room for the concept of the ‘new woman’ established during the Islamic Revolution – a woman with a strong religious identity – and for a female identity influenced by the West.



The Islamic Revolution led to the removal of the ‘painted doll’, the overly-sexualised Western image of women, from Iranian film and culture and replaced it with the image of Fatima, a figure present at the foundation of Islam and capable of transcending time and place. This ‘new woman’ was to exist within religious structures and expected to uphold the principles of dignity and piety. The Western interpretation of the Revolution, and the Islamic codes which followed, almost exclusively highlight the misogynistic, oppressive and patriarchal structures it imposed. An exploration of the film industry, however, tells a different story. Iranian cinema in the post-revolutionary decades is characterised by increased dignity and agency for both female characters and actors. It was the identity of the ‘new woman’ which destroyed the ‘virgin/whore’ dynamic that had dominated Pahlavi film and which had confined women to either weak or overly-sexualised roles. Post-revolutionary censorship demanded women take on asexual roles and refocused cinema around mundane, relationship-based plots. Increasingly these plots centred on the lives of women and enabled a deeper examination of gender relations across Iranian society. As a result of the increased presence of women on screens across Iran, cinema has become a place for commentary and resistance against the aspects of the Islamic Republic which restrict women. It remains one of the most important outlets for cultural commentary, debate and social resistance.


Bibliography & Filmography



Grass: A Nation’s Battle for Life (Dir: Merian C. Cooper, Ernest B. Schoedsack, and Marguerite Harrison, 1925).

Leila (Dir: Dariush Mehrjui, 1997).

The Lor Girl (Dir: Ardeshir Irani, 1933).

Qeysar (Dir: Masud Kimiai, Tehran, 1969).

A Separation (Dir: Asghar Farhadi, 2011).


Secondary Sources

Afkhami, G.R., The Life and Times of the Shah (Berkeley, CA, 2009).

Al Sharaji, A. S., Negotiating the Politics of Representation in Iranian Women’s Cinema Before and After the Islamic Revolution (unpublished master’s dissertation, University of Arkansas, 2016).

Atwood, B., Reform Cinema in Iran: Film and Political Change in the Islamic Republic (New York, 2018).

Naficy, H., A Social History of Iranian Cinema Vol. 1: The Artisanal Era, 1897–1941 (Durham, NC, 2011).

Naficy, H., A Social History of Iranian Cinema Vol. 3: The Islamicate Period, 1978–1984 (Durham, NC, 2012).

Naficy, H., A Social History of Iranian Cinema Vol. 4: The Globalizing Era, 1984–2010 (Durham, NC, 2012).

Najmabadi, A., ‘Hazards of Modernity and Morality: Women, State and Ideology in Contemporary Iran’, in D. Kandiyoti (ed.), Women, Islam and the State (Philadelphia, PA, 1991), pp. 48–76.

Najmabadi, A., ‘“Is Our Name Remembered?” Writing the History of Iranian Constitutionalism as If Women and Gender Mattered’, Iranian Studies, 29/1–2 (1996), pp. 85–109.

Nashat, G., ‘Women in the Islamic Republic of Iran’, Iranian Studies, 13/1–4 (2007), pp. 165–94.

Mehrabi, M., ‘The History of Iranian Cinema, Part Two’, <>

Qutb, S., Milestones (Cairo, 1964).

Razavi, S., Labour, Women, and War in the 1979 Iranian Revolution (unpublished doctoral dissertation, TED University, Ankara, 2017).

Sedghi, H., ‘Feminist Movements III: In the Pahlavi Period’, Encyclopaedia Iranica, 9/5 (1999), pp. 492–98.

Shariati, A., and Bakhtiar, L., Shariati on Shariati and the Muslim Woman (Chicago, IL, 1996).

Takeyh, R., ‘Iran’s New Iraq’, The Middle East Journal, 62/1 (2008), pp. 13–30.

Tavakoli-Targhi, M., ‘Refashioning Iran: Language and Culture During the Constitutional Revolution’, Iranian Studies, 23/1–4 (1990), pp. 77–101.

Totaro, D., ‘Leila: Dariush Mehrjui’s Post-Revolution Masterpiece’, Off Screen Journal, 6/5 (2002).



[1] G. Nashat, ‘Women in the Islamic Republic of Iran’, Iranian Studies, 13 (2007), pp. 165–194.

[2] H. Sedghi, ‘Feminist Movements III: In the Pahlavi Period’, Encyclopaedia Iranica, 9/5 (1999), pp. 492–498.

[3] Sedghi, ‘Feminist Movements’, p. 496.

[4] H. Naficy, A Social History of Iranian Cinema, Volume 1: The Artisanal Era, 1897–1941 (Durham, NC, 2011), p. 147.

[5] S. Razavi, Labor, Women, and War in the 1979 Iranian Revolution (unpublished doctoral dissertation, TED University, Ankara, 2017), pp. 102–104.

[6] G. R. Afkhami, The Life and Times of the Shah (Berkeley, CA, 2009), p. 237.

[7] A. Najmabadi, ‘Hazards of Modernity and Morality: Women, State and Ideology in Contemporary Iran’, in D. Kandiyoti (ed.), Women, Islam and the State (Philadelphia, PA, 1991), p. 60.

[8] M. Tavakoli-Targhi, ‘Refashioning Iran: Language and Culture during the Constitutional Revolution’, Iranian Studies, 23 (1990), pp. 77–101.

[9] A. Najmabadi, ‘“Is Our Name Remembered?” Writing the History of Iranian Constitutionalism as If Women and Gender Mattered’, Iranian Studies, 29 (1996), pp. 85–109.

[10] Najmabadi, ‘“Is Our Name Remembered?”’, p. 86.

[11] Najmabadi, ‘“Is Our Name Remembered?”’, p. 88.

[12] Naficy, A Social History of Iranian Cinema Vol. 1, p. 10.

[13] Grass: A Nation’s Battle for Life (Dir: Merian C. Cooper, Ernest B. Schoedsack, and Marguerite Harrison, 1925).

[14] Naficy, A Social History of Iranian Cinema Vol. 1, p. 162.

[15] Naficy, A Social History of Iranian Cinema Vol. 1, p. 162.

[16] The Lor Girl (Dir: Ardeshir Irani, 1933).

[17] M. Mehrabi, ‘The History of Iranian Cinema, Part Two’, <>, (Accessed: 17/07/2020).

[18] A. S. Al Sharaji, Negotiating the Politics of Representation in Iranian Women’s Cinema Before and After the Islamic Revolution (unpublished master’s dissertation, University of Arkansas, 2016), p. 14.

[19] B. Atwood, Reform Cinema in Iran: Film and Political Change in the Islamic Republic (New York, 2018), pp. 142–143.

[20] Atwood, Reform Cinema in Iran, p. 144.

[21] Qeysar (Dir: Masud Kimiai, 1969).

[22] H. Naficy, A Social History of Iranian Cinema Vol. 4: The Globalizing Era, 1984–2010 (Durham, NC, 2012), p. 96.

[23] S. Qutb, Milestones (Cairo, 1964).

[24] A. Shariati and L. Bakhtiar (eds.), Shariati on Shariati and the Muslim Woman (Chicago, IL, 1996), p. xvii.

[25] A. Shariati and L. Bakhtiar, ‘Woman in the Heart of Muhammad’, in Shariati on Shariati, pp. 5–7, 43.

[26] A. Shariati and L. Bakhtiar, ‘The Islamic Modest Dress’, in Shariati on Shariati, p. 43.

[27] A. Shariati and L. Bakhtiar, ‘Fatima is Fatima’, in Shariati on Shariati, p. 79.

[28] Shariati and Bakhtiar, ‘Fatima is Fatima’, pp. 80, 83, 99.

[29] Shariati and Bakhtiar, ‘Fatima is Fatima’, p. 110.

[30] Shariati and Bakhtiar, ‘Fatima is Fatima’, pp. 111, 112, 119.

[31] Shariati and Bakhtiar, ‘Fatima is Fatima’, pp. 139, 142.

[32] Shariati and Bakhtiar, ‘Fatima is Fatima’, pp. 212, 213.

[33] A. K. Ferdows, ‘Women and the Islamic Revolution’, International Journal of Middle East Studies, 15 (1983), pp. 283–298, p. 293.

[34] R. Takeyh, ‘Iran’s New Iraq’, The Middle East Journal, 62 (2008), pp. 13–30.

[35] Takeyh, ‘Iran’s New Iraq’, p. 17.

[36] Shariati and Bakhtiar, ‘Woman in the Heart of Muhammad’, p. 7.

[37] Naficy, A Social History of Iranian Cinema Vol. 4, pp. 13, 15.

[38] Naficy, A Social History of Iranian Cinema Vol. 4, p. 25.

[39] H. Naficy, A Social History of Iranian Cinema Vol. 3: The Islamicate Period, 1978–1984 (Durham, NC, 2012), p. 118.

[40] Naficy, A Social History of Iranian Cinema Vol. 3, pp. 122–123.

[41] Naficy, A Social History of Iranian Cinema Vol. 3, p. 130.

[42] Naficy, A Social History of Iranian Cinema Vol. 3, p. 186.

[43] Naficy, A Social History of Iranian Cinema Vol. 3, p. 187.

[44] Naficy, A Social History of Iranian Cinema Vol. 4, pp. 94, 95, 96.

[45] Naficy, A Social History of Iranian Cinema Vol. 4, pp. 111–112, 114.

[46] Naficy, A Social History of Iranian Cinema Vol. 4, p. 121.

[47] Leila (Dir. Dariush Mehrjui, 1997).

[48] Leila, Minute 2:03:14.

[49] D. Totaro, ‘Leila: Dariush Mehrjui’s Post-Revolution Masterpiece’, Off Screen Journal, 6 (2002).


[50] A Separation (Dir: Asghar Farhadi, 2011).

[51] A Separation, Minute 03:28.

[52] A Separation, Minute 08:57.

[53] A Separation, Minute 1:06:08.

[54] A Separation, Minute 1:48:20.


‘A Political Fight Over Beer’: The 1977 Coors Beer Boycott, and the Relationship Between Labour–Gay Alliances and LGBT Social Mobility


Featured image courtesy of Online Archive of California

Author Biography

Kieran Blake is a postgraduate student of History at the University of Lincoln, researching twentieth-century American social movements—specifically addressing queer studies and the history of sexuality.


This paper examines the 1977 Coors beer boycott to analyse the interplay of socio-political groups in 1970s America, promoting the idea that labour and gay forces could form a mutually beneficial alliance over economic disputes. The workers demanded an end to mandatory, homophobic polygraph tests; to this end, they went on strike and asked San Francisco’s gay bars to boycott Coors beer. By examining newspaper articles, trade union pamphlets and visual iconography, the paper highlights how labour forces invited the LGBT community because their bars were a powerful tool in forming a gay identity and allowed LGBT consumers to utilise their economic agency. Boycotting an alcohol brand allowed consumers to exercise their fundamental American rights, which, in turn, promoted their legitimacy as American citizens. Crucially, promoting a boycott enabled an economic spat to snowball into a wider social movement, as it was taken outside the parameters of the factory floor.

‘A Political Fight Over Beer’: The 1977 Coors Beer Boycott, and the Relationship Between Labour–Gay Alliances and LGBT Social Mobility


[Coors] is convinced that a boycott will not work because they

do not believe the consumer really cares about human rights or

the manner in which Coors violates the law.[1]

In 1977, brewery workers belonging to the trade union division Local 366 of the Adolph Coors Beer company printed and distributed a small flyer with one objective: to persuade the public to endorse their strike. The flyer was decorated with an illustration of a Coors beer can that had been crossed out. Displayed in a large font, the flyer told recipients to ‘BOYCOTT COORS BEER’.[2] Written overleaf was an informative bulletin in which Local 366 told readers why it was important to boycott the beer. The article was written in response to 1500 Coors employees walking out on strike against their employer in April of that year.

Local 366 was the trade union division which represented the workers of Coors. The strike was over a clause in employee contracts which required all workers to take a mandatory polygraph test, during which they could be asked directly to reveal their sexual orientation. There was initial scepticism towards the strike, from Coors itself, workers and the public.[3] However, Local 366 found an unlikely partner in the gay community of America’s west coast—particularly San Francisco—courtesy of the American Federation of Labor and Congress of Industrial Organizations (AFL-CIO) president, George Meany.[4] Meany allowed the strikers to advertise the boycott in the sixteen states in which Coors was sold, by informing communities that Coors infringed on the human rights of its employees.[5] Due to the homophobic element of the polygraph test, the workers’ dispute gained a receptive audience in gay bars, which removed the beer from sale and backed Local 366’s campaign. This withdrawal of Coors from San Francisco bars helped to produce a de facto ten-year ‘political fight over beer’.[6]

This article examines the 1977 Coors beer boycott, arguing that the protest cemented a labour–gay alliance which transformed an economic spat into a gay rights social movement. This enabled an emerging sub-culture to advocate for and utilise its economic agency and consumer rights to campaign for an end to discrimination in the workplace. By using the boycott as a case study to examine the interplay of socio-political groups in 1970s America, this article promotes the idea that, as a consequence of such alliances, labour and gay forces found an unlikely partner in one another’s advocacy. Moreover, an examination of newspaper articles, trade union pamphlets and visual iconography sheds light on how a narrative focused on a shared understanding of oppression ran through both labour and gay forces; the oppression they faced—albeit over different grievances—promoted a mutual respect towards each other’s campaigns.

The LGBT community exercised its economic and consumer rights by choosing what alcohol it purchased. In doing so, its members highlighted their American citizenship—by this, I am referring to the fundamental values of suffrage, integration and economic agency which they used to credit themselves as American in an era of ever-expanding socio-political mobilisation.[7] As a legacy of the boycott, cooperation between labour and gay forces highlighted an effective method by which discrimination could be tackled on a case-by-case basis. As a result of such alliances, workers could legitimise their strike by taking it out of the locus of the factory floor. The gay bars’ invitation to boycott Coors provided a platform to work in tandem with workers, who, like anti–Vietnam War protesters, second-wave feminists and African–American activists, felt disadvantaged in comparison to the hegemony of the white, middle-class heterosexual.[8] Alongside these movements, the LGBT community could pursue its own wish to increase its social mobility from its bars.[9]

The history of American sexuality has found its feet in the last thirty years. Scholars have written on the topic to understand how a gay identity and LGBT community came to fruition in the twentieth century. The work of Elizabeth Armstrong, John D’Emilio and Margot Canaday, for example, suggests the LGBT movement was not born from the infamous 1969 Stonewall Riot. Instead, the homosexual activism groups of the 1950s were the crux of activism, aiming to re-educate heterosexuals’ pre-conceived attitudes regarding homosexual morality.[10] D’Emilio’s ground-breaking research, Sexual Politics and Sexual Communities, summarised how ‘the [gay] movement constitutes a phase, albeit a decisive one, of a much longer historical process through which a group of men and women came into existence as a self-conscious, cohesive minority’.[11] Armstrong supports this hypothesis, suggesting that the gay protests regarding those arrested at Stonewall provided the catalyst for the emergence of activist groups like the Gay Liberation Front by 1970.[12] Research into the LGBT application of economic agency and consumer rights has received some, but not extensive, analysis. Miriam Frank’s Out in the Union constitutes some of the only solid research into the boycott. Frank argues the emergence of a visible LGBT movement in 1969 augmented a relationship in which some LGBT workers wished to construct a labour–gay alliance to help collectively improve welfare politics for workers.[13]

The LGBT movement of the 1960s and 1970s marks itself as another social movement at a time when socio-political mobilisation was rife in US society. Social movement theorists have noted the importance groups placed on identity in defining the criteria on which they campaigned. David Meyer, Nancy Whittier and Belinda Robnett have argued that the ‘standpoint’ of a social movement’s ideology rests upon the identity acquired, or the cultural changes which have brought it into being.[14] In the context of this paper, the identity nurtured in the gay bars and through the actions of the activists of the 1950s, along with customers’ ability to choose the alcohol they drank based on LGBT politics rather than just its price, was the driving force in campaigning for the workers’ dispute with Coors.[15]

This article focuses on the significance of San Francisco’s community, particularly examining the impact gay bars had on this remarkably understudied event in the history of twentieth-century American sexuality. Firstly, the context of LGBT social mobility—in a century of changing attitudes towards sex and gender—is drawn upon to show why gay bars became crucial to the boycott in 1977. In doing so, it highlights how those who frequented gay bars came to acknowledge them as a place where individual and collective gay identities were nurtured, as well as a location for enabling gay customers to exercise their economic free-will.[16] These factors were essential in promoting a link between labour concerns and LGBT political demands, and suggested the boycott was essential to validate the workers’ demands and promote the LGBT agenda of acceptance.

Secondly, the paper examines the different perceptions of the boycott. This section considers the key figures who helped orchestrate labour–gay interactions: Local 366 head Allan Braid, and the unofficial mayor of Castro, Harvey Milk (Castro refers to Castro Street, the most prominent LGBT area in San Francisco). Both figures respected and understood the oppression faced by the other, and showcased the importance of validating citizenship for the LGBT community and striving to meet workers’ demands by creating a mutually beneficial alliance.[17] This section also considers the role the written press played in ensuring the labour–gay alliance was perpetuated. Badges, newspaper interviews and posters were specifically addressed to the LGBT community through the local press, ensuring the community was directly targeted and invited to boycott the beer rather than forming a quasi-pact between two distinctly separate forces. Crucially, the rhetoric invoked by these articles informed LGBT boycotters that Coors had no regard for its employees’ working or human rights.

This section shall also consider what impact a gay boycott had on Coors’s profits and its reception of workers, as well as its need to re-brand itself as a pro-worker and pro-LGBT corporation once profit-loss became a tangible marker that the strike and boycott held resonance with San Franciscans.

Finally, this paper goes on to evaluate the legacy of the boycott by tracking the progression of LGBT socio-economic rights. The paper does not assert that the boycott provided a turning point in the history of sexuality—indeed, LGBT progression, I argue, cannot be viewed linearly in positive correlation.[18] However, the pact that developed between labour and gay forces through the boycott presented a system of alliance which showcased how the two could work together to tackle discrimination on a case-by-case basis through similar economic disputes, such as that over Florida’s orange production. Further socio-economic disputes were also fought through mutually beneficial campaigns in which each side respected and understood the oppression faced by the other, in striving for citizenship through the self-determination of economic free-will.[19] This, in turn, counters Alexandra Chasin’s argument that although boycotts emphasised a captive gay market, they ultimately reduced the choices available to the community, since personal choices are not mutually exclusive of political action.[20]

Making America Queer: The Politics of Gender and Sexuality in Twentieth-century America

The socio-political climate of America in the 1960s—a time of protests from African–Americans, anti–Vietnam War protesters, second-wave feminists, as well as the counterculture movements—provided both a framework and platform for homosexuals to articulate and defend their new-found gay identities.[21] The campaigns of the 1960s—all of which focused on promoting an equal, yet nuanced American identity—produced a new generation of campaigners who successfully used protests as a method of deconstructing the hegemony of the white, middle-class, heterosexual norm.[22] The campaigns in the 1960s suggest LGBT activists emerged at the end of the decade because they belonged to the same generation of protesters. As Simon Hall suggests, the gay rights movement of the 1970s ‘followed the example of the “black, the poor, and the student”—who had been actively confronting systems which deny and demean them—joining the “age of revolution”’.[23] Direct-action protests such as rallies, marches and launching petitions were established as an effective way for minority groups to tackle the disadvantages they faced from this hegemony.[24]

San Francisco in the 1960s was a city that could facilitate and maintain political activism for minority groups in the quest for civil rights. The LGBT community—on the cusp of liberation after Stonewall and following its long struggle for agency and acceptance over the previous two decades—came to view San Francisco as its pseudo home: ostensibly, it was a homosexual town.[25] San Francisco was almost unique in its position—widely known as a permissive town where social norms were not enforced—according to Nan Boyd’s study, Wide-Open Town.[26] As a result of San Francisco’s status as a city like no other in the United States, a more coherent and tangible LGBT community had the potential for greater agency. Crucially, it could become an effective social movement campaigning for civic equality by the 1970s, when the openness of the LGBT community became ever more present.

San Francisco’s plethora of gay bars became a hub for the LGBT community during the mid-twentieth century. These bars offered a space outside of heteronormative society where people’s heterosexual ‘mask’ lifted and they were free to partake in identity-building practices such as dancing, drag artistry and drinking.[27] However, their openness was not welcomed by the San Francisco Police Department (SFPD). One defamatory article written in The San Francisco Examiner in October 1969 passes comment on the rocky relationship between the SFPD and gay bars. The author attributed this poor relationship to the bars’ poor structural and hygiene control, as well as the clientele to whom the bars catered.[28] Undercover officers, ‘rookies’, the author said, solicited customers before ‘figurately blowing [sic] the whistle’ on the bar.[29] Officers tasked with entrapment were not, according to Christopher Agee, adhering to an anti–gay bar policy derived from the SFPD echelons, but were acting on their own personal prejudices.[30] This apparent lack of professionalisation allowed ‘gay bar owners to use [sic] an existing discourse about police organizational reform to integrate their movement into the mainstream political sphere.’[31] The Mattachine Society’s president, Hal Call, argued:

the police are obsessed with the desire to supervise and regulate people […] for instance, they object to our dancing together. Next to sex, dancing is one of our most important of human joys. I believe that I speak for all homosexuals, and certainly for the Mattachine Society, when I say we oppose police and other supervision.[32]

The gay rights movement, through the homophile organisations and bar-based culture, used the 1960s as a decade in which to express its hostility to the civic order which deprived it of its fundamental rights. San Francisco’s bars and community, therefore, appeared to the union representatives behind the boycott of Coors a fruitful place in which to engender and bolster the movement when they branched out for support. The gay scene was prepared to fight those who denied it its basic rights.

As San Francisco’s LGBT community expanded throughout the mid-decades of the twentieth century, its image as the ‘gay hub’ was cemented in the city’s psyche. One of the most prominent de facto homosexual communities was Castro Street. Located between Market Street and 19th Street, the small district offered a public space for homosexuals to meet. Castro offered a plethora of gay bars for its LGBT customers, which played a critical role in creating a socio-political gay rights movement.[33] Gay bars were essential in helping to cement an identity, and one necessary element was the liquor customers drank.[34] San Franciscan bars were noted as being extremely cheap—for example, one bar reportedly sold champagne for two dollars ‘served in a real champagne glass.’[35] As San Francisco developed into a de facto homosexual town, it also created a common market, because gay bars provided a space where homosexual proprietors chose the types of alcohol customers could drink. Therefore, withdrawing an alcohol brand from the bars held the potential to significantly impact a corporation’s profit margin.

In this context, San Francisco, an ostensibly homosexual town, held a network of heavily frequented gay bars in which homosexuals were accustomed to fighting against the oppression that denied their equality.[36] The foundation of the gay bars’ socio-political framework created a space useful for Coors strikers because it provided the capacity to transform their dispute into a community-led social movement. The cheap price of liquor ensured that bars became a hub for homosexual integration, whilst enabling customers to exercise their economic agency. It therefore meant that if customers were to choose another brand—irrespective of price, but on a matter of politics—they promoted their rights as American citizens by synthesising their spent revenue with political activism. Moreover, it served as a tangible act of defiance towards Coors, as profit-loss threatened its security as the Western States’ top beer seller. This outreach, in turn, validated the workers’ strike over their contracts. In doing so, the ensuing labour–gay pact ensured that the Coors boycott became a movement that was mutually beneficial.

We Need Some Milk: San Francisco, Gay Bars and People’s Reaction to the Boycott

In an interview in the New York Times in 1977, San Franciscan public figure Harvey Milk acknowledged the conservative character of LGBT economic activity.[37] Milk argued that it was hypocritical for homosexuals to live a capitalistic lifestyle but oppose conservative policies that denied LGBT socio-economic equality:

I’m a left winger, a street person… [m]ost gays are politically conservative, you know, banks, insurance, bureaucrats. So their checkbooks are out of the closet, but they’re not. So you get something going, and all the gay money is still supporting Republicans except on this gayness thing, so I say, ‘Gay for Gay’…[38]

Milk’s statement suggests that to campaign for full equality, gay had to mean gay. If one had consumer rights and economic agency, then these should be used to fight discrimination and to assert oneself as a full American citizen. The next section examines how the Coors boycott was received amongst the general public, and how the economic disruption of the factory floor became a labour–gay social movement in the community.

The merit of the boycott’s ability to become more than a workers’ dispute was its accessibility to an LGBT audience. After Local 366 was granted permission from the AFL-CIO, it was important for the union to integrate itself in the community to validate its concerns with the employment practices of Coors. Integration with the workers’ dispute was presented through the interaction of Local 366’s leader, Allan Braid, and Castro Street’s ‘unofficial’ mayor, Harvey Milk. Moreover, trade union pamphlets, newspapers and visual iconography all aimed to inform the community as well as invite it to take part, by invoking the idea that its members were equals who understood the mutual oppression they faced.

The challenges faced by gay bars and the homophile movements during San Francisco’s journey towards self-identity and openness in a heteronormative society arguably made the gay rights movement a sensible ally for the union to approach to endorse its strike against Coors. As a key member in organising the Coors workers’ strike, Braid spoke with shopkeepers, asking them to pledge to stop selling Coors beer.[39] Braid also met with Milk to inform him of Coors’s homophobic polygraph tests, asking for support from the Tavern Guild, a network of gay bars established in the 1960s, in halting sales of the beer.[40] As a resident of Castro himself, Braid was conscious of the potential agency LGBT people had, which made them useful allies in a social boycott. Sympathetic to Harvey Milk’s work, Braid in his eulogy for Milk highlighted Milk’s ability to unify the LGBT movement and to create a safe and politically active space for the community in Castro.[41] As John Sweeney has argued, labour movements showed themselves as ‘capable of broadening to include and represent every class of workers’.[42] Through Braid, the workers’ efforts to build a social movement also benefited the LGBT community, as they demonstrated a progressive stance towards equality for all.

As Braid encouraged the gay bars to join the boycott, Harvey Milk brought it to the attention of the rest of the LGBT community. Milk had already encouraged members of Castro to boycott Coors’s beer in 1974; by 1977 he was a natural figure to approach in order to gain support. Writing in the local San Francisco newspaper, the Bay Area Reporter, in 1976, Milk strongly urged his readers to boycott Coors.[43] Milk implied that the LGBT community was closely related to the labour movement, and that it was therefore the LGBT community’s duty to call out Coors’s ‘very poor labour [sic] history’ as well as its ‘humiliating’ treatment of its employees.[44] Milk’s unique social status in Castro stood him in good stead for engendering support for other minority groups’ struggles, such as that of the workers; for the LGBT community to gain true equality and legitimise its citizenship, homosexuals had to use their economic agency, their votes, and a commitment to better social relations with other minority groups.[45] The goal of key individuals was to appeal directly to the gay rights community, urging it to use its economic agency and its strength as a new and open social movement to boycott the beer, thereby underpinning the need for labour–gay relations. Newspapers and pamphlets, such as the one presented at the beginning of this paper, were a strong way to gain support because their audience was specifically San Franciscan. Moreover, the language used implied that Coors was breaching both the working and human rights of its employees. Local press and community members ensured that the LGBT community was informed of Coors’s homophobic actions, meaning the workers could directly invite the LGBT community into a socio-political partnership.

As well as the promotion of economic agency, the relationship between Coors strikers and boycotters highlighted how the boycott promoted a united labour–gay alliance, rather than two separate movements. Cultural iconography helps assert the idea of a mutually beneficial socio-economic dispute. Mass-produced artefacts such as posters, flyers and badges, which marked themselves as anti-Coors material, were designed to resonate with the recipient. One poster [Figure 1] emphasised the unethical nature of Coors’s homophobic polygraph tests: the illustration of a male and a female worker strapped to the polygraph machine by a sinister-looking senior official conveyed to its audience how workers were forced to take the test against their will. Moreover, badges and t-shirts advocating the Coors boycott, worn by some LGBT activists, cemented a direct affiliation with the protest [Figure 2]. Crucially, posters such as this article’s opening bulletin were addressed to ‘friends of labor [sic]’, speaking directly to the reader and urging them to abstain from Coors’s alcohol.[46] The cultural products surrounding the boycott, and more specifically the language they used, suggest that LGBT customers were deliberately invited to engage with the boycott rather than attaching their agenda onto an altogether separate movement. Language had become a method by which homosexuals were able to express concerns for their welfare.[47] Clothing and badges invited them to show their contempt while going about other activities.[48] The LGBT movement was thereby able to transform the boycott into a social movement that was mutually beneficial for both striker and LGBT boycotter.[49]

Figure 1: A Poster Promoting the Coors Beer Boycott, Online Archive of California (OAC), Unknown Author, Unknown Date, [online resource] <>, accessed on 23 January 2019.

Cultural symbols and the language used in media representations allowed the LGBT community to express its support for the boycott. The protest shifted an economic dispute into a social movement because it synthesised a labour–gay alliance, inviting people to protest for workers’ equality and to utilise their economic agency through culture and language. The LGBT movement did not, therefore, simply join a labour-shaped bandwagon; it was part of a labour–gay protest movement.

Figure 2: Oklahoma State University Library, item oksa_phelps_11-07-0035, 1977, Edna Mae Phelps Collection, [online resource] <>, accessed on 16 January 2019, and at the Digital Public Library of America, [online archive] <>, accessed 18 November 2018.

The boycott was successful in making the community aware of Coors’s employment practices. As the polygraph tests and ‘search-and-seizures’ became common knowledge, Coors had to present its own side of the story. One consequence of the boycott’s establishment in the gay bars was profit loss, with estimates ranging from eleven or twelve percent to nineteen percent.[50] Coors’s damage control was three-fold: threaten workers with the sack, branch out its sales to Eastern states at extortionate prices, and donate to LGBT charities.

Prior action from Coors employees had done little to dent Coors’s reputation as a leading beer brewer in the Western states. In 1974, Coors accounted for 49 percent of California’s beer sales, despite Teamster Union Local 888 (the truck driver division), Latino, and Chicano workers having already protested against their employer’s treatment of them.[51] However, as the initial 1974 boycott progressed and gained momentum, with the LGBT community becoming increasingly aware of the discriminatory nature of Coors’s practices, sales of Coors’s beer began to falter. According to Milt Moskowitz’s article for The San Francisco Examiner, by 1976 Coors’s sales in the California region had dropped by nineteen-point-six percent.[52] Since Coors risked losing the top spot in California’s beer sales to the nationwide leader Budweiser, the boycott’s proliferation from the strikers and the LGBT movement evidently held the capacity to threaten Coors’s economic security. The boycott thus became something Coors could not ignore, especially once Local 366 joined the strike in 1977. This is evident in the remark of Coors’s chairman, William K. Coors, who told The Wall Street Journal he would take ‘great satisfaction in opposing all the forces that would like to put [Coors] out of business.’[53] It also implies the power that workers and the LGBT community held as social movements through their use of direct-action protests against the heteronormative, middle-class establishment. When they worked in tandem, they could fight the capital interests of a company for the civil rights of workers.

Coors’s representatives presented a media front ready to fight strikers and boycotters with antagonistic language. In the Colorado Springs Gazette-Telegraph, one article described how Coors employed permanent staff to replace those on strike, threatening strikers with the warning that ‘it may lead to the loss of your job.’[54] However, this threat was seemingly left unfulfilled amid Coors’s rise in philanthropic work. In 1980, The Empty Closet, an LGBT newspaper that emerged after the birth of the Gay Liberation Front, ran an article on how Coors had donated a delivery truck to Denver’s Metropolitan Community Church (MCC).[55] This act of charity is crucial to understanding Coors’s apparent reluctance to follow through on its threats, because the MCC was an LGBT Christian church. The donation showed not only that Coors gave to the church, but that it openly handed its own property to an LGBT institution in an apparent act of kindness. Coors maintained that the boycott was an unfair attack, which suggests the company wished to present itself as a pro-LGBT business.[56]

Coors found loopholes to evade the strikers’ efforts. Its philanthropy to the church was one method of achieving this. Another took the form of donations to AIDS-related charities when the epidemic took root in America in the mid-1980s.[57] Though profits may have decreased in Western states, in Eastern states demand for Coors only increased. Reporting on Coors’s acquittal on a trade-restriction charge, the Fort Collins Coloradoan described how the beer sold so quickly that it was not even being refrigerated.[58] Because of the beer’s scarcity in Eastern states, Coors commanded prices of fourteen to eighteen dollars per case.[59] This suggests that Coors still held a captive audience in states where the boycott was not as prolific, and implies it capitalised on these deficits to ensure the company did not run at a loss. Although these actions were soft forms of defence against the boycott, they emphasise how Coors was ultimately compelled to yield to the LGBT movement: it was left with little choice other than to make concessions in its employment practices and in the social conditions for LGBT people outside the factory walls.

By taking their grievance with Coors outside the parameters of the factory floor, the workers ensured that the LGBT community was invited into the dispute, for the workers were keen to emphasise that both groups understood what it was like to be oppressed. Spreading the word through newspaper articles, posters and badges, a network of key figures who held mutual respect for one another, along with an inter-connected network of bars, allowed the LGBT community to utilise its economic agency by withholding custom from Coors, which promoted its citizenship. Crucially, the labour–gay pact that stemmed from moving the strike onto the streets brought Coors’s unethical practices into the public domain, something Coors was compelled to respond to given the quantifiable impact on its reputation as the top beer seller in the Western states.

Tracking Progression: Labour–Gay Alliances, post–Coors Beer Boycott

This next section considers the scope of acceptance of homosexuality after the boycott began, and argues that the boycott did not create universal acceptance of LGBT people. On 28 June 1977, The San Francisco Examiner published a side story on its twenty-second page about a young man from Chicago who had been raped. The man was a taxi driver who, on the night in question, was stopped by two male customers. As they got into the car, they informed him that this was a heist and that ‘they [sic] have a .38 right here and if you see it, it will be the last thing you ever see’.[60] Taking control of the taxi, the two men drove around picking up passengers with the intention of stealing their possessions. Eventually, the taxi was stopped and, to ensure the victim would not go to the police, he was told he had to do something for them: ‘he’ll never cop (admit) to this. It will make him feel queer’.[61] The taxi driver was raped by the two men on the assumption that he would not report them to the police because he would feel homosexual. Indeed, the victim did not want people to find out, for fear he might be labelled a homosexual despite being heterosexual. The article’s publication date places it three months into the Coors workers’ strike, and its author, Roger Smith, acknowledged the active gay rights campaigns that were ongoing, such as the Florida orange juice boycott, strongly asserting the law-abiding nature of the homosexuals involved in the protest movements.[62] However, the subtext of the article suggests the boycott did little to change popular attitudes towards homosexuality: for some, its connotations still brought about feelings of shame and disgust.[63] Naturally, the boycott was limited in its scope, as its locus was where Coors was a strong market force: the West coast. The article demonstrates that attitudes towards a person’s moral integrity, specifically the perceived maxim that homosexuality was something perverse, had not wholly shifted after the boycott, despite labour–gay pacts promoting a shared understanding of oppression. What the boycott did provide, however, was an effective method of demonstration, linking economic agency with social movements to vilify homophobic commercial figures or products.

One boycott which has received heavy scholarly analysis is the boycott of Florida’s orange juice, whose main commercial figurehead was the singer and model Anita Bryant. Her fundamentalist Christian values and strong anti-homosexual attitudes led her to run the Save Our Children campaign, which aimed to overturn laws protecting Florida’s LGBT community from discrimination in housing, employment and public accommodation. Interestingly, the response to Save Our Children was overwhelmingly negative.[64] Gay bars retaliated against these initiatives by banning orange juice, preferring to serve vodka with apple juice instead. The politics of this boycott followed a pattern similar to the Coors boycott: economic withdrawal from a homophobic organisation, and social mobilisation in the community to endorse the boycott and bolster support for the gay rights movement. However, this does not mean the gay rights movement should be viewed as moving in steady, positive correlation towards full equality, as is reflected in the origins of the orange juice boycott: it was a retaliation against homophobic institutions. Though the Coors boycott therefore provided a blueprint for campaigning effectively against anti-LGBT establishments through the promotion of LGBT economic agency, it did not produce a broad consensus amongst Americans to change their attitudes towards the gay rights movement.

The Coors boycott did produce some level of national support for the gay rights movement, insofar as further boycotts such as the orange juice boycott were spearheaded by a labour–gay alliance. However, some of the United States’ more conservative attitudes towards a person’s perceived moral integrity were not so easily dissipated through social boycotts. Following the Save Our Children campaign in Florida, California state senator John Briggs devised Proposition 6, which aimed to remove all gay and lesbian teachers from working in California’s public schools. Colloquially coined the ‘Briggs Initiative’, the plan also received overwhelmingly negative responses. Opposition came from figures such as former California Governor Ronald Reagan and President Jimmy Carter; critics also included the AFL-CIO and the Coors Boycott Committee. Al Gruhn, president of the California Federation of Labor, suggested it would ‘cause a witch hunt and destroy the basic functions of our education system.’[65] This pledge of support towards fighting homophobia in other aspects of LGBT life suggests that the labour–gay alliance was mutually beneficial: labour ostensibly recognised the daily struggles that lay beyond the oppressive conditions of Coors’s factory floor. The alliance showed continuing support for LGBT social mobility in a political dispute that affected the LGBT community’s rights as American citizens.

Despite the budding relationship between striking workers and gay boycotters, they had been unsuccessful in challenging the Christian values of the American status quo. Elizabeth Armstrong suggests this was a consequence of the United States’ federal governance.[66] Although the LGBT community was a pseudo-political organisation that could express its attitudes against the status quo, the federal nature of governance often made nationwide change a slow process, because it was harder to implement pro-LGBT policies on a national scale.[67] The bureaucracy, in essence, ensured that the government’s fundamentally conservative views stunted LGBT acceptance.

Although the Coors boycott provided a systematic method for campaigning against discriminatory institutions, through the forging of labour–gay relations and the withholding of gay economic agency, it could not transform the United States’ psyche into something overwhelmingly pro-LGBT, owing to the entrenched heterosexual binary in individual and federal politics.[68] Even the assassination of Harvey Milk in 1978 showed little sign of instigating a complete overhaul of the American psyche: his small obituary shared a page with a large Christmas advertisement informing the reader where to get the best, most cost-effective suit.[69]

Creating a labour link helped increase LGBT visibility. Ultimately this developed a relationship between workers and a gay community who could go on to tackle further discriminatory practices, both of economic giants and of individuals. Indeed, whilst the relationship forged by the Coors boycott supplied a method for tackling discrimination in the working environment, it was not wholly successful in transforming Americans’ attitudes towards homosexuality. Jeffrey Weeks noted that it is important not to examine the history of gay rights in a linear fashion, because it was not one long path towards full political, social and economic equality.[70] Moreover, Michel Foucault asserted that individual and collective notions of sexual identity were paradoxically built from the oppressive power which denied their existence.[71] What this highlights, however, is that examining case studies shows how LGBT protesters navigated the dichotomies of oppression they faced in each particular incident. Given that each campaign focused on a different trigger (be it a homophobic alcohol brand, a commercial figurehead or homophobic legislation), each had to tackle what sparked that campaign in the first place.


It was not until 1985, when Robert H. Chanin, general counsel of the National Education Association (one of the largest union organisations in the United States), met Peter H. Coors, president of Coors’s brewery division, that plans for an end to the boycott were discussed.[72] The New York Times commented on the new-found necessity for labour forces and management to see fit to end the strike and the subsequent boycott:

It [is] a classic tale of labour-management [sic] relations—of two enemies slinging arrows at each other for years, until, battered by a changing economy, they need each other badly enough to compromise.[73]

By this point both men were keen to see an end to the strife; ‘the AFL-CIO had been caught up in implementing the boycott, not ending it.’[74] It was not until 1987, ten years after the first action was taken, that the boycott was brought to an end. How was it, then, that a boycott which initially attracted only small interest, both in terms of its media representation and the strikers themselves, sustained itself as a ten-year ‘political fight over beer’?[75]

This article has examined the 1977 Coors beer boycott as a case study for understanding the interplay of labour–gay alliances in the battle for LGBT social mobility and consumer citizenship. The LGBT consumer rights and economic agency which developed in gay bars, some of the only openly homosexual spaces available during the mid-twentieth century, made homosexuals useful allies for the strikers. The rise of the gay rights movement at the close of the 1960s, and San Francisco’s unique position as an ostensibly homosexual town, created a receptive audience for the boycott. The LGBT community, like the strikers, was born from a generation who used protests to campaign for full equality, and these direct-action protests were deployed with some degree of success. Workers and homosexuals alike utilised them to campaign for equality for workers overall.[76]

Through an analysis of the boycott and of social networks in 1970s America, this article offers two significant conclusions. Firstly, by examining the language used in newspapers, trade union flyers and cultural iconography, it has demonstrated that the ensuing labour–gay alliance allowed an economic dispute over employment to transform into a social movement, away from the factory floor and onto the streets of San Francisco. The Tavern Guild’s agreement to ban Coors from San Francisco’s gay bars not only presented a rejection of Coors’s practice of invading workers’ privacy; it also hit Coors’s sales and profits. Moreover, newspaper interviews by activists such as Harvey Milk, and pamphlets written by Local 366, carefully selected the language they used when describing Coors’s employment practices. That language was deliberately hyperbolic, stressing the indecency of violating workers’ labour and human rights, and thereby informing those outside the factory walls precisely why the strike deserved support. Moving the strike onto the streets through a boycott meant Coors could not ignore the situation and had to respond through philanthropic donations to LGBT organisations. This left Coors with little choice but to rebrand itself as a pro-worker and pro-LGBT company.

Secondly, the use of the gay bars as establishments in which a homosexual identity could develop was also significant in building up gay economic agency.[77] As some of the only open spaces available to homosexuals, gay bars gave their customers the freedom to choose what they drank. Crucially, the customer’s choice was made not only on a financial level, but on a political level too. LGBT customers therefore legitimised their American citizenship through this synthesis of economic and political matters within their daily lives. The labour–gay alliances, which promoted and utilised the economic agency of the community, formed a blueprint for protest against other homophobic individuals or organisations. This was repeated when gay bars removed Floridian orange juice to signify their contempt for Bryant’s homophobic ideology. Though the boycott did not produce an immediate national consensus of support, it did provide a method by which the LGBT community could advance its social mobility towards the prospect of equality on a case-by-case basis.

As Frank has suggested, labour–gay alliances linked two seemingly different groups into an entity that could become mutually beneficial.[78] While Chasin has commented that boycotts denote a captive gay market, she concludes that boycotts limit homosexual progression because individual choices do not constitute political legislation.[79] This paper has offered an alternative argument, suggesting that the LGBT community’s withholding of its economic agency and consumer rights emphasised that its members had the same rights to property as other American citizens. As a social movement, the exercise of economic free will only enhanced the political agenda and identity nurtured by 1960s protests, which had highlighted that the LGBT community was also excluded from white hegemony.[80] Withholding expenditure from a homophobic organisation therefore highlighted their citizenship in American society, especially when Coors’s profit loss became a tangible effect of a labour–gay assault on a homophobic, anti-labour organisation, demonstrating that the boycott was a dispute which could not be ignored. The Coors boycott took LGBT consumers out of their bars and onto the streets of San Francisco, where they could openly throw away their beer.



Appendix 1:

1a: Flyer cover written by Local 366 advertising their strike against Coors beer. For reference, please go to: Digital Public Library of America (DPLoA), Eduardo Margo, 30 August 1977 [online archive] <>, accessed on 16 November 2018.

1b: The overleaf of appendix 1a, the informative bulletin explaining to the recipient why they should boycott Coors beer. Please see: Digital Public Library of America (DPLoA), Eduardo Margo, 30 August 1977 [online archive] <>, accessed on 16 November 2018.


Appendix 2:

A compiled list of gay and lesbian bars in San Francisco. Please note: this list accounts for establishments founded from the 1960s up until 1977; only bars with complete dates of opening and closure have been included; bars are listed in ascending address order. Full credit for this list goes to the Uncle Donald’s Castro Street online archive, without which I would not have been able to compile such a comprehensive list of gay bars in the city. For the full table, please see: Uncle Donald’s Castro Street (UDCS), Uncle Donald, 12 January 2012, Castro Area Bars, [online archive] <>, accessed on 9 November 2018.

Bar Address Approx. date of open and closure
Twin Peaks 401 Castro 1973–open
Twilight 456 Castro 1971–1972
Dirty Dick’s 456 Castro 1973–1975
Le Bistro 456 Castro 1976
Nothing Special 469 Castro 1972–1984
Toad Hall 482 Castro 1971–1979
Elephant Walk 500 Castro 1975–1996
Midnight Sun 506 Castro 1971–1972
City Dump 506 Castro 1973
Midnight Sun (moved to 18th Street in 1981) 506 Castro 1974–1981
Mistake 3988 18th St. 1971–1976
Corner Grocery Bar 4049 18th St. 1973–1978
Village 4086 18th St. 1976–1988
Watergate West 4121 18th St. 1973–1974
BADLANDS 4121 18th St. 1975–1999
I-Do-No 4146 18th St. 1967–1968
Honey Bucket 4146 18th St. 1969–1971
Pendulum 4146 18th St. 1971–2005
Libra 1884 Market St. 1967–1972
Tree House 1884 Market St. 1972–1973
JB’s House 1884 Market St. 1973–1974
The Mint 1942 Market St. 1968–open
Naked Grape 2087 Market St. 1972–1975
Tool Box 2087 Market St. 1976
Hustle Inn 2087 Market St. 1976–1977
Rear End Bar – at Tuck Stop 2100 Market St. 1974–1976
Mind Shaft 2140 Market St. 1973–1977
Alfie’s 2140 Market St. 1977–1983
Cardi’s 2166 Market St. 1977
Bal ony (Balcony) 2166 Market St. 1977–1983
Purple Pickle 2223 Market St. 1972–1977
Shed (after hours) 2275 Market St. 1972–1977
Missouri Mule 2348 Market St. 1963–1973
Hombre 2348 Market St. 1973–1979
Scott’s Pit (Lesbian) 10 Sanchez 1971–1984
Caracole 3600 16th St. 1976–1979



[1] Digital Public Library of America (DPLoA), Eduardo Margo, 30 August 1977 [online archive] <>, accessed on 16 November 2018.

[2] The flyer’s cover and overleaf can be viewed in the appendices.

[3] M. Frank, Out in the Union: A Labor History of Queer America (Philadelphia, 2015), p. 79; [Anon], ‘Coors Bolsters Boycott’, Santa Ana Register, 22 April 1977, p. 48; R. West, ‘Coors Charges Brewery Union Workers’, The Los Angeles Times, 9 May 1978, p. 46.

[4] Ibid.

[5] Ibid.

[6] M. Moskowitz, ‘A Political Fight Over Beer’, The San Francisco Examiner, 18 April 1976, p. 104.

[7] J. E. Black and C. E. Morries III, Harvey Milk, An Archive of Hope: Harvey Milk’s Speeches and Writings (London, 2013), p. 18.

[8] Black and Morries III, Harvey Milk, An Archive of Hope, p. 18; S. Hall, Peace and Freedom: The Civil Rights and Antiwar Movements of the 1960s (University of Pennsylvania Press, 2005); J. D. Suran, ‘Coming Out Against the War: Antimilitarism and the Politicization of Homosexuality in the Era of Vietnam’, American Quarterly, 53 (2001), pp. 452–88; P. Lewis, Hardhats, Hippies and Hawks: The Vietnam Antiwar Movement as Myth and Memory, reviewed in P. Joseph, Peace and Change: A Journal of Peace Research, 40 (2015), pp. 272–76; b. hooks, Feminist Theory: from Margin to Center (Oxford, 2015), pp. 18–19.

[9] E. Armstrong, ‘Movements and Memory: The Making of the Stonewall Myth’, American Sociological Review, 71 (2006), p. 725; Frank, Out in the Union, pp. 76–77.

[10] J. D’Emilio, Sexual Politics, Sexual Communities: The Making of a Homosexual Minority in the United States, 1940–1970 (London, 2nd Ed., 1998), p. 4; Armstrong, ‘Movements and Memory’, p. 725.

[11] D’Emilio, Sexual Politics, Sexual Communities, p. 4.

[12] The aftermath of the riots at the Stonewall inn became a turning point in homosexual vernacular; homosexuals began to use the previously pejorative term ‘gay’ as a marker of their identity. See Armstrong, ‘Movements and Memory’, p. 725.

[13] Frank’s insightful study of the relationship between labour forces and gay activists constitute some of the only concrete research into the Coors boycott. Her work has been invaluable to this thesis. For more of the relationship between gay activists and workers see Frank, Out in the Union, p. 8.

[14] D. Meyer, N. Whittier and B. Robnett, Social Movements: Identity, Culture, and the State (Oxford, 2002), p. 121.

[15] D’Emilio, Sexual Politics, Sexual Communities, p. 4; Armstrong, ‘Movements and Memory’, p. 725.

[16] N. Boyd, Wide-Open Town: A History of Queer San Francisco to 1965 (Berkeley, 2003), p. 160.

[17] Frank, Out in the Union, pp. 76–77.

[18] M. Foucault, The History of Sexuality (London, Vol. 1, 1978), pp. 83–85.

[19] B. Shepard, ‘Bridging the Divide Between Queer Theory and Anarchism’, Sexualities, 13 (2010), p. 516.

[20] A. Chasin, Selling Out: The Gay and Lesbian Movement Goes to Market (New York, 2000), p. 161.

[21] ‘Homosexuality’ was the term used to define someone who had a sexual attraction to a person of the same gender. The binary of what constituted a man and what constituted a woman focused heavily on gendered expectations. Chauncey offers an insightful examination into this perceived axiom in 1930s America; Canaday tracks the progression of categorising homosexuality as a political state, cemented post-Second World War, and how this helped construct a homosexual–heterosexual binary. Please see G. Chauncey, Gay New York: Gender, Urban Culture, and the Making of the Gay Male World 1890–1940 (New York, 1994); M. Canaday, The Straight State: Sexuality and Citizenship in Twentieth-Century America (London, 2009); Foucault, The History of Sexuality, pp. 77–89.

[22] J. Scott, ‘The Evidence of Experience’, in H. Abelove et al. (eds), The Gay and Lesbian Studies Reader, (New York, 1993), pp. 397–415; Foucault, History of Sexuality, p. 79.

[23] M. Stein, City of Sisterly and Brotherly Loves: Lesbian and Gay Philadelphia, 1945–1972 (Chicago, 2000), pp. 277, 279, referenced in S. Hall, ‘Protest Movements in the 1970s: The Long 1960s’, Journal of Contemporary History, 43 (2008), p. 662.

[24] Hall, ‘Protest Movements in the 1970s’, p. 657.

[25] After the Second World War, many of those soldiers who had been expelled from the army due to homosexual activity moved to cities such as San Francisco with the hope of starting a new life. For many, the fear of their community discovering their homosexuality was a risk they did not want to take. For more information, please see Boyd, Wide-open Town, p. 5; D’Emilio, Sexual Politics, Sexual Communities, p. 39; Canaday, The Straight State (New York, 2009).

[26] Boyd, Wide-open Town, p. 5.

[27] Historians of sexuality such as Craig Loftin and Matt Houlbrook suggest that homosexuals during the early to mid-twentieth century often wore a mask of heterosexuality whilst in the public sphere. This was common practice in both Britain and America as a method of ensuring homosexuals appeared to conform to the gendered expectations society required of them. The mask was always worn, except in their homes and upon entry to a gay bar or drag hall. For more, see C. Loftin, Masked Voices: Gay Men and Lesbians in Cold War America (New York, 2012), p. 11; M. Houlbrook, ‘Lady Austin’s Camp Boys: Constituting the Queer Subject in 1930s London’, Gender Studies, 14 (2002), pp. 31–61.

[28] R. Patterson, ‘The Dreary Revels of S.F. “Gay” Clubs’, The San Francisco Examiner, 25 October 1969, p. 5.

[29] Ibid.

[30] C. Agee, ‘Gayola: Police Professionalization and the Politics of San Francisco’s Gay Bars, 1950–1968’, Journal of the History of Sexuality, 15 (2006), pp. 462–465.

[31] Ibid.

[32] Patterson, ‘The Dreary Revels of S.F. “Gay” Clubs’, p. 5.

[33] A compiled list of gay and lesbian bars in San Francisco can be viewed in the appendix. Please note: the list accounts for establishments that opened between 1960 and 1977, and only contains bars for which approximate opening and closing dates are known. Bars are recorded in ascending address order. Full credit for the information goes to Uncle Donald’s Castro Street Archive, without whom I would not have such a detailed account of gay bars in the Castro Street area at the time of the boycott. To view all bars in order, please see Uncle Donald’s Castro Street (UDCS), Uncle Donald, 12 January 2012, Castro Area Bars, [online archive] <>, accessed on 9 November 2018.

[34] Boyd, Wide-Open Town, p. 160.

[35] Patterson, ‘The Dreary Revels of S.F. “Gay” Clubs’, p. 5.

[36] Boyd, Wide-Open Town, p. 160.

[37] H. Gold, ‘A Walk on San Francisco’s Gay Side’, New York Times, 6 November 1977, referenced in Black and Morris III, Harvey Milk, An Archive of Hope, p. 19.

[38] Gold, ‘A Walk on San Francisco’s Gay Side’.

[39] Frank, Out in the Union, p. 78.

[40] Frank, Out in the Union, p. 78.

[41] Uncle Donald’s Castro Street (UDCS), Allan Braid, 19 May 2007, [online resource] <>, accessed on 9 November 2018.

[42] Sweeney, ‘The Growing Alliance’, p. 32.

[43] Harvey Milk, ‘Reactionary Beer’, Bay Area Reporter, 18 March 1976, referenced in Black and Morris III, Harvey Milk, An Archive of Hope, p. 125.

[44] Black and Morris III, Harvey Milk, An Archive of Hope, pp. 125–26.

[45] Black and Morris III, Harvey Milk, An Archive of Hope, p. 18.

[46] [Anon], The Billings Gazette, (Montana), 12 August 1979, p. 55.

[47] In the introduction to his book, Letters to ONE: Gay and Lesbian Voices from the 1950s and 1960s, Craig Loftin notes the power letter writing to LGBT newspapers had for homosexuals. For those who were not members of a homophile group, letter writing provided an opportunity to express their own understandings towards the treatment of homosexuals, as well as an opportunity to participate in some of the only networking organisations that allowed homosexuals from across the United States to express their attitudes and talk to others who arguably understood the difficulties faced. Also, letters offer a glimpse into the perceptions of homosexuality at a grassroots level. For more information, please read C. Loftin, Letters to ONE: Gay and Lesbian Voices from the 1950s and 1960s ([London], 2012). LGBT newspapers such as The Empty Closet frequently encouraged their readers to write in with their day-to-day concerns, socio-political issues and viewpoints. For examples of this, please see The Empty Closet’s archive through River Campus Libraries (RCL), Empty Closet: Past Issues [online archive] <>, for monthly issues dated 1971–2014.

[48] Loftin, Masked Voices, pp. 4, 6–7.

[49] Frank, Out in the Union, pp. 76–77.

[50] R. West, ‘Coors Charges Brewery Union Workers’, The Los Angeles Times, 9 May 1978, p. 46; Moskowitz, ‘A Political Fight Over Beer’, p. 104.

[51] Moskowitz, ‘A Political Fight Over Beer’, p. 104.

[52] Ibid.

[53] Ibid.

[54] [Anon], ‘Coors to Replace Striking Workers with Permanent Help’, Colorado Springs Gazette-Telegraph, 11 April 1977, p. 2.

[55] M. Gay, ‘Coors Boycotted’, The Empty Closet, September 1980, p. 8.

[56] Ibid.

[57] Frank, Out in the Union, p. 80.

[58] [Anon], ‘Jury Acquits Coors, Cheyenne Firm of Anti-Trust’, Fort Collins Coloradoan, 8 June 1978, p. 28.

[59] Ibid.

[60] R. Smith, ‘Rape—A New Angle on the Same Story’, The San Francisco Examiner, 28 June 1977, p. 22.

[61] Ibid.

[62] Ibid.

[63] Ibid.

[64] Chasin, Selling Out, p. 161.

[65] [Anon], ‘Protect Our Schools Don’t Legalize Discrimination’, The San Francisco Examiner, 3 November 1978, p. 7.

[66] Armstrong, Forging Gay Identities, p. 161.

[67] Ibid.

[68] Canaday, The Straight State.

[69] The obituary that I refer to is a narrow piece located on the page’s right-hand side. Meanwhile, an advertisement for cost-effective gentlemen’s suits takes up the rest of the page. See: The Philadelphia Inquirer, 3 December 1978, p. 17.

[70] J. Weeks, Sex, Politics and Society: The Regulation of Sexuality since 1800 (Harlow, 2nd Ed., 1989); Foucault, The History of Sexuality, pp. 83–85.

[71] Foucault, The History of Sexuality, pp. 77, 83–85.

[72] J. Tasini, ‘The Beer and the Boycott’, The New York Times Magazine, 31 January 1988, p. 6019.

[73] Ibid.

[74] Ibid.

[75] Moskowitz, ‘A Political Fight Over Beer’, p. 104.

[76] Hall, ‘Protest Movements in the 1970s’, p. 657.

[77] Boyd, Wide-Open Town, p. 160.

[78] Frank, Out in the Union, p. 8.

[79] Chasin, Selling Out, p. 161.

[80] Meyer, Whittier and Robnett, Social Movements, p. 121; J. Scott, ‘The Evidence of Experience’, pp. 397–415.



Primary Sources

[Anon], ‘Coors Bolsters Boycott’, Santa Ana Register, 22 April 1977, p. 48.

[Anon], ‘Coors to Replace Striking Workers with Permanent Help’, Colorado Springs Gazette-Telegraph, 11 April 1977.

[Anon], ‘Gimble Gifts’, The Philadelphia Inquirer, 3 December 1978.

[Anon], ‘Jury Acquits Coors, Cheyenne Firm of Anti-Trust’, Fort Collins Coloradoan, 8 June 1978, p. 28.

[Anon], ‘Protect Our Children Don’t Legalize Discrimination’, The San Francisco Examiner, 3 November 1978.

Black, J. E. and Morris III, C. E., Harvey Milk, An Archive of Hope: Harvey Milk’s Speeches and Writings (London, 2013).

Digital Public Library of America (DPLA), Eduardo Morga, 30 August 1977, [online archive] <>, accessed on 16 November 2018.

Fortune, D., ‘Gays Icy Towards Coors Courtship’, The San Francisco Examiner, 26 October 1977.

Gay, M., ‘Coors Boycotted’, The Empty Closet, 1 September 1980, p. 8.

Ledwell, T., ‘S.F. Gays Mourn Loss of Leader’, The Philadelphia Inquirer, 3 December 1978.

Moskowitz, M., ‘A Political Fight Over Beer’, The San Francisco Examiner, 18 April 1976.

Oklahoma State University Library, item oksa_phelps_11-07-0035, 1977, Edna Mae Phelps Collection, [online resource] <>, accessed on 16 January 2019, and at the Digital Public Library of America, [online archive] <>, accessed 18 November 2018.

Online Archive of California (OAC), [unknown author], [unknown date], [online archive] <>, accessed on 23 January 2019.

Patterson, R., ‘The Dreary Revels of S.F. “Gay” Clubs’, The San Francisco Examiner, 25 October 1969.

Simon, R., ‘Rape – A New Angle on an Old Story’, The San Francisco Examiner, 28 June 1977.

Tasini, J., ‘The Beer and the Boycott’, The New York Times Magazine, 31 January 1988.

The Billings Gazette, 12 August 1979.

The Empty Closet, 1 June 1978.

Uncle Donald’s Castro Street (UDCS), Uncle Donald, 12 January 1977, Castro Area Bars, [online archive] <>, accessed on 9 November 2018.

Uncle Donald’s Castro Street (UDCS), Allan Braid, 19 May 2007, [online archive] <>, accessed on 9 November 2018.

Valley News, 26 August 1977.

West, R., ‘Coors Charges Brewery Union Workers’, The Los Angeles Times, 9 May 1978.


Secondary Sources

Abelove, H. et al. (eds), The Gay and Lesbian Studies Reader (New York, 1993).

Agee, C., ‘Gayola: Police Professionalization and the Politics of San Francisco’s Gay Bars, 1950–1968’, Journal of the History of Sexuality, 15/3 (2006), pp. 462–89.

Armstrong, E., ‘Movements and Memory: The Making of the Stonewall Myth’, American Sociological Review, 71/5 (2006), pp. 724–51.

Armstrong, E., Forging Gay Identities: Organizing Sexuality in San Francisco, 1950–1994 (London, 2002).

Boyd, N., Wide-Open Town: A History of Queer San Francisco to 1965 (Berkeley, 2003).

Brick, H. and Phelps, C., Radicals in America: The U.S. Left Since the Second World War (Cambridge, 2015).

Canaday, M., The Straight State: Sexuality and Citizenship in Twentieth-Century America (London, 2009).

Chasin, A., Selling Out: The Gay and Lesbian Movement Goes to Market (New York, 2000).

Chauncey, G., Gay New York: Gender, Urban Culture, and the Making of the Gay Male World 1890–1940 (New York, 1994).

D’Emilio, J., Sexual Politics, Sexual Communities (London, 2nd Ed., 1998).

Engle, S. M., The Unfinished Revolution: Social Movement Theory and the Gay and Lesbian Movement (Cambridge, 2001).

Esterberg, K. G., ‘From Illness to Action: Conceptions of Homosexuality in The Ladder: 1956–1965’, The Journal of Sex Research, 27/1 (1990), pp. 65–79.

Foucault, M., The History of Sexuality, Vol. 1 (London, 1978).

Frank, G., ‘Discophobia: Antigay Prejudice and the 1979 Backlash Against Disco’, Journal of the History of Sexuality, 15/2 (2007), pp. 276–306.

Frank, M., Out in the Union: A Labor History of Queer America (Philadelphia, 2015).

Gosse, V. and Moser, R., The World the Sixties Made: Politics and Culture in Recent America (Philadelphia, 2003)

Hall, S., ‘Protest Movements in the 1970s: The Long 1960s’, Journal of Contemporary History, 43/4 (2008), pp. 655–72.

Hall, S., ‘The American Gay Rights Movement and Patriotic Protests’, Journal of the History of Sexuality, 19/3 (2010), pp. 536–62.

Hall, S., Peace and Freedom: The Civil Rights and Antiwar Movements of the 1960s (Philadelphia, 2005).

hooks, b., Feminist Theory: From Margin to Center (Oxon, 2015).

Houlbrook, M., ‘Lady Austin’s Camp Boys: Constituting the Queer Subject in 1930s London’, Gender Studies, 14 (2002), pp. 31–61.

Joseph, P., Peace and Change: A Journal of Peace Research, 40 (2015), pp. 272–76.

Krupat, K. and McCreery, P., ‘Homophobia, Labor’s New Frontier? A Discussion with Four Labor Leaders’, Social Text, Out Front: Lesbians, Gays, and the Struggle for Workplace Rights, 61 (1999), pp. 59–72.

Loftin, C., ‘Unacceptable Mannerisms: Gender Anxieties, Homosexual Activism, and the Swish in the United States, 1945–1965’, Journal of Social History, 40/2 (2007), pp. 577–96.

Loftin, C., Letters to ONE: Gay and Lesbian Voices from the 1950s and 1960s ([London], 2012).

Loftin, C., Masked Voices: Gay Men and Lesbians in Cold War America (New York, 2012).

Meyer, D., Whittier, N. and Robnett, B., Social Movements: Identity, Culture, and the State (Oxford, 2002).

Roque Ramírez, H. N., ‘“That’s My Place!”: Negotiating Racial, Sexual, and Gender Politics in San Francisco’s Gay Latino Alliance, 1975–1983’, Journal of the History of Sexuality, 12/2 (2003), pp. 224–58.

Shepard, B., ‘Bridging the Divide Between Queer Theory and Anarchism’, Sexualities, 13 (2010), pp. 511–27.

Stein, M., ‘Theoretical Politics, Local Communities: The Making of US LGBT Historiography’ GLQ: A Journal of Lesbian and Gay Studies, 11/4 (2005), pp. 605–25.

Suran, J. D., ‘Coming Out Against the War: Antimilitarism and the Politicization of Homosexuality in the Era of Vietnam’, American Quarterly, 53 (2001), pp. 452–88.

Sweeney, J. J., ‘The Growing Alliance Between Gay and Union Activists’, Social Text, Out Front: Lesbians, Gays, and the Struggle for Workplace Rights, 61/4 (1999), pp. 31–38.

Turner, W. B., ‘Review: Nan Boyd, Wide-Open Town: A History of Queer San Francisco to 1965’, Journal of American History, 91/1 (2004), pp. 264–66.

Weeks, J., Sex, Politics and Society: The Regulation of Sexuality since 1800 (Harlow, 2nd Ed., 1989).

Book Review: B. Simms, Britain’s Europe: A Thousand Years of Conflict and Cooperation (London, 2016)

In this article, Robert reviews Britain’s Europe: A Thousand Years of Conflict and Cooperation by Brendan Simms, published immediately prior to the referendum on Britain’s membership of the European Union in 2016. The book challenges the existing historical tradition that places Britain as exceptional due to its insular geography and instead gives an account of the centrality of European relations to British home and foreign policy, in the form of a narrative from the medieval period to the present, concluding with a section on modern relations with the European Union. The result is a stimulating read, though it is not without shortcomings, most notably in relation to the brisk treatment given to the British Empire.



Robert Frost

Author Biography

Robert Frost (@RobertF32691246) is a first-year AHRC-funded doctoral student with joint Geography and History department supervision for his research on Georgian and early Victorian travel and exploration in the Eastern Mediterranean.

B. Simms, Britain’s Europe: A Thousand Years of Conflict and Cooperation (London, 2016)

By suggesting that the history of England, and later that of the United Kingdom, has been one predominantly determined by its relationship with neighbouring Europe, as opposed to its geographical separation as an island, Brendan Simms propounds a subtle, not entirely original but stimulating paradigm shift in how British history should be viewed, though by no means one without problems. Britain’s Europe offers a longue durée of over one thousand years of political history, which covers both Britain’s international relations and its own constitutional development. Simms has two central arguments. First, British foreign policy has consistently been based on a grand strategy of preventing continental Europe from being dominated by a single power, especially in the Low Countries, though later moving east to an obsession with Halford Mackinder’s heartland theory. This was achieved time and again by the country building coalitions to oppose an expansionist power, whether King Philip II’s Spain or Napoleon’s France. Second, the form of the United Kingdom’s own political geography has been primarily forged in response to its engagement with Europe. Simms traces the emergence of the English nation-state to Alfred the Great’s opposition to the Danes and interprets the Union of the Crowns and the Acts of Union as efforts to expand the resources of England and prevent encirclement by France. By contrast, the British Empire is portrayed solely as a means to increase Britain’s standing in Europe rather than as a legitimate enterprise in its own right. Simms also challenges other quasi-isolationist approaches, in particular the ‘Our island story’ narrative, as particularly grotesque distortions of a reality in which Britain has far more often than not been part of a cross-channel state in some form.[1] Though these ideas do not totally convince, they parallel other authors’ attempts at provincialisation.
Simms’ lineage includes Hugh Kearney’s call for a four-nation ‘Britannic’ alternative to ‘self-contained’ histories of England, an approach widened again by Norman Davies’ efforts to set the whole of the British Isles in its European context, which is ultimately Simms’ starting point.[2] Perhaps the ultimate provincialisation was Brotton’s consideration of Elizabethan England/Britain in its relation with the geographically-proximate Islamic world, though like Simms, he summarises his approach as being to enrich British history rather than diminish it.[3]

Britain’s Europe consists of ten chapters, which are evenly-spaced chronologically after a brief account of the medieval period. Four-fifths of these offer a chronological narrative of Britain’s history, with interactions with Europe given the centre stage. ‘The Bonds of Christendom’ recounts English/British–European relations up to the fifteenth century, starting in quite a traditional manner with Alfred’s response to Viking raiders leading to the formation of the English nation-state.[4] Simms reinterprets the Cinque Ports as a ‘cross-channel ferry service’ linking the Anglo-Norman/French, and later Angevin, domains.[5] Simms notes that John of Gaunt (Ghent), whose speech is held in high regard by insular-focused historians such as Christopher Lee, had French origins, as did many nobles, while a common Christian culture provided the basis for crusader alliances.[6] ‘A piece of the continent’ traces the origins of the aforementioned grand strategy to the national soul-searching that followed England’s defeat in the Hundred Years’ War.[7] The critical importance of the Low Countries, described as the ‘counter-scarp’ by William Cecil and ‘outworks’ by others, takes shape in an age in which England’s navy had neither the technology nor the ability to intercept a cross-channel force; the Channel could only be a second line of defence.[8] Hence England made common cause with the Dutch early on.[9]

‘The bulwarks of Great Britain’ introduces the importance of Germany and its various incarnations, starting with the Holy Roman Empire, as a key counterbalancing power. Simms also argues that the overlooked union of ‘Hanover-Britain’ was a truly European state.[10] He includes the interesting vignette that before the late eighteenth century, those referring simply to ‘The Empire’ meant the Holy Roman Empire; even when the expanding British Empire was in mind, it was regarded as valuable only in terms of the increased strength it could bring to bear in Europe, especially in territorial swaps such as those after the Seven Years’ War.[11] ‘The age of revolution’, on the French and American revolutionary wars, serves as a warning as to what could happen when Britain sidelined continental engagement in favour of an imperial ‘blue water’ approach: the ‘first’ British Empire was partitioned.[12]

‘The age of Napoleon’ recounts what may be the best-known pre-twentieth-century example of an isolated Britain bringing together a grand coalition and leading it to eventual victory.[13] Simms introduces the ‘fiscal-military’ state as a key advantage that Britain had over rival states, especially France. By way of an ‘implicit contract’ that had grown up between political elites and private finance over the preceding century, the country was able to tap into private wealth generated during the Industrial Revolution by way of credit. In turn, parliamentary democracy gave the British state greater legitimacy than others.[14] Simms also finds the threat from revolutionary France to be decisive in leading to the Act of Union with Ireland in 1800.[15] ‘Britain and Europe in the age of nationalism’ surveys the long nineteenth century, during which Britain was forced to contend with an acquiescent German confederation morphing into a rival German Empire under Bismarck, a transformation which made the self-centred British guarantee of Belgian independence and neutrality against France dangerously anachronistic in 1914.[16]

‘Britain and Europe in the age of total war’ covers Britain’s handling of the ‘German Question’: mobilising a global coalition to prevent the domination of Europe by Germany in two world wars.[17] As in 1792-1815, Simms holds Britain’s parliamentary and ‘fiscal-military’ state as key, a conclusion also recently reached by Adam Tooze.[18] Irish independence, however, is ignored. The final chronological chapter is devoted to events since 1945, in which Britain faced a ‘negotiated merger’ with the European Economic Community and European Union rather than a ‘hostile takeover’, which, unlike earlier Acts of Union, diluted power in Westminster.[19] Simms is critical of the chances Britain might have had in the nascent European Coal and Steel Community, maintaining that such a move would have been catastrophic for domestic industry and still-strong Commonwealth links.[20]

The final two chapters break the chronological structure to bring in an analysis of present and future trends. The first, referring to Britain as ‘the last European great power’, provides a welcome critique of the post-war ‘declinist’ discourse which has dominated so much of recent historiography, often closer to ideology than reality.[21] The final chapter differs from previous ones in offering what comes across as an attempt to opt out of expressing a concrete position on the referendum campaign then in its final stages: a quixotic call for a radically-reformed English-speaking federal EU. Simms emphasises the need for this to be created in a sudden ‘event’ in the manner of Bismarck, as opposed to the ever-closer-union ‘process’.[22] In fact, it is an argument that Simms has forwarded on several occasions, both before and after the publication of Britain’s Europe, most recently presenting Emmanuel Macron as the new Bismarck.[23] It is also a watered-down summary of the manifesto of the Project for Democratic Union think tank, though Simms omits any mention of the group or of his own presidency of it.[24] Despite this, the call seems cavalier and in conflict with the rest of Britain’s Europe. Recognising that Britain would be unlikely to join a fully-federal ‘superstate’, even an English-speaking one, he brushes aside the concerns raised by his millennium-long British grand strategy thesis by insisting that relations would be friendly due to mutual self-interest.[25] This has not, however, stopped grandstanding during the current Brexit negotiations. The idea that a majority of Europeans would vote to relinquish any remaining national sovereignty appears unlikely, especially given the massive opposition to issues such as the proposal to overcome the shortcomings of the Dublin regulations by way of EU-directed settlement of migrants in Hungary and other central and eastern European countries.
The reader is left puzzled as to why Simms seemingly disowns his own arguments of thousand-year precedent for the future. A comparison with the strong federal nature of Germany also makes the reader wonder whether the apparently hyperdynamic British model is the best option.

The principal weak point in Simms’ argument, however, is surely the secondary role he gives to the British Empire. Though Simms mentions kinship links between members of the medieval English elite and Europe, his primarily political perspective leaves little room for considering that most kinship links in the nineteenth and twentieth centuries were imperial due to emigration.[26] There are also more strictly political shortcomings. The argument that expansion of the British Empire was due to a desire to strengthen Britain’s place in Europe overlooks eagerness for colonial plunder. The Scramble for Africa culminating at Fashoda, the acquisition of Cyprus, and the exchange of Helgoland for faraway Zanzibar brought tense Anglo-French relations, Turkish alignment with the Central Powers and a strengthened Germany.[27] Likewise, Britain’s first twentieth-century alliance was with Japan, in part to bolster its interests in China against Germany and Russia. By writing British possessions in the Mediterranean off as imperial, Simms marginalises them in favour of Northern Europe, especially the ‘German Question’, thus missing the extent to which that sea became a ‘British lake’ up to the mid-twentieth century, causing Italian hesitation in entering both world wars.[28] The idea that British decolonisation was swift, clean and driven by a desire to keep up appearances in Europe also ignores the renewed enthusiasm for empire after 1945, the drawn-out nature of decolonisation in Kenya and the impact of US pressure.[29]

Though the abovementioned omissions are serious and provide a somewhat ironic warning over the dangers of excessive Eurocentrism, they should at the same time not mask the common ground between Simms and historians of empire such as Niall Ferguson and John Darwin. Both give Europe a central role, the former in the twentieth century in particular, while the latter goes as far as describing the American War of Independence as ‘almost a side-show’ next to the Anglophobic League of Armed Neutrality.[30] Also like Darwin, Simms’ methodology combines extensive secondary literature with plentiful primary sources (in his case mainly quotations from diplomats and politicians), and reaches a good compromise between breadth and depth, crucial to such a grand survey. One of the key strengths of the book is its treatment of the English Channel as being as much a highway as a barrier. That Britain’s frontiers lie in the Low Countries is a fascinating concept. Though some of the quotations appear metaphorical, the events that Simms recounts, from the Hundred Years’ War and Anglo-Dutch wars through to Napoleon and the twentieth century, provide a strong argument against the idea that Britain was regarded as detached from Europe by contemporaries.[31] In many cases of critique, the reader is left wanting more, rather than change. Though Simms includes an incredible twenty pages of maps at the beginning showing Britain’s long-standing territorial links with Europe, he leaves many details out.
Why certain features, such as the ‘British postal intercept station’ at Celle, were important is not fully explored.[32] More crucially, though, an expanded section on what the union of the crowns with Hanover looked like on the ground would have helped overcome the book’s social-cultural shortcomings: the reader is left assuming that since Westminster did not include Hanoverian MPs as Dunkirk once did, the trans-channel state was analogous to Anglo-Scottish relations prior to the Act of Union (1707). Similarly, that the reader can think of several examples which could have been included in Britain’s Europe, such as the Hanseatic League and the Anglo-Portuguese alliance, surely strengthens the thesis.

To conclude, Simms’ thesis is convincing, with the exception of his marginalisation of the British Empire. Even here, however, the reviewer would place this factor as of equal, rather than greater, importance to Europe. Although Simms’ manifesto seems impractical, it is at least as interesting as it is unorthodox. Overall, Britain’s Europe provides a welcome revision of Britain’s place in relation to the continent, highlighting an obsession with cooperation to win conflict on the continent at a time when many apparently believe that Britain can leave Europe altogether.


[1] B. Simms, Britain’s Europe: A Thousand Years of Conflict and Cooperation (London, 2016), p. xiii; Simms is particularly critical of Arthur Bryant for giving this narrative credibility in works such as Set in a Silver Sea.

[2] H. Kearney, The British Isles: A History of Four Nations (Cambridge, 1989), p. 1; N. Davies, The Isles: A History (London, 1999).

[3] J. Brotton, This Orient Isle: Elizabethan England and the Islamic World (London, 2017), p. 305.

[4] Simms, Britain’s Europe, pp. 1-3.

[5] Simms, Britain’s Europe, p. 4.

[6] Simms, Britain’s Europe, pp. 7-9.

[7] Simms, Britain’s Europe, pp. 20-22.

[8] Simms, Britain’s Europe, p. 30.

[9] Simms, Britain’s Europe, pp. 31-32.

[10] Simms, Britain’s Europe, p. 55.

[11] Simms, Britain’s Europe, pp. 52-69.

[12] Simms, Britain’s Europe, pp. 71-92.

[13] Simms, Britain’s Europe, p. 114.

[14] Simms, Britain’s Europe, pp. 98-110.

[15] Simms, Britain’s Europe, p. 112.

[16] Simms, Britain’s Europe, pp. 116-142.

[17] Simms, Britain’s Europe, pp. 143-144.

[18] Simms, Britain’s Europe, pp. 145-164; A. Tooze, The Deluge: The Great War and the Remaking of Global Order (London, 2014), pp. 173-217.

[19] Simms, Britain’s Europe, p. 170.

[20] Simms, Britain’s Europe, pp. 177-178.

[21] Simms, Britain’s Europe, pp. 206-218; J. Tomlinson, The Politics of Decline: Understanding Post-war Britain (Harlow, 2000).

[22] Simms, Britain’s Europe, pp. 219-227.

[23] B. Simms, ‘Towards a mighty union: how to create a democratic European superpower’, International Affairs, 88/1 (2012), pp. 49-62; B. Simms, ‘The ghosts of Europe’s past’, New York Times, 10 June 2013, p. 23; B. Simms, ‘The storm on fortress Europe: the continent’s old crises have not been resolved’, New Statesman, 24-30 November 2017, p. 29.

[24] Project for Democratic Union <>, accessed 19 April 2018.

[25] Simms, Britain’s Europe, pp. 235-236.

[26] W. S. Churchill, History of the English-Speaking Peoples: The Birth of Britain, Volume 1 (London, 1956), pp. vii-viii.

[27] T. Pakenham, The Scramble for Africa (London, 1991).

[28] C. Duggan, A Concise History of Italy (Cambridge, 1994); R. Holland, Blue-water Empire: The British in the Mediterranean Since 1800 (London, 2012).

[29] J. Darwin, Unfinished Empire: The Global Expansion of Britain (London, 2012).

[30] N. Ferguson, Empire: How Britain Made the Modern World (London, 2004); Darwin, Unfinished Empire, p. 317.

[31] For instance, Stanley Baldwin’s assertion that Britain’s frontiers lay on the Rhine or Elbe, in Simms, Britain’s Europe, p. 157, cannot be taken anywhere near literally, or as something that must be defended; rather, it implies that Britain had an interest in Germany. Harold Macmillan took a similar approach to show solidarity with India against communist China by declaring that ‘Britain’s frontiers are on the Himalayas’ in 1965, Darwin, Unfinished Empire, p. 378. However, Simms does point out that due to NATO commitments, ‘the United Kingdom’s eastern defence perimeter now effectively ran and runs along the eastern flank of the European Union’, Simms, Britain’s Europe, p. 197.

[32] Simms, Britain’s Europe, p. xxvii.


Brotton, J., This Orient Isle: Elizabethan England and the Islamic World (London, 2017).

Churchill, W. S., History of the English-Speaking Peoples: The Birth of Britain (London, 1956).

Darwin, J., Unfinished Empire: The Global Expansion of Britain (London, 2012).

Davies, N., The Isles: A History (London, 1999).

Duggan, C., A Concise History of Italy (Cambridge, 1994).

Ferguson, N., Empire: How Britain Made the Modern World (London, 2004).

Holland, R., Blue-water Empire: The British in the Mediterranean Since 1800 (London, 2012).

Kearney, H., The British Isles: A History of Four Nations (Cambridge, 1989).

Lee, C., This Sceptred Isle (London, 1998).

Pakenham, T., The Scramble for Africa (London, 1991).

Simms, B., ‘Towards a mighty union: how to create a democratic European superpower’, International Affairs, 88/1 (2012), pp. 49-62.

Simms, B., ‘The ghosts of Europe’s past’, New York Times, 10 June 2013, p. 23.

Simms, B., Britain’s Europe: A Thousand Years of Conflict and Cooperation (London, 2016).

Simms, B., ‘The storm on fortress Europe: the continent’s old crises have not been resolved’, New Statesman, 24-30 November 2017, pp. 24-29.

Tomlinson, J., The Politics of Decline: Understanding Post-war Britain (Harlow, 2000).

Tooze, A., The Deluge: The Great War and the Remaking of Global Order (London, 2014).

Project for Democratic Union <>, accessed 19 April 2018.

Beyond Antisemitism: Hungarian Ideological and Pragmatic Motivations for the Holocaust


Capitol, Capital, and the Ancient City: The Influence of Roman Urban and Architectural Models of the Design of the Capitol Building in Washington D.C.

In his 1992 landmark text Architecture, Power, and National Identity, Lawrence Vale demonstrated the extent to which government buildings ‘serve as symbols of the state’ and how one can ‘learn much about a political regime by observing closely what it builds’.[1] The Residence Act of 1790 gave the American […]

Why Were Colonial Powers Interested in Sexuality?

Why were colonial powers interested in sexuality? In his 1847 account of the Aboriginal Australians, designed to familiarise new white settlers with the indigenous population, George Angus made sure to note why the settlement of aboriginal lands was entirely justified. ‘The population of the native tribes inhabiting South Australia is not considerable’ remarked Angus, because […]

‘Othering’ and the Persistence of Imperial Attitudes: Media Representations of Ethnicity, Gender and Class in the Grunwick Dispute

In this article Phoebe Brown analyses media representations of the 1976-1978 Grunwick industrial dispute. Phoebe focuses on the role of the South Asian women involved, analysing a variety of media sources and highlighting how they emphasised particular aspects of the strikers’ identity to serve diverse political agendas: the right-wing press, for example, emphasised the women’s ethnicity and gender to undermine their position as workers and political activists so as not to disrupt their prevailing ethnocentric vision of the social order. The socialist media, on the other hand, emphasised the women’s position as workers and political activists, depicting the Union movement as inclusive of minorities. Overall, Phoebe highlights how and why the media representation of the strikers did not acknowledge the complexity of the South Asian women’s identities. The ‘othering’ of the South Asian women and the media’s reinforcement of various stereotypes demonstrates how difficult Britain found the transition to an increasingly diverse, post-colonial society. Contemporary interpretations and commemorations of the Grunwick dispute provide further evidence of how this transition may, as yet, be far from complete.

Phoebe Brown

Author Biography 

Phoebe Brown graduated from the University of Nottingham in 2017. This article formed part of Phoebe’s undergraduate dissertation supervised within the Department of History.

The Underlying Dynamics of Colombia’s Civil War

In this article Oliver Dodd examines the processes of capitalist development to account for the underlying dynamics of the Colombian Civil War (1964-2002). Oliver argues that economic development did not take place in an orderly or steady manner, but rather involved conflict and antagonism between various social-class forces engaged in a ‘struggle for hegemony’. The article, therefore, concludes that it was capitalist development in Colombia which led directly to the political violence of the Civil War.

Oliver Dodd

Author Biography 

Oliver Dodd is starting a PhD in September 2018 examining Colombia’s 2016 peace agreement. He completed a Masters in International Relations at the University of Nottingham and undergraduate studies at Aberystwyth University. His MA dissertation sought to explain the underlying dynamics of Colombia’s armed conflict. Oliver regularly conducts ethnographic research in Colombia, and as part of such fieldwork has spent five months observing the National Liberation Army of Colombia (ELN).

The “Russian” Woman? Cultural Exceptionalism among Noblewomen in Late Imperial and Revolutionary Russia

In this article Darcie Mawby poses two important questions: firstly, to what extent did cultural exceptionalism exist among Russian noblewomen in the late imperial and revolutionary periods? Secondly, were Russian noblewomen part of a transnational European elite, or is national specificity integral to understanding their identity construction? In doing so, Darcie provides important insights into the extent to which Russian noblewomen consciously engaged with national and international ideological developments related to marriage, education and adult vocations, and the impact these interactions exerted on their sense of national identity. Through a comparison with the written work of English upper-class women, particularly travel accounts of Russia, Darcie identifies points of similarity and departure which highlight instances of transnational cultural crossover and national specificity. This article offers new interpretations of cultural exceptionalism and national identity in Europe during the increasingly global nineteenth and early twentieth centuries.

Darcie Mawby

Author Biography 

Darcie Mawby is a Masters student in the Department of History at the University of Nottingham. This article formed part of her Undergraduate dissertation which was completed in the summer of 2017.

Pierre Nora, Memory, and the Myth of Elizabeth I

In this article Tom Rose explores a dominant theme in cultural history: the concept of memory. Tom argues that the concept of memory should be a vital component of early-modern studies and evaluates the applicability of the theorist Pierre Nora to the mythology of Elizabeth I.

Tom Rose

Author Biography

Tom Rose is a Midlands3Cities AHRC-funded PhD student based at the University of Nottingham. Tom’s current research explores the relationship between hunting, politics and culture in early Stuart England. This essay explores the role of memory in the production of history and was written for the University of Nottingham History Masters module ‘Research Methods in History’.

Ants and Cicadas: South American Football and National Identity

Ants and Cicadas Introduction Despite having spent centuries together as part of the Spanish colonial Viceroyalty of the Río de la Plata, the independence wars of the nineteenth century and their aftermath saw Argentina and Uruguay separate, with the creation of the latter as an independent buffer state guaranteed by the UK in 1827 to […]

Book Review: Daniel Martin Varisco’s Reading Orientalism: Said and the Unsaid


In this article David Robinson explores how historians communicate, interpret and commentate on the work of Edward Said. As David acknowledges, most Arts and Humanities students will encounter Said’s canonical work, Orientalism, at some point during their degree. For those uninitiated or inexperienced in literary criticism, however, it can be a difficult, even opaque, text. Unsurprisingly, many turn to commentaries on Orientalism; to borrow a bad pun from the work under review here, to see what has been said about Said. David argues that while Daniel Martin Varisco’s Reading Orientalism: Said and the Unsaid (Seattle, WA, 2007) is certainly a comprehensive study and is to be recommended to students as a reference work on Said, it fails in its primary aim of going ‘beyond the binary’ of East versus West.

David Robinson

Author Biography

David Robinson is a second-year, AHRC-M3C-funded PhD student supervised by the Department of History at the University of Nottingham. His thesis is entitled ‘Orientalism or Meridionism? Comparing Imperial and European Travel Writing in the Creation of British and European Identity’ and explores the British construction of Italy and India as cultural and geographical spaces contributing to British identity formation.
Reading Orientalism

Edward Said’s Orientalism is the seminal work proposing a ubiquitous ‘othering’ of the Orient by Europe, evident in canonical European literature from Aeschylus onwards, a process Said called ‘orientalising’. Said claimed that the Orient was ‘almost invented’ by the West as a feminised, exoticised, and eroticised space; an unchanging and unchangeable mirror-image of the rational, morally and culturally superior Occident.[1] ‘Orientalising’, claimed Said, was largely responsible for two centuries of European imperialism.[2] Attracting adoration and vitriol in equal measure, Said was an American scholar of Palestinian heritage, politically active in the cause of his cultural homeland.[3] Orientalism consequently has a significant political edge, polarising opinion: it is read either as a brilliant exposé of Western prejudice, or as a polemical rant which ‘invents’ the West just as surely as Said accuses the West of inventing the East. Regardless, Orientalism has remained in print since 1978 and ‘its influence can hardly be disputed.’[4] Credited by many as the founding text of post-colonialism, Orientalism remains one of the most cited academic works of modern times.[5]

Guess Who’s Coming to Dinner and Hollywood’s Misrepresentation of the Politics of Interracial Relationships in 1960s America

Guess Who’s Coming to Dinner Guess Who’s Coming to Dinner (1967) is a Hollywood film, starring Sidney Poitier as an African-American man who is engaged to Joanna Drayton, a white woman with liberal parents. The film, directed by Stanley Kramer, depicts the reactions of the couple’s parents to their prospective union, ultimately emphasising an acceptance […]