Before the 1990s, FVIII was obtained from whole blood donations (Figure 1) and then transfused into haemophilia patients. Blood transfusion began in 1818, when James Blundell, an English physician, performed a human-to-human blood transfusion. Although the patient subsequently died, Blundell remained committed to achieving a successful blood transfusion. Throughout the early 1800s, he experimented with a series of patients, over half of whom survived. This remarkable achievement may have been due to luck, as no scientific understanding of blood groups and their compatibility had yet been established. It was not until the twentieth century that the Viennese scientist Karl Landsteiner discovered the reactivity of isoagglutinins (antibodies causing clumping of red blood cells) in the blood with the red blood cells of other test subjects. This discovery led Landsteiner to identify the four main blood groups: A, B, AB and O (Alter and Klein, 2008).
Blood transfusion changed in 1940, when Dr Edwin Cohn, a protein chemist based at Harvard University, developed a technique to separate the various fractions of plasma. One fraction that proved especially useful was albumin, which was used during World War 2 (WW2) to treat soldiers who had suffered severe blood loss. Albumin kept blood vessels dilated, maintaining oncotic pressure and blood flow to the damaged areas. Also, unlike whole blood, which had to be used almost immediately after collection, albumin could be stored for long periods before use (Learoyd, 2012). Throughout the 1950s and early 1960s, haemophiliacs were treated with either whole blood or fresh plasma. However, such products lacked sufficient FVIII to stop serious bleeding. Consequently, patients with severe haemophilia died prematurely, most commonly from brain haemorrhages and post-surgery bleeds (Morfini, 2017). In 1965, the American scientist Judith Pool, building on the Cohn fractionation method, discovered cryoprecipitate. After freezing plasma and then allowing it to thaw slowly, Dr Pool identified that a thick, insoluble residue rich in clotting FVIII remained (Bevan, 2009). This residue, termed cryoprecipitate, was found to have 30% more clotting power than fresh frozen plasma (Philip et al., 2014). Cryoprecipitate was instrumental to the research conducted by the American scientists Brinkhous and Shanbrom in 1968. They wanted to produce a product that could be stored for even longer and administered at home; by dissolving cryoprecipitate further and then treating it with chemicals, Brinkhous and Shanbrom were left with a crystalline powder of concentrated clotting FVIII (Schmidt, 1999). This new purified blood product, termed FVIII concentrate, was freeze-dried during processing to prolong its shelf life and was used predominantly for the treatment of haemophilia (Krever, 1997).
However, one disadvantage arising from the manufacture of FVIII concentrates is key to the events discussed throughout this report. Unlike cryoprecipitate, where the risk of viral transmission is relatively low (each bag is made from one or two donations), FVIII concentrates are prepared from pools of blood from thousands of donors, significantly increasing the risk of contracting viral diseases such as hepatitis and human immunodeficiency virus (HIV) (Craske, Dilling and Stern, 1975).
By the 1970s, fractionators in the United States (US) had begun research into creating an ‘inactivation’ method that would reduce the risk of transmission of hepatitis B (HepB) and the virus causing AIDS. Towards the end of the 1970s, one of the most promising inactivation methods was heat treatment of factor concentrates, in which the concentrates are heated at around 70 °C for up to 72 hours. The purpose of this method is to inactivate the virus so that it can no longer function and cause infection (McGrath et al., 1985). Despite commercial fractionators spending significant resources on developing heat treatment of factor concentrates throughout the 1970s, significant strides towards a commercially available product only began in the early 1980s. The first heat-treated FVIII product was introduced in the United Kingdom (UK) in 1984 (Penroseinquiry.org.uk, 2015).
During the mid-1970s, the UK health system struggled to manage the increasing demand for FVIII concentrates. This led commercial fractionators to obtain licences permitting them to market, in the UK, FVIII concentrates sourced from paid US donors. By the 1980s, FVIII concentrates had become a profitable business for commercial fractionators, who were distributing their products to many western countries (Hagen, 1982). Almost all western European countries faced difficulties in meeting the blood product demands of their haemophilia populations, and many national governments agreed that importing commercial FVIII concentrates from America was the most effective solution. UK organisations went one step further to ensure that blood product demand was met, even in emergencies. They entered into what were known as “short supply agreements” with American hospitals and blood banks, under which supplies of recovered plasma could be sent over at short notice. In addition, plasma was bought from plasma brokers through what was known as the ‘international spot market’, where goods could be delivered almost immediately when demand unexpectedly surged (Krever, 1997). Donors who provided recovered plasma, obtained from blood banks and hospitals, were voluntary, non-remunerated individuals. Source plasma, by contrast, was usually collected by commercial fractionators from numerous donor populations, including prisons, borstal institutions, poor neighbourhoods and universities. Prisoners and homosexual men were considered a valuable source of plasma, as these donors usually had a high count of HepB antibodies; for this reason, they were integral to the manufacture of the HepB vaccine (Leveton, Sox and Stoto, 1996). As presented later in this report, such groups were found to be at high risk of transmitting HIV when donating blood through the 1980s.
In early 1984, the work of Dr Robert Gallo, an American biomedical researcher, led to the identification of the infectious agent responsible for acquired immune deficiency syndrome (AIDS). Building on Gallo’s research, a test to detect HIV in blood was developed in late 1984. HIV testing was adopted by commercial fractionators, blood banks and national transfusion services in most Western countries (such as Ireland, the USA and England) by 1985 (Schmidt, 2000).
It was found that HIV attacks the body’s immune system, specifically the CD4+ cells (T helper lymphocytes) that help to fight off infections. Without a host cell, HIV cannot grow or reproduce. Rather, the virus fuses with CD4+ cells and hijacks their replication machinery to multiply rapidly (Avert, 2017). If untreated, HIV continues to reduce the CD4+ count, leaving patients increasingly susceptible to other infections. An immune system too weakened to fight off infection indicates that a patient has progressed to AIDS, the last stage of HIV infection (Centers for Disease Control and Prevention, 2019). In 1983, it was recognised that certain groups were at higher risk of contracting AIDS than others. These included the “four-H club”, as termed by the media: haemophiliacs, homosexuals, heroin users and Haitians (Healthline, 2003). Many of these groups are also at risk of hepatitis, an inflammatory condition of the liver most commonly caused by five main hepatitis viruses: types A, B, C, D and E. Hepatitis A and E are usually contracted through the ingestion of contaminated food or water. Hepatitis D infection occurs only in patients already infected with hepatitis B. Throughout this report, the significance of hepatitis B and C during the blood contamination period will be discussed. Both viruses are usually contracted through parenteral contact with infected body fluids such as blood plasma (World Health Organization, 2019).
Once HIV testing became available in 1985, it was found that individuals suffering from haemophilia and other blood disorders, such as anaemia, blood cancer and sickle cell disease, had received contaminated blood products; many of these patients tested positive for blood-borne viruses such as HIV and hepatitis (Brecher and Hay, 2005). Statistics for the UK haemophilia population indicated that over 30% had been infected with HIV; for those with severe haemophilia, who were using commercial factor concentrates more regularly, the infection rate was much higher, at 75% (Rizza, Spooner and Giangrande, 2008). Haemophilia doctors were professionally committed to providing their patients with the treatment option believed to be most effective. As such, many continued to prescribe contaminated factor concentrates throughout the early 1980s; during the period when HIV/AIDS emerged as a risk, scientists were unable to produce adequate evidence to justify a return to earlier treatment methods such as cryoprecipitate (Crane and Kaplan, 1973). Figure 2 gives a summary of the key events that have been outlined in this chapter and those that follow.