Modern genetics: building the foundations
Almost everything we know about what it means to be biologically human ultimately depends upon a single, simple fact: that each of our cells contains a set of genetic instructions to build and maintain a working body, which is passed from parent to child to form the chain of continuity that links the generations. The information encoded in the genetic material – DNA – reveals the rich diversity of humankind. Biomedical technologies use this information to diagnose disease and design new treatments.
The research that led us to this point was one of the most significant achievements of 20th-century science. It began with breeding experiments in plants and animals, and family studies in humans, that established basic principles of inheritance long before anyone understood what a gene was. It was not until the advent of molecular biology in the mid-20th century that scientists understood the key part played by DNA, and soon afterwards it became possible to read the genes themselves.
Modern genetics combines the insights of the early geneticists with the discoveries of the molecular biologists, giving us powerful tools in the study of medicine and human populations. The use we make of these tools is fraught with ethical dilemmas, however. Beliefs about ‘blood’, ‘breeding’ and ‘race’ long pre-date the modern science of genetics and continue to exert a powerful grip. The history of modern genetics is not only an account of how societies have responded to new scientific knowledge but also an account of how societies have shaped – and continue to shape – our understanding of genetics.
Classical genetics: establishing the rules
People have understood for centuries that offspring share some of the characteristics of their parents: more than one of Shakespeare’s characters jokes about the resemblance (or lack of it) between a father and his son as a comment on the faithfulness of the wife and mother. Similarly, plant and animal breeders knew that selecting stock with desirable characteristics would produce more of those characteristics in succeeding generations. However, it was not until the start of the 19th century that the concept of ‘inheritance’ was used to describe the transmission of traits from parents to offspring. Soon afterwards, the observations of Charles Darwin (1809–1882) on the selective breeding of animals helped him to arrive at his theory of evolution by natural selection. He knew, however, that his theory lacked a convincing mechanism of inheritance.
The Augustinian friar Gregor Mendel (1822–1884), working at St Thomas’s Abbey in Brünn (modern Brno, in the Czech Republic), was the first to make a statistical study of inheritance. He crossed pea plants with contrasting characteristics – such as the colour or the smoothness of the peas – and counted the numbers of each type that appeared in subsequent generations. He found that in many cases, only one of each pair of characters (green peas versus yellow peas, for example) was ‘dominant’ and would be expressed in the hybrid offspring. Non-dominant, or ‘recessive’, characters could re-emerge in later generations, in the proportion of one in four of the offspring of two hybrid parents. His work showed that for each character, every offspring received one ‘element’ or ‘unit’ of inheritance from each of its parents. If present, the dominant element would determine the colour of that plant’s peas, for example; however, the seed or pollen of hybrid offspring could pass either a dominant or a recessive ‘unit’ to the next generation. These observations provide the basis of our understanding of inheritance to this day.
Gregor Mendel, 1866.
We find it in every case confirmed that constant progeny can only be formed when the egg cells and the fertilizing pollen are of like character…
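Mendel's one-in-four ratio can be recovered simply by enumerating the equally likely offspring of two hybrid parents. The short sketch below is an illustration, not historical code; the allele labels 'Y' (yellow, dominant) and 'y' (green, recessive) are chosen for this example.

```python
import itertools

def phenotype(genotype):
    """A dominant 'Y' masks 'y': only 'yy' plants show the recessive green colour."""
    return "yellow" if "Y" in genotype else "green"

# A Yy x Yy cross: each hybrid parent contributes 'Y' or 'y' with equal chance,
# giving four equally likely offspring genotypes.
offspring = ["".join(sorted(pair)) for pair in itertools.product("Yy", "Yy")]
counts = {p: sum(phenotype(g) == p for g in offspring)
          for p in ("yellow", "green")}

print(offspring)  # ['YY', 'Yy', 'Yy', 'yy']
print(counts)     # {'yellow': 3, 'green': 1} -- Mendel's 3:1 ratio
```

The recessive character re-emerges in exactly one of the four possible offspring, as Mendel counted in his pea crosses.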
Mendel published his work in the journal of a local natural history society in 1866, but it did not become widely known beyond plant breeding circles until 1900. By this time the Dutch botanist Hugo de Vries (1848–1935) had independently arrived at very similar conclusions. De Vries, whose rivals insisted he acknowledge Mendel’s priority, also proposed that characters were transmitted between generations by particles or units that he called ‘pangenes’, a term later modified to ‘genes’. In Britain, the naturalist William Bateson (1861–1926) coined the term ‘genetics’ and provided the first translation of Mendel’s paper into English.
Bateson’s devotion to Mendelism, as it became known, brought him into conflict with other groups of researchers who were also investigating the mechanism of inheritance. Across the Atlantic, Thomas Hunt Morgan (1866–1945) and his colleagues at Columbia University in New York used a variety of physical and chemical techniques to generate new mutations in the fast-reproducing fruit fly Drosophila melanogaster, which they bred in milk bottles and scrutinised with microscopes. The German biologist Walther Flemming (1843–1905) had already discovered thread-like structures in the cell nucleus, later called chromosomes (the name means ‘coloured bodies’; Flemming used coloured dyes to make them visible). Morgan and his colleagues demonstrated that genes were carried on chromosomes and produced the first maps that placed the genes associated with specific mutations in order along them. Long before methods of physically locating genes became available, they did this purely on the basis that mutations that were most frequently inherited together – or ‘linked’ – would lie closest to one another on the chromosome.
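The logic of this kind of mapping can be sketched in a few lines. Given pairwise recombination frequencies (the percentage of offspring in which two markers separate), the correct gene order is the one in which the distances between adjacent genes add up to the end-to-end distance. The frequencies below are invented for illustration, not Morgan's data.

```python
from itertools import permutations

# Hypothetical recombination frequencies between pairs of markers A, B, C.
dist = {frozenset("AB"): 8.0, frozenset("BC"): 12.0, frozenset("AC"): 20.0}

def d(x, y):
    """Recombination distance between two markers (order-independent)."""
    return dist[frozenset(x + y)]

def best_order(genes):
    """Pick the order whose adjacent distances best sum to the end-to-end distance."""
    return min(permutations(genes),
               key=lambda o: abs(sum(d(a, b) for a, b in zip(o, o[1:]))
                                 - d(o[0], o[-1])))

order = best_order("ABC")
print(order)  # ('A', 'B', 'C') -- B lies between A and C
```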
From eugenics to population genetics
While Bateson was slow to accept Morgan’s work, both shared a mistrust of a third major area of research into heredity: eugenics. Charles Darwin’s cousin, Francis Galton (1822–1911), had coined the term ‘eugenics’ to mean the scientific improvement of the human race through selective breeding of humans. Although it later became heavily tarnished by its association with the ‘racial hygiene’ movement of Nazi Germany, in the first few decades of the 20th century eugenics was a mainstream social and scientific movement with many proponents in Europe and North America. In 1904, Galton persuaded UCL to create the ‘Eugenics Record Office’ (later renamed the Galton Laboratory) to foster research in this field. In America, a similar facility was created as part of the Carnegie Institute’s ‘Station for Experimental Evolution’ at Cold Spring Harbor near New York. Later renamed Cold Spring Harbor Laboratory, it remains a leading research centre for genetics and genomics.
The founding director of the UCL Laboratory was Karl Pearson (1857–1936), a brilliant mathematician who developed many of the statistical techniques that are fundamental to scientific analysis today. He was also a fervent supporter of Galton’s eugenic beliefs.
Karl Pearson, 'National Life from the Standpoint of Science', 1905
You will see that my view – and I think it may be called the scientific view of a nation – is that of an organized whole, kept up to a high pitch of internal efficiency by insuring that its numbers are substantially recruited from the better stocks, and kept up to a high pitch of external efficiency by contest, chiefly by way of war with inferior races…
UCL attracted a succession of statisticians and geneticists who continued to explore the ramifications of Mendelian models of inheritance from a purely theoretical standpoint. Although the UCL ‘Biometricians’, as they were known, initially opposed Mendelism, from the 1910s onwards there was a gradual reconciliation.
Ronald Aylmer Fisher (1890–1962), who succeeded Pearson in 1933, had previously shown mathematically how Mendelian inheritance could be reconciled with the existence in populations of continuously varying characteristics such as height. J B S Haldane (1892–1964), who worked at Cambridge before becoming Professor of Genetics at UCL in 1933, worked out a mathematical basis for the rise and fall of genetic characters in populations, which drives evolution by natural selection. They, together with Sewall Wright (1889–1988) at the University of Chicago, used statistics to show how Mendelian inheritance, allowing for the occasional occurrence of new mutations, could indeed account for the evolution of species through natural selection as proposed by Charles Darwin. Biologists began to refer to this as the ‘modern synthesis’.
J B S Haldane, Perspectives in Biology and Medicine 1964;7(3):343-359.
In the consideration of evolution, a mathematical theory may be regarded as a kind of scaffolding within which a reasonably secure theory expressible in words may be built up.
One of the most distinctive and accessible examples of human individuality is blood type. In 1900 the Viennese researcher Karl Landsteiner (1868–1943) distinguished three main blood types (A, B and O); two years later his colleagues discovered a further type, AB. With certain exceptions, recipients of transfusions can only accept blood from matched donors: antibodies in blood serum attack and destroy the red cells from incompatible blood. In 1910–11, the Polish researcher Ludwik Hirszfeld (1884–1954), working in Heidelberg, demonstrated that blood types were inherited. Long before the advent of DNA analysis, blood groups began to be used to exclude fathers in cases of disputed paternity, or as evidence in criminal cases to help identify victims or suspects.
Blood groups offered a route to studying inheritance in any human being. One of R A Fisher’s first moves on taking over as the Galton Professor at UCL in 1933 was to establish a blood group unit. While one of its aims was simply to document the variety of human blood groups – many more blood group systems had been discovered after the original ABO system, including the Rhesus types – the existence of all this variety, available at the prick of a needle, provided a golden opportunity to look for linkage in humans and to map blood groups to chromosomes.
Robert Race (1907–1984) started work for Fisher at UCL in 1937. During World War II, Race and his research unit moved to Cambridge; in 1946, he returned to the Lister Institute in London to establish what was to become known as the Medical Research Council’s Blood Group Unit. With his colleague (and second wife) Ruth Sanger (1918–2001), Race located a newly discovered blood group antigen to the X chromosome: they subsequently mapped several linked genes to locations on this chromosome, pioneering genetic mapping in humans.
London was also home to the International Blood Group Reference Laboratory, where Arthur Mourant (1904–1994) collected blood samples during the 1950s and 1960s to document the worldwide distribution of blood groups. His work became a unique reference for studies of human population genetics and evolution, providing strong evidence of our shared inheritance: all human populations, and even some monkeys and apes, share the same blood group systems. Mourant argued that the distribution of blood types bore little relation to existing ‘racial’ classifications based on skin colour or facial features, undermining anthropological views on the existence of distinct human races.
Blood types helped to show the genetic similarity of every human being, but they were also part of the story of what makes every individual unique. This story grew with the discovery that although a skin graft taken from somewhere on a patient’s body could be helpful in treating burns, grafts taken from other individuals – even related donors – would always be rejected eventually. Peter Medawar (1915–1987), who joined the staff at UCL in 1951, devoted much of his career to studying what he called ‘the uniqueness of the individual’. Working with laboratory mice or rabbits, he established that graft rejection was an immune response: the graft provoked the body to produce immune cells, themselves generating antibodies to genetically dissimilar tissue. His research paved the way for the development of drugs to suppress the immune response, which made it possible for the first time to transplant kidneys, hearts and other organs.
P B Medawar, Nobel lecture 1960
The state of tolerance is specific in the sense that it will discriminate between one individual and another, for an animal made tolerant of grafts from one individual will not accept grafts from a second individual unrelated to the first; but it will not discriminate between one tissue and another from the same donor.
Molecular biology: unlocking the gene
Although the first half of the 20th century saw major advances in the understanding of inheritance, it remained a mystery how the characteristics of individuals were physically transmitted from one generation to the next. Genes were apparently attached to chromosomes, but what were they made of? An instruction book that some claimed could specify everything from eye colour to musical aptitude must surely be written in a suitably complex chemical language. Proteins, already known to exist in complex, three-dimensional forms, seemed for a while the most likely candidates.
Friedrich Miescher (1844–1895) at the University of Tübingen in Germany had discovered ‘nuclein’ in the 19th century, extracting it from white cells removed from pus-soaked bandages in his local septicaemia hospital. Later scientists distinguished two forms of this material, now known as nucleic acid: deoxyribonucleic acid (DNA) and ribonucleic acid (RNA). Although the nucleic acids made up a sizeable proportion of the cell nucleus, with only four subunits (compared with 20 for proteins) they seemed too simple to encode a living being.
Many insights into the biology of the gene came not from studies on humans, or even animals, but from studies on microorganisms such as bacteria and viruses. In 1928 the British bacteriologist Fred Griffith (1879–1941) reported that harmless bacteria could be ‘transformed’ into a deadly form if injected at the same time as dead bacteria of a virulent strain. He inferred that a ‘transforming principle’ had been transferred from the dead bacterial cells to the living. In the USA, Oswald Avery (1877–1955) and his colleagues repeated Griffith’s experiments and showed that the transforming principle was made of DNA.
Soon afterwards Alfred Hershey (1908–1997) and Martha Chase (1927–2003) at the Cold Spring Harbor Laboratory clinched the evidence that the genetic material was DNA. They were members of a loose alliance of scientists in the USA who studied the genetics of phage, tiny viruses that infect bacteria. Phage reproduce by taking over the genetic machinery inside the host bacterial cell. Using a kitchen blender to separate the cytoplasm of the infected bacteria from their cell walls, Hershey and Chase showed that DNA from the phage ended up inside the bacterium, whereas the phage protein was stuck to the outer membrane. That was enough to convince most members of the phage group, including a young postdoctoral researcher from Chicago called James Watson (b. 1928), that DNA was the molecule of inheritance. But how did it work?
From the 1930s onwards, the Cambridge physicist John Desmond Bernal (1901–1971) encouraged his young collaborators to use X-ray crystallography to find the structures of complex biological molecules such as proteins. It was an important principle for him and his colleagues that the three-dimensional structures of molecules would reveal something about how they worked. In 1947 Bernal’s former student Max Perutz (1914–2002) founded the MRC Research Unit for the Molecular Structure of Biological Systems within the Cavendish Laboratory at Cambridge. (Since 1962 this unit has flourished on its own site as the MRC Laboratory of Molecular Biology.) One of Perutz’s earliest recruits was Francis Crick (1916–2004), a physicist who had recently switched his attention to the physical basis of life. In 1951 James Watson joined him there from the USA. He persuaded Crick to change his focus from proteins to DNA.
Crick and Watson discussed the problem of the structure of DNA with Maurice Wilkins (1916–2004), who was working on X-ray studies of the molecule at King’s College London. Also at King’s was Rosalind Franklin (1920–1958), who had been given the impression by the head of the Biophysics Laboratory, John Randall (1905–1984), that she was to head the DNA research. This led to an unfortunate stand-off with Wilkins: the two pursued their research independently and with barely any discussion. Franklin produced a series of excellent X-ray photographs of fibres of DNA, which contained much of the evidence needed for a solution of the structure. But disillusioned with King’s, she accepted a job with Bernal at Birkbeck College and did not complete the interpretation of her images. Wilkins, meanwhile, showed one of the best (known as ‘Photograph 51’) to Watson without her knowledge.
DNA is made up of four principal chemical components, or bases: adenine (A), thymine (T), guanine (G) and cytosine (C). In 1950 the American biochemist Erwin Chargaff (1905–2002) had published evidence that in any sample of DNA there was always the same amount of A as T and the same amount of G as C. Watson realised that this might mean the bases on each strand of DNA were paired with their complementary bases on a parallel strand. As their glimpses of Franklin’s images and other data from King’s confirmed the possibility of a double-stranded, helical structure, Watson and Crick were able to construct a model of DNA: the double helix.
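Chargaff's equalities follow directly from complementary pairing: if every A on one strand faces a T on the other, and every G faces a C, then a double-stranded molecule must contain equal amounts of A and T, and of G and C. A minimal sketch (the example sequence is arbitrary):

```python
# Watson-Crick pairing: A with T, G with C. Given one strand, the
# partner strand is fully determined.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand):
    """Return the base-paired partner strand, read in the opposite direction."""
    return "".join(PAIR[base] for base in reversed(strand))

strand = "ATGCCGTA"
partner = complementary_strand(strand)
print(partner)  # 'TACGGCAT'

# In the double-stranded molecule, Chargaff's equalities hold exactly:
double = strand + partner
assert double.count("A") == double.count("T")
assert double.count("G") == double.count("C")
```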
Published in Nature without much fanfare in April 1953, their discovery has since been hailed as one of the most significant of the 20th century. While Franklin published an important paper in support of the structure to coincide with its announcement, she was never to know that Watson and Crick had used critical insights from her work in building their model, for which they won a Nobel Prize in 1962. She died of ovarian cancer in 1958, ten years before the publication of Watson’s gossipy, first-person account of the discovery, 'The Double Helix'.
James Watson and Francis Crick, 1953
It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.
With its twin strands of paired bases, the double helix revealed the possibility of a copying mechanism that would account for the inheritance of parental characteristics. Untwined and unzipped, each strand could construct a copy of itself simply by attaching new bases to their respective partners. The problem of the ‘simplicity’ of DNA was also solved. The four bases of DNA, like four colours of beads on a necklace, could be arranged along the length of the molecule in an almost infinite number of combinations. The next challenge was to work out how this four-letter language encoded the instructions to assemble proteins from the 20 available amino acids. Crick and his colleague Sydney Brenner (b. 1927) proposed that a sequence of just three bases encoded each amino acid. The work of others, including Marshall Nirenberg (1927–2010) and Har Gobind Khorana (1922–2011) in the USA, eventually enabled them to crack this three-letter code. The gene turned out to be a message consisting of three-letter words written in the chemical language of DNA.
Another challenge facing Crick, Watson, Brenner and their contemporaries was to understand how the cell transcribed the genetic message into a form that could be read by the protein manufacturing machinery and how that machinery translated it into protein. The answer turned out to be that an RNA copy of the DNA sequence carries the message out of the nucleus, providing a template in the cytoplasm for the sequential construction of the protein chain. Research continues to this day to understand how this process is controlled: cells contain all the instructions to make every protein in the body, but in practice make only those that they need to carry out their specialist roles. Pancreatic cells make insulin, for example, and bone cells do not.
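The flow described above, from DNA through an RNA copy to a chain of amino acids, can be sketched with a toy codon table. Only four of the 64 real codons are included here; the sequence is invented for illustration.

```python
# A tiny fragment of the genetic code, for illustration only.
CODON_TABLE = {
    "AUG": "Met",  # methionine; also the 'start' signal
    "UUU": "Phe",  # phenylalanine
    "GGC": "Gly",  # glycine
    "UAA": None,   # 'stop' codon: end of the protein chain
}

def transcribe(dna):
    """RNA copy of the coding DNA strand: thymine (T) is replaced by uracil (U)."""
    return dna.replace("T", "U")

def translate(rna):
    """Read three bases at a time, appending one amino acid per codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino_acid = CODON_TABLE[rna[i:i + 3]]
        if amino_acid is None:  # stop codon reached
            break
        protein.append(amino_acid)
    return "-".join(protein)

rna = transcribe("ATGTTTGGCTAA")
print(rna)             # 'AUGUUUGGCUAA'
print(translate(rna))  # 'Met-Phe-Gly'
```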
Chemists could disassemble proteins and nucleic acids into their component parts and measure how much of each they contained, but in so doing they would destroy most of the essential information in the molecules, which lay in their sequences. In the early 1950s Fred Sanger (1918–2013), working in the Biochemistry Department at Cambridge University, developed a technique that made it possible to read the sequence of amino acids in a protein chain. He published the sequences of the two chains that make up the hormone insulin.
In 1962, Sanger joined Crick, Brenner and others on the staff of the MRC Laboratory of Molecular Biology. Here Sanger turned his attention to finding a method of reading the sequence of DNA bases. Using an ingenious combination of biochemistry and standard laboratory analysis, plus enormous reserves of patience, he succeeded. In 1977 Sanger published the DNA sequence of the bacteriophage ΦX174, using a method he had developed with Alan Coulson (b. 1947) two years previously. By then, however, they had already moved on to a new and much faster technique, now known as the ‘Sanger method’. For this work Sanger received the unique distinction of a second Nobel Prize for Chemistry. Within a few years Sanger’s team published genetic sequences for DNA from human mitochondria, the energy factories of the cell, and for the more genetically complex bacteriophage lambda.
The genome sequence for phage ΦX174 consisted of about 5000 bases. Those for mitochondria and for the lambda phage consisted of around 16 500 and 50 000 pairs of bases, respectively. The whole human genome contains around 3 billion base pairs. Sequencing the human genome therefore represented a huge technical challenge, yet by the late 1980s a plan was forming to set up a large-scale, international effort to read the sequence in its entirety. Since 1973 geneticists had met at a series of regular international workshops to map human chromosomes, using the evidence from linkage studies to assign the genes for genetic disorders or variations such as blood type to specific chromosomal locations. In 1989 Victor McKusick (1921–2008) in the USA and Walter Bodmer (b. 1936) and Sydney Brenner in the UK formed the Human Genome Organisation (HUGO). HUGO continued this work through a series of single-chromosome mapping workshops.
Mapping was an important complement to the sequencing effort, but its role was to identify widely spaced landmarks in the genome, not to read every letter along the way. James Watson, by then head of the Cold Spring Harbor Laboratory in Long Island, took the helm of the Office of Genome Research at the US National Institutes of Health. With funding from NIH and the US Department of Energy, large-scale genome sequencing facilities opened at several sites in the USA. With Watson’s encouragement, the Wellcome Trust set up the Sanger Centre (now the Sanger Institute) in Cambridge, UK, in 1993, with John Sulston (b. 1942) at its head. From the early 1990s, massively automated processes simultaneously reduced the cost and greatly increased the speed of sequencing, although the machines continued to employ the basic chemistry of Fred Sanger’s method.
At the first of a series of international strategy meetings, held in Bermuda in February 1996, participants in the Human Genome Project agreed that the information in the genome sequence should remain in the public domain and that it should be released as soon as possible. In 1998 one of the consortium members, Craig Venter, set up a private company called Celera Genomics with the aim of completing the sequence more rapidly and making it available through commercial licensing agreements. His action galvanised the publicly funded sequencing efforts on both sides of the Atlantic to produce draft-quality sequence much more quickly. In a politically stage-managed event, US President Bill Clinton and British Prime Minister Tony Blair announced in June 2000 that the public and private efforts had both completed the task simultaneously, although neither had produced a high-quality product. The publicly funded project officially announced that it had completed the sequence to a high level of accuracy in March 2003, coinciding with the 50th anniversary of the discovery of the double helix: the Sanger Institute contributed one-third of the total sequence.
Genes and medicine: inherited disease
One of the justifications for the hugely expensive Human Genome Project was that it would speed up the identification of individual variations in the genetic sequence that might be linked to disease. Such ‘markers’ could be used to aid diagnosis but, more importantly, could act as signposts to the gene or genes that were misfiring and so causing the disease. Closer study of such faulty genes, exploring their impact on the expression of proteins in the cell, could shed light on the disease process and perhaps make it possible to design drugs to ameliorate its effects. At the same time, a better understanding of the variation between individuals could explain variable responses to commonly used drugs for conditions such as heart disease and bring in an era of ‘personalised medicine’.
Evidence that medical conditions could be inherited had been accumulating throughout the 20th century. In 1902 the London physician Archibald Garrod (1857–1936) described a condition called alkaptonuria that he observed in some families in which the parents were closely related. The condition caused urine to darken on exposure to light, and Garrod deduced that it was a recessively inherited metabolic disorder. He described further ‘inborn errors of metabolism’, including albinism, in a classic book of 1909.
The founding director of the Galton Laboratory at UCL, Karl Pearson, initiated the first systematic attempt to document inherited medical conditions in humans. His colleague Julia Bell (1879–1979), whom he encouraged to train in both mathematics and medicine, was the main author of 'The Treasury of Human Inheritance', published in serial form between 1911 and 1958. Bell and J B S Haldane worked together on conditions that could be mapped to the X chromosome. In the 1930s microscopical techniques were not available to visualise the X chromosome directly; however, because men have only one X chromosome while women have two, recessive conditions such as haemophilia that are linked to the X chromosome appear predominantly in males.
By relentlessly pursuing every haemophiliac they could find and contacting their relations, Bell and Haldane discovered that the genes for colour blindness and haemophilia were inherited close together on the X chromosome. While Haldane acknowledged that the more serious of these conditions, haemophilia, was easy enough to recognise on its own, he was keen to establish the principle that linkage was useful: something as innocuous as blood type might be linked to a serious disease of later life, such as Huntington’s disease, and so provide a route to early diagnosis or genetic counselling. At around the same time, Haldane’s colleague Lionel Penrose (1898–1972) made a detailed study of the families of people in institutions for those with (in the language of the time) ‘mental defects’, distinguishing between conditions that seemed to have a genetic cause and those that did not.
In the second half of the 20th century, two advances began to make it possible to relate purely genetic studies – which looked at the appearance of different characteristics in related individuals and made inferences about the location of the genes – to physical identification of the genes themselves. One was the ability to distinguish human chromosomes under the microscope. Not until the 1950s did researchers arrive at tissue preparation methods that allowed them to count human chromosomes accurately: the first well-documented evidence that there were 46 was published in 1956. Five years later, scientists in Paris discovered the extra chromosome in Down’s syndrome. The other key advance was the recognition of DNA as the genetic material and the discovery of its structure, which led to a whole range of technologies that made it possible to physically locate specific sequences on the chromosomes.
At Johns Hopkins University in Baltimore, USA, Victor McKusick set up the first medical genetics service to examine patients with inherited conditions in the late 1950s. As biochemical or genetic tests became available, prospective parents could be offered screening to avoid the possibility of giving birth to a child with a fatal recessive condition. A classic example is Tay-Sachs disease, a fatal inherited brain condition that is carried by many Ashkenazi Jews in the USA. Since the 1970s, screening in Jewish communities has warned young couples before they marry if they are at risk of giving birth to a baby with the condition; this and other prevention programmes have been so effective that Tay-Sachs is now all but eliminated from the Jewish population.
Victor McKusick, from the foreword to Emery and Rimoin’s 'Principles of Medical Genetics', 1983
[Medical genetics] holds particular fascination because it involves the most fundamental and pervasive aspects of our own species… To have combined with this intellectual and anthropocentric fascination the opportunity to contribute to human welfare and to be of service to families and individuals…is a privilege.
The painstaking efforts to identify the physical location of disease genes on human chromosomes began to bear fruit from 1989, when Francis Collins (b. 1950) and Lap-Chee Tsui (b. 1950) located the genetic defect that causes cystic fibrosis. Another landmark was the discovery of the mutation that causes Huntington’s disease in 1993. This was the work of a large collaboration including Nancy Wexler (b. 1945) of the Hereditary Disease Foundation, whose mother, grandfather and uncles had died from the disease. Wexler collected samples from a community near Lake Maracaibo in Venezuela that included the largest known pedigree of Huntington’s patients. Many – perhaps most – of the comparatively rare, single-gene mutations that cause such conditions have now been discovered, although the promise of cures is yet to be fulfilled.
The new challenge is to understand the multiple genetic variables that make us more or less susceptible to common conditions such as asthma, heart disease or diabetes. The initial results from ‘genome-wide association studies’, which compare the pattern of relatively common markers in the genomes of people with or without such a condition, have confirmed that the causes of these diseases are dauntingly complex. While genetic studies that trace conditions through family pedigrees show strong evidence of an inherited component, the genes that have so far emerged from these studies mostly have small effects. New research efforts, such as the 1000 Genomes Project, aim to sequence large numbers of human genomes in their entirety to find rarer mutations that might have more powerful effects.
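The comparison at the heart of a genome-wide association study can be illustrated with a toy calculation: count the two variants of a single marker in affected and unaffected groups, then test whether the frequencies differ by more than chance would allow. The counts below are invented for illustration.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Allele counts at one marker:   risk allele, other allele
cases    = (240, 160)   # 400 alleles from people with the condition
controls = (200, 200)   # 400 alleles from unaffected people

stat = chi_square_2x2(cases[0], cases[1], controls[0], controls[1])
print(round(stat, 2))  # 8.08 -- suggestive for a single marker, but real
                       # studies demand far stricter thresholds because
                       # millions of markers are tested at once
```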
Several national cancer genome initiatives, coordinated through the International Cancer Genome Consortium, have also spun off from the Human Genome Project. Cancer is a genetic disease, in which mistakes in the DNA sequence multiply as body cells such as skin cells or liver cells reproduce themselves through division, leading to uncontrolled growth. Research teams in many countries are documenting genetic changes in tumour tissue as part of the effort to identify new routes to treating the disease.
Treatments for genetic disease have not necessarily had to wait for 21st-century genetic technologies. In the 1950s the German doctor Horst Bickel (1918–2000), working in Birmingham, UK, discovered that he could successfully treat an inborn error of metabolism, phenylketonuria, with a severely restricted diet that prevented damaging compounds from building up in affected children’s brains. Since the early 1960s, children in many countries have been screened at birth for the disease so they can begin their therapy at the earliest opportunity. Haemophilia, a genetic condition, can be treated with purified forms of the blood clotting proteins that patients are unable to make for themselves. Technologies such as gene therapy or stem cells have long been forecast as treatments that might correct the genetic defects that cause disease, but these therapies are still very much at the experimental stage.
Genes and society: what is possible, and what is right?
The science of genetics has never been separate from society. It informs, and is informed by, human beliefs and social practices and attitudes. Many of the founding fathers of modern genetics were inspired in their work by a desire to improve the ‘race’ – by which they usually meant white Anglo-Saxons – by eliminating undesirable characteristics from the population. Today, disability is firmly part of the equal opportunities agenda. Yet at the same time, in vitro fertilisation (IVF) and preimplantation diagnosis offer parents the chance to select a healthy baby over one at risk of serious genetic disease. Recombinant DNA technology, or ‘genetic engineering’, is widely used in biomedicine and agriculture, yet there remain widespread concerns about ‘playing God’ or acting ‘against Nature’. As with most areas of scientific discovery, genetics throws up ethical dilemmas that must be debated by society at large.
Around the turn of the 19th and 20th centuries, many intellectuals sincerely believed that the desperate conditions of the poor – hunger, disease, over-large families – were due to their own genetic imperfections rather than their economic circumstances. At the same time, they thought that those with disabling inherited conditions, from any stratum of society, were ‘better unborn’. These beliefs gave rise to the eugenics movement (a term coined by Francis Galton). Its initial aim was to persuade those deemed to be physically and mentally ‘fit’ to have more children and those deemed ‘unfit’ to have few or none.
“Eugenics is the science which deals with all influences which improve the inborn qualities of a race; also with those which develop them to the utmost advantage.” – Sir Francis Galton
In Britain, Galton helped to form the Eugenics Society in 1907 and became its first president. In the same year the state of Indiana in the USA passed a law permitting the coercive sterilisation of people with certain conditions, and a further 26 states followed: people were sterilised in prisons and mental institutions not only for learning disabilities but also for conditions including epilepsy and deafness. In 1910 the zoologist Charles Davenport (1866–1944) established the Eugenics Record Office at Cold Spring Harbor and became an international advocate of eugenics. European countries including Germany and Sweden also adopted sterilisation policies. However, despite the high profile of its Eugenics Society, Britain never passed laws providing for compulsory sterilisation on eugenic grounds.
From 1930, the Eugenics Society shifted its philosophy in the direction of promoting access to birth control. It campaigned for legislation to allow voluntary sterilisation and had close connections with the National Birth Control Association.
Many of the leading figures in genetics and related fields, including Karl Pearson, Ronald Fisher and Peter Medawar, were members of the society. Others, including J B S Haldane and Lionel Penrose, demonstrated mathematically that sterilisation policies to eliminate unfavourable genes, most of which are recessive, were impracticable. In his inaugural lecture as the Galton Professor of Eugenics in 1945, Penrose told his audience that to eliminate just one cause of mental defect, phenylketonuria, would involve sterilising the 1 per cent of the population who were carriers. He went on: “Only a lunatic would advocate such a procedure to prevent the occurrence of a handful of harmless imbeciles.”
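Penrose’s arithmetic follows from the Hardy–Weinberg principle: for a rare recessive condition, unaffected carriers vastly outnumber affected individuals, so ‘eliminating the gene’ means sterilising a large slice of the healthy population. A minimal sketch, assuming an illustrative incidence of 1 in 40,000 births (actual figures for phenylketonuria vary by population and era):

```python
import math

# Hardy-Weinberg sketch: carriers of a rare recessive allele far outnumber
# those affected. The incidence figure is illustrative, not historical.
incidence = 1 / 40_000            # assumed frequency of affected births (q^2)
q = math.sqrt(incidence)          # recessive allele frequency
p = 1 - q                         # dominant allele frequency
carriers = 2 * p * q              # heterozygous, unaffected carriers (2pq)

print(f"affected: 1 in {round(1 / incidence):,} births")
print(f"carriers: {carriers:.1%} of the population")  # roughly 1 person in 100
```

At this incidence the allele frequency q is 1 in 200, yet nearly one person in a hundred carries the allele without ever showing the condition: the population Penrose said would have to be sterilised.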
In the years following World War II, widespread outrage at the murderous eugenic policies of the Nazis in Germany led to a decline in support for eugenic aims (although eugenic practices, including sterilisation, persisted even in some social democratic countries until the 1970s). Attitudes to mental or physical defects also shifted: Lionel Penrose, who probably met more people with learning disabilities than anyone else in the 20th century, insisted that his patients deserved respect. He demanded that the word ‘eugenics’ in his title, his department and its journal be changed to ‘human genetics’. Although problems arising from poverty remained, they were increasingly associated with economic and social causes rather than inheritance. Countries newly independent from their former imperial rulers took their places on the international stage, and the increasingly multicultural societies of the former colonial powers campaigned against discrimination with some success. The idea of a genetically superior ‘race’ largely fell into disfavour, although it has by no means been eliminated.
One element of early 20th-century eugenic policy, birth control, became widely accepted and advocated as a humane method of allowing women to control the sizes of their families and improve their own state of health. The advent of modern genetics made the prenatal diagnosis of genetic conditions possible, with associated counselling services for parents faced with the choice of abortion for fetuses found to have disabling conditions such as Down’s syndrome or muscular dystrophy. Although the legality of such genetic selection has long been established in secular societies, its ethical basis remains controversial among some faith-based and disability rights groups.
In vitro fertilisation now makes it possible for parents to select a healthy embryo over one with a fatal or disabling genetic defect. In countries where children of one sex are more highly valued, differential birth rates show that couples practise selection on the grounds of sex even though it may not be legal. In principle it would be possible to select on the basis of eye colour, and at some point in the future it might be possible to select characteristics such as height or intelligence. Although some see the advent of ‘designer babies’ as a step too far, others regard it as a natural extension of choice in human affairs.
DNA technology has transformed many other areas of society. Alec Jeffreys (b. 1950) at the University of Leicester discovered DNA fingerprinting in 1984; within a few years police had used it to identify a murderer and, more importantly, to release an innocent man who had been arrested for the crime. Now many countries have national DNA databases holding the records of convicted individuals, and in some countries those merely arrested, but arguments continue about whether and for how long these records should be kept if suspects are subsequently found not guilty.
New technologies based on DNA have not always been seen as beneficial. In 1974, recognising a wide level of public concern about the possible risks to human health, human evolution or the environment in the unfettered growth of genetic manipulation, the American biochemist Paul Berg (b. 1926) and others called for a voluntary moratorium on such research. The moratorium was swiftly followed by a conference, held at Asilomar in California in 1975, which after vigorous debates established a set of guidelines to allow such research to continue. Since then recombinant DNA, or ‘genetic engineering’, has become a standard technique in biological research and has been crucial in producing a wide range of therapeutic treatments, such as human insulin for diabetic patients.
Genetic engineering has also spawned a global trade in the production and distribution of genetically modified agricultural products, such as soya and cotton; however, this remains sufficiently controversial in the European Union that few such crops are licensed for production, and only trial plantings (often destroyed by anti-GM activists) have taken place there. The issue is complex, and – were it even available – scientific evidence that the crops are higher yielding, are drought resistant or need fewer chemical pesticides could not adequately resolve the many questions raised by opponents. Arguments include the occurrence of previous food safety scares such as ‘mad cow disease’, objections to global food production being concentrated in the hands of a small number of agrochemical corporations, and the claims made on health and environmental grounds for organic crops, which might be contaminated by pollen from GM varieties.
While all research using genetic modification is strictly controlled, it continues to throw up thorny dilemmas. Bird flu emerged in Hong Kong in 1997 and spread mainly in South-east Asia. More than half of those infected died of the disease. All the people who were infected caught the virus by handling poultry, but there is a real risk that the virus could develop mutations that make it transmissible between humans. In 2012, researchers in The Netherlands, the USA and Japan published papers showing how they had modified the virus genome to have just this effect (testing it on ferrets, which are often used as human substitutes in such studies). While some researchers argue that this kind of research is necessary to prevent future epidemics, others fear that making the information public could be a gift to terrorists looking for a deadly biological weapon.
The cultural impact of advances in biology has been profound. As an undergraduate, J B S Haldane drafted an extraordinarily prophetic essay, eventually published in 1924 as 'Daedalus, or Science and the Future'. Purporting to look back from the 21st century, it predicted the development of in vitro fertilisation or ‘ectogenesis’. It made the bold claim that biology, rather than psychology or ethics, would henceforth be the most powerful force in reshaping society. Haldane’s book was an obvious influence on the novelist Aldous Huxley, whose 'Brave New World' depicted a society in which embryos were artificially gestated and modified to provide different castes of worker. A similar premise underlies Margaret Atwood’s dystopian novel 'Oryx and Crake'.
Generations of writers in print, broadcast and electronic media have made it their mission to explain the significance of genetic discoveries to the public at large. James Watson’s 'The Double Helix' (1968) and Richard Dawkins’s 'The Selfish Gene' (1976) have become classics, still in print today. Ideas of replication, mutation and the ties between generations have inspired choreographers, composers and playwrights. No TV cop show is complete without a revelation based on forensic DNA analysis. Film has seized on the possibilities of genetic engineering and cloning to explore the age-old theme of hubris: examples include 'The Boys from Brazil' (1978) and 'Jurassic Park' (1993). 'Gattaca' (1997), meanwhile, plays on fears of a surveillance society in which every transaction requires genetic identification.
The DNA molecule itself has become a cultural icon. Its twisted ladder is represented widely in sculpture, architecture and graphic art: Salvador Dalí incorporated it into several of his paintings. It has even become a cliché in the language of corporate communications: ‘sustainability is part of our DNA’, claims the website of a well-known manufacturer of yoghurt. Such genetic metaphors have become a ‘meme’ – a word coined by Richard Dawkins to denote a unit of cultural transmission such as a fashion or a piece of slang, by direct analogy with self-replicating units of genetic inheritance, or genes.
The concept of the gene, developed through the 20th century, has launched a revolution in biology that will continue to challenge the human imagination for years to come.