10 Great Turning Points in Medical History


What do you think is the greatest achievement of humanity?


Perhaps you’d choose an engineering marvel; perhaps you’d pick putting a man on the Moon. Perhaps you’d go for the internet, enabling us to communicate across the world. But one achievement that should definitely be up there is the development of modern medicine. In nineteenth century Germany, only 50% of children lived to the age of five, and it was much the same across Europe. By the 1960s, 18% of children worldwide died before the age of five. Today, the figure is below 5%; that’s still too high, but consider that for most of human history, you wouldn’t, on average, have lived to the age you are today. A huge range of things have contributed to this, but one of the most significant is advances in medicine over the centuries.
There are so many things that happen to us today that we are unafraid of, but that could have proven fatal in earlier centuries. A nasty rash; a bout of flu; a broken arm; a healthy woman’s pregnancy; a tooth infection – all of these could have been killers at many points in the past. Dying of them now would be an unexpected tragedy. In this article, we take a look at ten of the advances in medicine that took us to where we are today.

1. Understanding the circulatory system (1242-1628)

It took several centuries for circulation to be understood.

Cultural distaste at the thought of dissecting dead bodies held back medicine for centuries, especially in Christian countries during medieval times, as this prevented doctors from gaining a full understanding of anatomy. The purpose of the heart, the liver, the lungs and of blood itself remained a mystery, limiting the kind of operations surgeons could carry out, and encouraging mistaken beliefs. For instance, a widespread medieval belief was that some diseases, as well as personality traits, were caused by an imbalance in a person’s four humours – yellow bile, black bile, phlegm and blood – and adjusting this balance could help to cure them.
In 1242, the Arab physician Ibn al-Nafis published a reasonably accurate, though incomplete, description of the circulatory system. As he was based in Egypt, this knowledge did not travel into Western medicine. Only in 1628 was a complete explanation of the circulatory system published, by the Englishman William Harvey. He understood how the heartbeat provided a continuous flow of blood through the body to the extremities – thereby allowing safer surgeries to take place and disproving many mistaken ideas about the workings of internal organs.

2. The development of vaccination (1798)

Cowpox, anyone?

The introduction of the measles vaccine alone is estimated to have averted one in five of all childhood deaths in the UK. Mumps and whooping cough similarly used to kill or severely injure significant numbers of children in the UK, but since the widespread introduction of vaccination, these diseases have become rare. And that’s to say nothing of smallpox, which has killed more people than almost any other disease in human history, and which now only exists in a small number of laboratories worldwide.
The idea that you could build up immunity to a disease by catching a milder version of it and recovering had long been known, even if the mechanism was not understood. Variolation, a rudimentary form of vaccination, had been practised for centuries in India and Turkey. But in the late 1700s, the physician Edward Jenner decided to investigate a well-known observation: that milkmaids did not catch smallpox. He hypothesised that catching the much less dangerous disease cowpox granted immunity to smallpox – and exposed his gardener’s son to pus from an infected cowpox blister to test it. The boy had a mild fever but recovered quickly – and when subsequently exposed to smallpox, did not catch the disease. The principle evolved so that instead of infecting people with an active disease, modern vaccines allow the body to develop an immune response while being exposed only to a weakened or inactivated form of the pathogen – saving millions of lives.

 

3. The development of anaesthetic (1842)

Anaesthetic made surgery safe and painless.

Before the development of anaesthetic, there was no way of rendering someone unconscious that was in any way safe (and modern anaesthetic still carries risks). If you wanted to carry out surgery on someone, you either had to hope they would pass out from the pain, or operate on them while they were conscious. A conscious patient cannot be relied upon to hold still like an unconscious one, making the job of the surgeon hugely more difficult. While there were natural sedatives that could be used, such as opium, they carried their own risks.
From the late 18th century onward, physicians began to explore the possibility of using nitrous oxide or diethyl ether for anaesthetic purposes, and in 1842, Crawford W. Long removed tumours from a patient called James Venable while Venable was under ether anaesthesia. In 1846, another physician, William T. G. Morton, demonstrated the same technique in public and published his experiences. The principle of anaesthetic had been proven, and it has been refined ever since, making surgery steadily safer.

 

4. Advances in hygiene (1847)

It’s amazing the difference soap can make.

It may seem remarkable that we developed something as complex as vaccination decades before we established the principle that doctors should wash their hands. But at this time the causes of disease were not yet understood; we merely knew some ways to treat or prevent particular diseases, not the mechanisms by which they worked.
The development of medical hygiene was similarly a case of acting on the basis of observation. Hungarian doctor Ignaz Semmelweis worked in the maternity clinic of Vienna General Hospital. He noticed that between the two maternity wards – one staffed by female midwives, one by male doctors and medical students – the maternal mortality rate was much higher in the one staffed by the men.
Semmelweis tested a series of variables, eventually landing on the fact that the male doctors and students carried out autopsies, while the midwives did not. He concluded that the men were carrying infectious material from the corpses on their hands when they came to the maternity clinic, and ordered that they wash their hands in a chlorine solution after completing autopsies. The maternal mortality rate plummeted – though it took many years before proper hand-washing became standard practice everywhere.

 

5. The development of germ theory (1860s)

Pathogens were responsible for diseases like cholera.

Semmelweis had developed a theory of hygiene, but he believed that hand-washing worked because it removed the foul-smelling traces of the corpses from doctors’ hands. At the same time as he was working in Vienna, however, scientists across Europe were beginning to develop a better understanding.
London doctor John Snow formulated the theory that cholera was spread by contaminated water rather than bad air, and saved countless lives by applying it to the 1854 cholera outbreak in Broad Street. Louis Pasteur demonstrated in the 1860s that microorganisms could not appear spontaneously in a nutrient broth, but had to come from somewhere. And this understanding came together in the work of Robert Koch, who worked out how to identify and isolate the pathogens responsible for specific diseases, including anthrax, tuberculosis and cholera. From then on, scientists faced with deadly diseases had a much better understanding of what they might be fighting.

6. The discovery of genetic inheritance (1866)

Genes can do some funny things.

The idea that people inherit traits from their parents has been known since prehistory; the human race has been making use of the same principle for the selective breeding of plants and animals for nearly as long. But the mechanism for this wasn’t properly understood until the work of scientist and Augustinian friar Gregor Mendel in the 1860s. Before Mendel, it was generally believed that the traits of the parents were blended together to produce the offspring, although the fact that children could be taller than either of their parents, for instance, indicated to some scientists that this couldn’t be the whole picture.
The monastery where Mendel lived had an experimental garden. It was there that he carried out a series of experiments on pea plants that led him to propose invisible ‘factors’, some dominant and some recessive, that predictably determined various traits of the plant; crossing two plants that each carried one dominant and one recessive factor for a trait, for instance, produced offspring showing the dominant and recessive forms in a ratio of roughly three to one. These factors are what we now call genes, and Mendel’s work helped to unlock one of the other key causes of disease: genetic inheritance.

 

7. The discovery of penicillin (1928)

A mildly disgusting accident, but it treats infection.

It had been known since the late nineteenth century that some moulds had antibacterial properties. But the key step was a happy accident by the scientist Alexander Fleming. A dish containing the bacterium Staphylococcus had accidentally been left open and had become contaminated with mould from a nearby window. Fleming noticed that the mould – a species of Penicillium – was inhibiting the growth of the bacteria, and he isolated the antibacterial substance it produced, naming it penicillin.
The scientific community took little notice for quite some time, but progress was eventually made, and some fifteen years later penicillin was being used in medicine to cure bacterial infections. By 1945, the Oxford-based scientist Dorothy Hodgkin had confirmed the structure of penicillin using X-ray crystallography, work that contributed to her Nobel Prize in Chemistry.

 

 

8. The discovery of the structure of DNA (1953)

The first steps on the ladder of gene therapy.

Our understanding of genetics took a leap forwards with the discovery of the double helix structure of DNA. In the early 1950s, at King’s College London, Rosalind Franklin and Maurice Wilkins had produced X-ray diffraction images of DNA that showed – in two dimensions – a helix shape. In 1953, at Cambridge, those images acted as a springboard for James Watson and Francis Crick to work out how the individual molecules might fit together.
Mendel’s work nearly a century previously had enabled us to understand how genetics might affect our propensity to different diseases. But Crick and Watson’s discovery took us a huge step further, laying the foundations for the cutting edge of modern science. This includes gene therapy, an experimental technique that may one day enable us to cure genetic diseases like cystic fibrosis that are currently incurable.

 

9. The development of organ transplantation (1954)

Transplants are coming on in leaps and bounds.

The concept of transplantation was well established by the 20th century; the first skin graft had been carried out as long ago as 1869, and the first cornea transplant took place in 1905. But the hurdles to transplanting an organ were considerably greater, not least the lack of understanding of what might cause the body to accept or reject a transplanted organ, and the difficulty of keeping an organ alive during the transplantation process. In 1954, the first successful kidney transplant took place between living identical twins, whose identical genes meant that the risk of the body rejecting the organ was much lower.
As the 20th century went on, immunosuppressants that stopped the body’s immune system from rejecting transplanted organs improved, and the milestones came thick and fast: the first heart transplant, the first small intestine transplant, the first transplant of a liver from a baboon to a human, and more recently the first hand transplant. The consequence is that once-fatal organ failure has become increasingly survivable, and the shortage of organs for donation has become the key challenge.

 

10. Awareness of the dangers of smoking (1954)

The dangers of smoking weren’t discovered until relatively recently.

Life expectancy across the globe increased at an incredible rate during the 20th century. But it might have risen faster still had it not been for the growth of smoking. Automated cigarette-making machinery made cigarettes easily affordable at the turn of the 20th century, and advertising helped make smoking fashionable.
There had been debate for some time over whether smoking was healthy or unhealthy, but as uptake of smoking increased, doctors started to see the consequences. In 1950, Richard Doll and Austin Bradford Hill published the first major British study demonstrating the link between smoking and lung cancer. By 1954, the British Doctors Study had confirmed the link, and the British government publicised the findings. Even then, smoking rates remained high – in the 1970s, half of all men in Britain smoked. Yet a determined public health campaign steadily brought the numbers down, so that only around 17% of adults in the UK today are smokers. Campaigns against smoking have saved millions of lives worldwide, and continue to do so.