In a world where people want certainty, the provisional nature of scientific knowledge can be worrying. Non-scientists want to be reassured that something is safe, rather than given nuanced advice about relative risk. On the other hand, perhaps people find this lack of dogmatism welcome; scientists regularly rank high on lists of trusted professions, unless, that is, they are employed by government or industry.
Despite this intrinsic uncertainty, we talk rather loosely about the ‘proof’ of a particular hypothesis. We have seen recently that gravitational waves, predicted by Einstein as a consequence of general relativity, have finally been shown to exist, detected by an exquisitely sensitive interferometer designed specifically for the purpose. However, this in itself is not proof of the theory of general relativity.
Such a confirmation of the existence of something previously hypothesised – the Higgs boson also comes to mind as a recent discovery – is proof only of that particular prediction. More generally, such confirmation supports a hypothesis, but the hypothesis itself (which may now be relabelled a theory because of the weight of supporting evidence) is not necessarily the whole story.
Proof of the existence of a phenomenon is an important part of the jigsaw, but not the same thing as having all the pieces of the puzzle in place. In fact, if we extend that analogy, a particular area of study is like a never-finished jigsaw. As evidence is gathered, more pieces fit into place, but sometimes a new piece is discovered that shows the previous work to be incorrect, and the pieces have to be put together in a different combination to give a different picture.
But to move away from awkward metaphors, the point remains that rather few things in science can be considered absolutely cut and dried. The important thing is to recognise that knowledge builds and creates new understandings. Take our understanding of the physical basis of life. When Crick and Watson famously claimed to have discovered its secret a little over 60 years ago, they were in fact contributing only a foundation stone: one that gave an understanding of the physical basis of heredity and allowed the blossoming of genomics and the various biotechnologies that have flowed from it.
At the same time, the discovery of the structure of DNA and the later deciphering of genomes gave strong support to the theory of evolution, by showing the high degree of conservation of genes across the animal and plant kingdoms. But it also led to a puzzle: although individual genes could be identified and their roles understood, large parts of an organism’s genome (80% or more in the case of humans, for example) were found to be composed of DNA that does not code for proteins, which has therefore been dubbed ‘junk DNA’.
It is reasonable to expect parts of the genome to have become redundant as a species has evolved, but scientists have been scratching their heads over the sheer size of these apparently useless sequences since they came to light. Now, in fact, some researchers believe that much of this apparently surplus genetic material is not junk at all, but plays more subtle roles in, for example, controlling the expression of the protein-coding genes themselves.
The fact that an understanding of genes alone does not give the whole picture is reinforced by a growing knowledge of epigenetics, in particular the pattern of methylation of DNA, which plays a crucial role in the expression of individual genes. What is particularly interesting is that this methylation pattern is partly determined by environmental factors and exposure to stress. More surprising still, these induced changes in the pattern of expression are heritable, creating a form of micro-evolution across generations. The influence this has on character and behaviour in particular remains a matter for research.
Just as the knowledge base is evolving in such arcane areas as cosmology and genetics, so it is in more mundane areas affecting our everyday lives. This is particularly true for nutrition, where the current official advice on diet is being questioned. Despite the well-established guidelines to reduce consumption of fat (saturated fat in particular), cut down on salt and consume a balance of protein and carbohydrate, levels of obesity, type 2 diabetes and cardiovascular disease have risen steadily in recent decades. Either people are not following the guidelines or the recommendations themselves are not right.
Nutritional advice does change. Within the lifetime of many of us, the blame for weight gain has shifted from carbohydrates to fat, and polyunsaturated fats, once believed to be a key protective factor against heart disease, have given way to mono-unsaturates from olive and rapeseed oils. Some researchers are now suggesting that we should all be eating more fat because of its greater satiety effect. Certainly, anecdotal evidence suggests this may work for some overweight diabetics.
In the meantime, the official advice is defended on the basis of the body of evidence that supports it, although the UK government has now started to review this in light of the claims made by the high-fat brigade. What the outcome will be we cannot tell, but the consequences for human health may be very significant, at least among those who follow dietary advice. It is ironic that our species is, in health terms, the victim of its own success; our minds and bodies are seemingly unable to deal with the constant availability of food.
The fact that nutritionists are at least willing to look again at the evidence is an example of how science should progress, although some of those deeply wedded to the current orthodoxy may take some persuading to change their minds if interpretation of the evidence also changes. Perhaps there are some important lessons here for the climate change community as well.
Martin Livermore
The Scientific Alliance
St John’s Innovation Centre
Cowley Road
Cambridge CB4 0WS