
On knowledge and pigs (Editorial)


Bandolier concentrates on finding evidence in systematic reviews and meta-analyses, in large randomised trials, and in informative observational studies. Each has its place in helping us shape our views on what to do, when, and for whom. Each adds to our knowledge, and sometimes also to our understanding.

Meta-analysis has been under the spotlight of late, with papers and editorials in the NEJM and BMJ. What can we learn from all this? Firstly, anyone wanting a readable exposition of what it all means could do no better than to read David Naylor's balanced and sensible editorial in the BMJ [1]. Then, for those troubled that meta-analysis of small trials may not be the best thing, there is reassurance from a number of reports [2,3] that most of the time (about four times out of five) meta-analyses and large randomised trials addressing the same question give much the same answer.

Randomness and confidence

Most of the arguments about small trials and meta-analysis occur where only a small proportion of the people involved in a study have a particular outcome - small effects in large populations. In trials of magnesium in acute heart attack, for instance, if only about 7% of people die without treatment, then demonstrating a relative risk reduction of 50% means showing only three or four fewer deaths per 100 patients treated. You therefore need large numbers of patients to make sense of small changes.
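
To see how that arithmetic works, here is a minimal sketch in Python. The figures are illustrative assumptions, not taken from the magnesium trials themselves; it simply converts a baseline risk and a relative risk reduction into an absolute risk reduction and a number needed to treat.

    # Illustration only: converting a relative risk reduction on a small
    # baseline risk into absolute terms.
    def absolute_benefit(baseline_risk, relative_risk_reduction):
        """Return (absolute risk reduction, number needed to treat)."""
        treated_risk = baseline_risk * (1 - relative_risk_reduction)
        arr = baseline_risk - treated_risk
        return arr, 1 / arr

    # Assumed figures for the example: 7% baseline mortality, 50% relative reduction.
    arr, nnt = absolute_benefit(0.07, 0.50)
    print(f"Absolute risk reduction: {arr:.1%}")   # 3.5%
    print(f"Number needed to treat:  {nnt:.0f}")   # about 29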

But meta-analysis is also done where there are large effects in small populations. Here the issue may be one of effectiveness, in which case a large number of events may be found with smaller numbers of patients. But it can also be one of avoiding bad things that might be consequences of treatment (oral contraceptives being a good example), and again, because these events happen rarely, many patients need to be studied to get accurate and precise information.

When Bandolier reads these papers, what it finds striking is the width of the confidence intervals around statistical outputs such as odds ratios, both for individual trials and for meta-analyses. That reflection of uncertainty tells us how little high quality information we have in the face of the random effects seen in clinical trials. Clearly, for more accurate estimates with better precision, more data are needed.
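
As an illustration of why those intervals are so wide when events are few, here is a minimal sketch in Python of the usual approximate 95% confidence interval for an odds ratio from a 2x2 table. The counts are invented for the example, not drawn from any of the papers discussed.

    import math

    # Sketch with invented counts: approximate 95% CI for an odds ratio,
    # using the standard error of the log odds ratio.
    def odds_ratio_ci(a, b, c, d):
        """a, b = events/non-events on treatment; c, d = the same on control."""
        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        return odds_ratio, lower, upper

    # Few events give a wide interval: 4/100 deaths on treatment vs 8/100 on control.
    print(odds_ratio_ci(4, 96, 8, 92))   # OR about 0.5, 95% CI roughly 0.14 to 1.6

An interval running from a halving of the odds to a modest increase is compatible with both benefit and harm - which is exactly the uncertainty those wide intervals convey.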

Give me a break

We live in a real world where decisions need to be taken now. So while it is often convenient to say that more research is needed, Bandolier sympathises with those who have to make decisions. The good news is that meta-analysis is continually improving our knowledge about clinical research, how to do it, and how to use the knowledge we gain from it. Anyone forced to sit and read in detail even ten trials on the same topic would soon come up with reasons why some should be kicked to touch.

In praise of observation

Lest you worry that all this is too recherché, think about circumstances in which you would be really confident that observations can provide you with high quality evidence.

Imagine that we tell you that Bandolier has trained a pig to talk. What foolishness, you say. But we bring this talking pig before you and the pig says "Good evening", and proceeds to summarise the day's news for you. Hopefully you would be amazed by this phenomenon, and would not immediately demand a randomised selection of 100 pigs to check it out. The fact that any pig can talk is what is important.

Pidgin & creole

Teasing out how language develops involves observational studies and phenomena like the talking pig - or the wolf child. Pidgin languages arise "when speakers of different languages have to communicate to carry out practical tasks but do not have the opportunity to learn one another's languages" [4]. Bickerton observed that children exposed to a pidgin at the age when they acquire their mother tongue inject grammatical complexity, and the pidgin becomes a creole - rules of grammar seem to be innate.

The problem for us in medicine is that we are often looking for cause and effect. Does a class of oral contraceptives cause thrombosis? The observations may generate a hypothesis, but proof may require more stringent study architecture.

The same problem of observations and causation is described beautifully by Oliver Sacks [5]. He talks of his visit to Guam and describes the endemic "lytico-bodig" disease. This can present as a progressive paralysis (lytico), as parkinsonism (bodig), or as dementia. Flour (fadang) made from cycad seed was a putative cause, but a continuing puzzle of this geographical isolate was that younger people were not affected and the condition may be disappearing (sounds a bit like kuru!).

It's all done by mirrors

A final observation is about alleviating phantom pain. If the phantom pain is of tightly clenched fingers digging into the absent hand, then putting the "good" hand in front of a mirror (so that it looks like the phantom hand) and unclenching the fist can diminish the phantom pain [6]. The brain is tricked by the mirror into treating the good hand as the absent one. Sounds crazy, but it has been seen to work for a grateful patient.

Talking pig evidence rules - sometimes for efficacy, and often for rare and serious adverse events.

References:

  1. Naylor D. Meta-analysis and the meta-epidemiology of clinical research. British Medical Journal 1997; 315: 617-9.
  2. Cappelleri JC, Ioannidis JP, Schmid CH et al. Large trials vs meta-analysis of smaller trials. Journal of the American Medical Association 1996; 276: 1332-8.
  3. LeLorier J, Grégoire G, Benhaddad A et al. Discrepancies between meta-analyses and subsequent large randomized controlled trials. New England Journal of Medicine 1997; 337: 536-42.
  4. Pinker S. The Language Instinct. Penguin.
  5. Sacks O. The Island of the Colour-blind and Cycad Island. Picador.
  6. Ramachandran VS, Rogers-Ramachandran D. Synaesthesia in phantom limbs induced with mirrors. Proceedings of the Royal Society of London B: Biological Sciences 1996; 263: 377-86.

