Preventing medical errors


Bandolier 28 reported on a study from Boston showing that about 2000 adverse drug events (ADEs) occur per hospital each year, an ADE being defined as an injury resulting from medical intervention relating to a drug. The importance of preventable errors has been brought home by surveys showing that they impose major burdens on health services, and may injure or even kill thousands of patients a year. The problem is probably much bigger than hospital-acquired infection: a serious and thoughtful paper [1] quotes one estimate of 180,000 people dying every year in the USA at least in part because of iatrogenic injury. These episodes are also expensive, with an estimated cost in the USA of $4,000 per event.
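A back-of-envelope calculation makes the scale concrete. The short Python sketch below simply multiplies the two figures quoted above; the assumption that the Boston numbers generalise to a typical hospital is ours, not the paper's.

    # Back-of-envelope arithmetic, assuming the Boston figures generalise:
    # about 2000 adverse drug events per hospital per year, at about $4,000 each.
    ades_per_hospital_per_year = 2000
    cost_per_ade_usd = 4_000

    annual_burden_usd = ades_per_hospital_per_year * cost_per_ade_usd
    print(f"Estimated annual ADE cost per hospital: ${annual_burden_usd:,}")
    # -> Estimated annual ADE cost per hospital: $8,000,000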

Systems causes of error


The question this leaves begging is how to prevent what has been called serious medical error. The phrase is not a pejorative one. It is not intended to blame individuals, but rather to complain that healthcare professionals are too often expected to do too much with too few tools. Most errors result from failure to apply basic human-factors principles in the design of tasks and systems. Excessive reliance on memory, lack of standardisation, inadequate availability of information and poor work schedules create situations in which individuals are more likely to make mistakes.

One of the biggest problems is in measuring errors. Leape makes the point that reported rates of medical error vary by a factor of at least 50, depending on the method of detection (Table; the sketch after the table shows the arithmetic). Voluntary self-reporting gives low rates, while chart review and computer screening give much higher ones. The paper examines a variety of factors that contribute to medical error:

Method of detection         Errors (%)
Voluntary self-reporting          0.20
Patient review                    0.70
Computer screening                3.80
Chart review                      6.50
Chart review + computer          10.00
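The 50-fold figure falls straight out of the table: the highest detection rate divided by the lowest. A trivial Python sketch using the rates above:

    # Rates from the table above: percentage of cases in which an error
    # is detected, by method of detection.
    detection_rates = {
        "Voluntary self-reporting": 0.20,
        "Patient review": 0.70,
        "Computer screening": 3.80,
        "Chart review": 6.50,
        "Chart review + computer": 10.00,
    }

    spread = max(detection_rates.values()) / min(detection_rates.values())
    print(f"Spread between highest and lowest rates: {spread:.0f}-fold")
    # -> Spread between highest and lowest rates: 50-fold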

Process design failures result from failure to analyse the purposes of a system and how they can best be achieved, relying instead on the way things have always been done. This is a big effect, perhaps the cause of about half of all errors. An example is over-reliance on memory rather than on computerised systems that can help.

Task design failures result from failure to incorporate human factors. Checklists, protocols and computerised decision aids (again) can help reduce these; standardise and simplify might be the simple message here. An example might be to standardise postoperative analgesia protocols across an institution, building into the system the impossibility of choosing the wrong drug or dosage and making everyone familiar with a single system.
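To illustrate the kind of check a standardised, computerised protocol could enforce, here is a minimal Python sketch; the drug names, dose ranges and check_order function are hypothetical, invented purely for illustration.

    # Hypothetical sketch of a computerised dose check enforcing a single
    # agreed postoperative analgesia protocol. Drug names and dose ranges
    # are invented for illustration, not taken from any real formulary.
    POSTOP_ANALGESIA_PROTOCOL = {
        # drug name: (minimum dose in mg, maximum dose in mg)
        "drug_a": (2.5, 10.0),
        "drug_b": (50.0, 100.0),
    }

    def check_order(drug: str, dose_mg: float) -> None:
        """Reject any order that falls outside the agreed protocol."""
        if drug not in POSTOP_ANALGESIA_PROTOCOL:
            raise ValueError(f"{drug!r} is not in the postoperative protocol")
        low, high = POSTOP_ANALGESIA_PROTOCOL[drug]
        if not low <= dose_mg <= high:
            raise ValueError(
                f"{dose_mg} mg of {drug} is outside the agreed range ({low}-{high} mg)"
            )

    check_order("drug_a", 5.0)  # within range: accepted silently
    try:
        check_order("drug_b", 500.0)
    except ValueError as err:
        print(err)  # -> 500.0 mg of drug_b is outside the agreed range (50.0-100.0 mg)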

Equipment design failures arise because hospitals use a bewildering number of different machines, with so few people fully knowing how to use them. Should it be possible, for instance, to connect an epidural catheter to a syringe containing a drug prepared only for intravenous use?

Organisational and environmental failures come from the overall culture of an organisation: how it deals with quality, training and team building. Put simply, it comes down to one question: does the organisation care? If it doesn't, why should its workers?

The big picture


If this sounds like boring process gobbledegook, it shouldn't. This is a thoughtful exposition of the problem from an author in the Harvard Department of Health Policy and Management who has spent time thinking about it. Healthcare systems are big, complex, often impersonal, and difficult to change. Change comes not from one big idea, so beloved of governments and politicians, but from doing the right things better and continuing to do so. For anyone starting down that long road, this paper is a good place to get the big picture before becoming overwhelmed by the underbrush of detail.

Reference:

  1. LL Leape. A systems analysis approach to medical error. Journal of Evaluation in Clinical Practice 1997; 3: 213-222.