
Improving safety in medicine: A systems approach

Published online by Cambridge University Press: 02 January 2018

Eileen Munro*
Affiliation:
Department of Social Policy, London School of Economics and Political Science, Houghton Street, London WC2A 2AE, UK. Tel: +44 (0)20 7955 7349; e-mail: E.Munro@lse.ac.uk

Copyright © 2004 The Royal College of Psychiatrists

The statutory inquiries after homicides by people with mental illness have been replaced by a system of mandatory reporting to the newly established National Patient Safety Agency (Department of Health, 2001a: p. 24). This reflects a radical change in the way that adverse events or ‘near misses’ in medicine are to be investigated. Drawing on lessons from engineering on improving safety in aviation and the nuclear power industry, the Department of Health has moved from an individual to a system-centred approach. Whereas the traditional investigation generally stopped when human error was identified, the systems approach takes error as a symptom, not a cause, and asks why it happened, and what were the factors operating on the individual that contributed to the negative outcome (Department of Health, 2000). This approach to investigations could potentially lead to far more constructive solutions than those offered by the current system of inquiries after homicides.

LIMITATIONS OF THE TRADITIONAL APPROACH

Traditional accident investigations tend to look first for any technical fault that explains why things went wrong; failing that, they go on to consider whether human actions or omissions led to the disaster. If human error is identified, this is usually accepted as a satisfactory explanation and the questioning ends. Investigations in a wide range of settings – aviation, anaesthesia, nuclear power plants – have concluded by blaming human error in 60–80% of cases. If accidents are due to human incompetence, then the solution lies in increasing control:

‘To cope with this perceived unreliability of people, the implication is that one should reduce or regiment the human role in managing the potentially hazardous system. In general, this is attempted by enforcing standard practices and work rules, by exiling culprits, by policing of practitioners, and by using automation to shift activity away from people’ (Woods et al, 1994: p. 2).

Some mental health inquiries have been into organisations rather than individual cases, and have taken a more systemic approach, as in the Ashworth Special Hospital investigation (Fallon et al, 1999). Inquiries into homicides, however, have typically been of the traditional type. One study found that human error in relation to clinical or risk assessment was identified in 65% of homicide inquiries (Munro & Rumgay, 2000). The solutions have also followed the traditional model: individuals are named and shamed, more procedures are introduced to reduce the scope for fallible individual judgement, and tighter managerial control is encouraged; yet there is little evidence that the inquiries are producing an accumulating body of knowledge that is improving practice (Eastman, 1996: p. 170).

THE SYSTEMIC APPROACH

The new approach differs in seeking to understand more deeply the reasons people acted in the way they did. Front-line workers are not seen as autonomous decision-makers culpably responsible for their choices; it is assumed that their choice of action is influenced by the system in which they are operating. A distinction is made between latent and active errors (Reason, 1990). Active errors are the mistakes made by front-line practitioners, such as failing to record information or to pass it on to a colleague. Latent error conditions are embedded in the system – for example, in management policy, resource allocations or procedural instructions. These, when combined with local factors and individual errors, feed into a causal sequence that leads to an adverse outcome.

The National Patient Safety Agency recommends a root cause analysis: ‘a structured investigation that aims to identify the true cause of a problem, and the actions necessary to eliminate it’ (Department of Health, 2001b: p. 38). The focus of the analysis should be on systems and processes, not people, and investigators are encouraged to ‘repeatedly ask “why?” for each cause to drill down to the root cause(s)’. The findings should lead to modification of the system, rather than punishment of individuals. For example, in mental health care, staffing levels, bed availability and the priorities set by managers trying to meet government targets may act upon the professionals so that they often fail to make thorough risk assessments. Many of these incomplete assessments are reasonably accurate and no harm ensues, but occasionally a highly significant detail may be overlooked that, had it been known, would have substantially increased the assessment of risk. If a homicide then occurs, with hindsight it will be clear that the failure of a member of staff to ask relatives about the patient's state of mind (for example) played a crucial part in the faulty risk assessment. The traditional inquiry would then rebuke the professional responsible for the omission and remind all staff of the importance of speaking to relatives. The new approach, in contrast, would ask whether incomplete assessments were commonplace and, if so, why staff were choosing to cut corners. Woods et al (1994: p. 48) list three classes of factors to consider.

  (a) Knowledge factors: do staff know the importance of relatives’ information, and do they have the skills to interview them competently?

  (b) Attentional dynamics: where was professional attention at the time? What was seen as a higher priority?

  (c) Strategic factors: the trade-offs among conflicting goals. These are particularly important in mental health work, where professionals have to balance many competing demands: the rights and needs of the patient v. the safety of others; the demands of paperwork v. time with patients; the needs of many patients, all of whom have serious problems.

This theoretical framework requires a substantially different way of conducting error investigations, since it includes factors such as organisational culture, management, cognition and emotions.

CREATING A REPORTING AND LEARNING CULTURE

The first, most important – and probably the most difficult – development is to create an open, non-punitive culture in which people are willing to report mistakes and near misses and provide the basic data for the learning process:

‘The core of the new adverse event system will be reports made by [National Health Service] staff. The success will depend on creating an open culture within all NHS organisations where staff feel that they can draw attention to errors or mistakes (so that learning can take place) without fear of disciplinary action’ (Department of Health, 2001a: p. 27).

Creating such a cultural change is a major task. Individuals, understandably, fear that reports of lapses in their work may be used against them. It is a radical move to learn to celebrate errors as an occasion for learning, and the organisation has to devise ways of rewarding people for their honesty.

For the National Health Service, there is the additional problem that it cannot create a completely ‘no blame’ reporting system. Individuals still have legal responsibility for their professional actions and cannot be given immunity from legal action by service users. Work needs to be done to clarify the criteria for reports that can be guaranteed confidentiality. Mental health services also face the problem of dealing with the intense public anxiety about violent patients. The mandatory inquiries after homicide were instigated as a response to this pressure. The new reporting system will need to be explained well to the public if it is not to be misinterpreted as a way of letting professionals avoid responsibility for poor practice.

CAUSES AND SOLUTIONS

In talking of a ‘true’ cause and identifying it as the focus for change, the National Patient Safety Agency misrepresents the story. Some causes may be ‘deeper’ than others in the system – embedded, for instance, in the organisational culture – but they are not ‘truer’. The adverse event arises from the interaction of numerous factors at all levels in the system; changing any one of them may significantly alter the causal interaction of the others and so reduce or increase the probability of error. Devising solutions will not follow automatically from identifying a root cause, but will take intellectual effort and be limited by political and economic factors.

Some organisations have been identified as maintaining a high level of safety, for example US air traffic control and nuclear aircraft carriers. These ‘high reliability organisations’ have several distinctive features (La Porte & Consolini, 1991). They show flexibility and an ability to respond to unexpected events by moving from a hierarchical structure operating with standard procedures to an emergency mode where rank is less important than expertise and rules may be broken in the interests of dealing with the crisis. They also operate with a degree of redundancy of equipment and staff to enable them to cope with unexpected circumstances. They expect errors to happen, and train their workforce to recognise and recover from them.

Overall, the new reporting and investigation system holds out considerable promise of improving services. Homicide inquiries may have been useful in allaying public concerns, but they generated little effective learning. However, the new system requires substantial work both in developing skills for investigating systems and in changing organisational culture to encourage openness about lapses from established protocols and standards. The National Patient Safety Agency, as a centralised body, will also face a significant task in disseminating lessons so that they are absorbed into the operating culture of local services.

DECLARATION OF INTEREST

None.

References

Department of Health (2000) An Organisation with a Memory. London: Department of Health.
Department of Health (2001a) Building a Safer NHS for Patients. London: Department of Health.
Department of Health (2001b) Doing Less Harm. London: Department of Health.
Eastman, N. (1996) Towards an audit of inquiries: enquiry not inquiries. In Inquiries after Homicide (ed. Peay, J.), pp. 147–163. London: Duckworth.
Fallon, P., Bluglass, R., Edwards, B., et al (1999) The Report of the Committee of Inquiry into the Personality Disorder Unit, Ashworth Special Hospital. London: Stationery Office.
La Porte, T. & Consolini, P. (1991) Working in practice but not in theory: theoretical challenges of high reliability organisations. Journal of Public Administration Research and Theory, 1, 19–47.
Munro, E. & Rumgay, J. (2000) Role of risk assessment in reducing homicides by people with mental illness. British Journal of Psychiatry, 176, 116–120.
Reason, J. (1990) Human Error. Cambridge: Cambridge University Press.
Woods, D., Johannesen, L., Cook, R., et al (1994) Behind Human Error: Cognitive Systems, Computers and Hindsight. Wright-Patterson Air Force Base, OH: Crew System Ergonomics Information Analysis Center.