More reports = Higher risk?

I never thought I would ever publish a clip from Fox News. Well, I was wrong. Below you'll find a clip in which Fox reports that the number of reported incursions in American air traffic operations (between aircraft, between aircraft and ground vehicles, and between aircraft and luggage left lying on the taxiways) has increased dramatically over the last few years.

Now, there are (at least) two ways of telling this story, and Fox actually tells both. The first way is to sound terrified and treat it as a trend of decreasing safety (or drift into failure, if you like). However, there is also a second way. Note how the report turns toward interpreting the increased numbers as a sign of an improved safety climate, with controllers more willing to send in their reports without fear of being punished for their content.



In complex high-risk systems, we need information to flow about upcoming risks and about problems that occur or might occur. These systems need the people working within them to trust that they can share safety-related information with their organization without the risk of being held accountable, through sanctions or punishment, for the content of their report. And don't believe in our ability to sort out the reports that reveal negligence or violations: those are not clear definitions but complex judgment calls (and of highly doubtful value for explaining any behavior in complex industries). The FAA seems to be reporting a nice trend in this regard.

However, in Sweden (and especially in healthcare), we still tend to take a high number of reports as evidence of high risk. Just have a look at this article from Aftonbladet, which concludes that there is a perfect correlation between the number of reports and the level of risk.

I concluded above that people need to feel that their reports will be treated fairly, without the risk of being punished for the content of the report. Well, we have some way to go there. The worst case in Sweden in the last couple of years must be the nurse who reported the risk of committing a specific medication dosage "error". Later she actually made just that "error", and in the investigation she was judged as even more negligent precisely because she had earlier reported the problem. ("You, if anyone, should have known and tried harder to get this right. You had even discovered the problem, goddammit!"). The case, which involves the same baby who later suffered severe brain injuries and died, and for which a Swedish doctor is now on trial accused of having killed her before she died naturally, is described at the Swedish Läkartidningen here.

The problem of creating this reporting culture, so vital for the safety of our high-risk systems, is often labeled Just Culture. I'll end this post with a short clip of Sidney Dekker explaining the principles of a just culture. It is horribly edited, so just close your eyes and listen :)
