Ape happened to stumble across this blog.
The author recounted an incident on the Bt Panjang LRT when a train door opened while the train was moving. For the benefit of readers who may not be aware of the background and chain of events: the LRT is a driverless, short-distance train. On that fateful day, due to certain issues, the train failed to operate automatically. Procedures call for a driver to take over and drive it manually. Although the driver ensured the doors were closed prior to moving off, he failed to check that they were locked. As a result, a door opened and the fail-safe mechanism automatically applied the brakes, forcing the train to a stop. Following that, the driver checked a second time, ensured that the doors were closed and locked, and moved off. Fortunately, no one was injured as a result of this chain of events.
What caught ape’s attention is that disciplinary action was taken against the driver. The key word here is ‘disciplinary’. Ape is in the line of safety, and whenever the words ‘disciplinary actions taken on operator’ pop up, like the fail-safe brakes, ape automatically raises his eyebrows.
Back in the dark ages of safety management, or rather the lack thereof, whenever an incident or accident occurred, blame was almost always apportioned to the operator. It could be the driver, the ship or plane captain, the engineer, etc. Since the human operator was to blame, well, disciplinary actions were taken against the operator. However, to err is human, as the cliché goes. To prevent the mistakes of an operator from causing harm to productivity or human lives, machines were built and designed with as much automation as possible, thus removing the human element and human errors. No human, no error. Perfect world, right?

Things are not so simple. Machines can fail, and sometimes with disastrous results. Complex ‘machines’ designed, built and managed by experts do fail, as in the case of the Space Shuttle Challenger, the Chernobyl nuclear power plant or, in more recent years, the Fukushima nuclear power plant. The failure of the Fukushima plant was attributed to natural disaster, but humans were involved in recovering from it and containing further failure as much as possible. Furthermore, complex systems are designed and built by humans. An error introduced at the design and building stages can lie dormant until that fateful day. The point ape wants to make is that we can never remove the human element and, with it, potential human errors.

The question is: whenever human error is involved, is it just or fair to impose disciplinary action on the ‘erring’ human? To err is human. If we accept that everyone who errs ought to be punished, including the possibility of job termination and criminal charges, then once such people are removed, there should not be any more of these problems, right? Yet accidents continue to occur, and often they involve humans who ‘erred’. Take a moment to pause and think. In spite of disciplinary actions, and even threats of criminal charges in some professions, why do people continue to ‘take short cuts’ or ‘take the easy way out’ or ‘forget’?
Remember, to err is human. Those in the line of psychology or human factors will tell you that the very traits that make humans succeed and rise above animals, such as adaptability, are also the very reason humans err. Yes. Humans adapt, if you’ve not realised it yet. Humans adapt under stress. Humans adapt when needed tools are not available. Humans adapt when they race against time. Humans adapt when there are conflicting priorities, such as ‘get the train moving’ vs ‘check and double check’. Humans adapt when procedures are not clear. Humans adapt… you get the point yet?
Well, fortunately, in current safety management, operator error is not the end of an investigation but the beginning. We are never satisfied with the simple answer of operator action or inaction. We delve deeper. What could have caused the operator not to check that the doors were closed and locked? Were the procedures clear? Was the driver receiving instructions that caused him to overlook the check? Was he trained properly? Were there other alarms or signs that could have warned him that the doors were not locked? Could the system be designed so that the train cannot move in the first place until the doors are closed and locked? Simply put, what caused the operator to commit the error? To be fair to SMRT, who manages the LRT, at least they are looking into redesigning the system to address human errors. However, what if the procedures were not clear, or the driver wasn’t trained properly? Does he deserve to be ‘disciplined’? The question of culpability has to be addressed. How then should management decide when disciplinary action should be taken?
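That last design question, building the system so that the train simply cannot move until every door is both closed and locked, is what engineers call an interlock. As a rough illustration only (the names and structure below are ape’s own invention for this post, not SMRT’s actual system), such an interlock might look like this:

```python
# A minimal sketch of a door-traction interlock. The idea: traction
# power is refused unless EVERY door reports both closed and locked,
# so the "driver forgot to check" error simply cannot move the train.
from dataclasses import dataclass


@dataclass
class Door:
    closed: bool  # door panel is shut
    locked: bool  # locking mechanism is engaged


def traction_permitted(doors: list[Door]) -> bool:
    """Allow the train to move only when all doors are closed AND locked."""
    return all(d.closed and d.locked for d in doors)


# One door shut but not locked -> the train refuses to move.
doors = [Door(closed=True, locked=True), Door(closed=True, locked=False)]
print(traction_permitted(doors))  # -> False
```

The point of such a design is that safety no longer depends on a human remembering a second check; the check is built into the machine itself.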
Ape will just end this post here with this last question for readers to think about.
Also, here’s an interesting chart for readers to refer to. The chart was developed by a psychologist dealing with human errors and promoting a just culture. It is meant to help people like ape determine when disciplinary action should be taken when an operator commits an error or a violation (of prescribed SOPs).