Have you ever wondered what would happen if you were in a lift and the cable were to snap?
Perhaps you imagine a brief moment of terror before you free fall to your doom?
Well, fortunately, no such thing would happen. This is because of the “Otis safety brake”. The safety brake is an excellent example of a fail-safe system. Indeed, before the invention of the Otis brake, buildings were limited to a maximum of seven storeys in height, as lifts were considered too dangerous.
The safety brake is an ingenious design. As long as the lift cable is pulling against the weight of the lift, it lets the lift move up and down the shaft. If the cable breaks, the tension is lost, the brakes spring closed, and the lift stops moving.
In this way, it is a “fail-safe” device. Failure of the cable causes the device to default to its “safe mode”.
Systems, too, can be designed to be fail-safe. A fail-safe system is one in which, when an anticipated failure happens, the system defaults to a safe course of action. In a lift, sooner or later the cable will wear out; when it does, the brakes are applied automatically. In system design, this means that if an expected step in a workflow doesn't happen, the whole process should stop.
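The principle translates directly into software. As a rough sketch (the function and values here are illustrative, not taken from any real system), a fail-safe workflow step proceeds only on an explicit positive signal; silence, lost messages, and anything unexpected all fall through to the safe default of stopping:

```python
from typing import Optional

def fail_safe_step(confirmation: Optional[str]) -> bool:
    """Proceed only on an explicit 'ok'.

    A lost message, a timeout (confirmation is None), or any
    unexpected value all default to the safe outcome: stop.
    """
    return confirmation == "ok"

# Silence is treated exactly like a refusal:
assert fail_safe_step("ok") is True
assert fail_safe_step(None) is False      # no reply -> stop
assert fail_safe_step("maybe") is False   # anything unexpected -> stop
```

The design choice is that the dangerous branch requires positive evidence, while the safe branch is what happens by default.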
The WHO theatre checklist is an excellent example of a fail-safe human system. If an item on the checklist is not completed, the procedure should not go ahead. These checklists have been shown to significantly reduce complications from operations.
So it may surprise you to learn that in the UK, when it comes to gun licensing, we do not have a fail-safe system...
In fact, we have a fail-dangerous system, which is ironic, as these changes were supposedly introduced to improve gun safety.
Dangerous system design
Currently, when someone applies for a gun licence, the police write to their GP. In their letter, the police request that the GP search the patient's records to see if there are any previous issues that may mean that the applicant isn’t suitable to own a gun. So far so good…
However, the next step in the process is where things start to go wrong. If the police don’t get a reply from the GP, they assume that everything is ok and issue the licence.
I think you can see the problem here. If the police don’t receive a reply, they assume that everything is ok. So if the request never reaches the GP for some reason, the licence goes ahead anyway. If the GP objects, but the reply gets lost, the licence goes ahead anyway. If the GP forgets to look, the licence goes ahead anyway...
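The difference between the two designs comes down to which way the default points when no reply arrives. As a hypothetical sketch (the function and value names are mine, not from any real licensing system), the current process issues the licence unless an objection arrives, while a fail-safe version would issue it only on an explicit all-clear:

```python
from typing import Optional

def fail_dangerous_licence(gp_reply: Optional[str]) -> bool:
    """Current design: issue the licence unless an objection arrives.

    A lost letter or a forgotten search (gp_reply is None)
    silently defaults to 'issue'.
    """
    return gp_reply != "objection"

def fail_safe_licence(gp_reply: Optional[str]) -> bool:
    """Fail-safe design: issue only on an explicit 'no concerns' reply.

    Silence halts the process until the reply is chased up.
    """
    return gp_reply == "no_concerns"

# The two designs agree when a reply arrives...
assert fail_dangerous_licence("objection") is False
assert fail_safe_licence("no_concerns") is True

# ...and only differ when the reply goes missing:
assert fail_dangerous_licence(None) is True   # licence issued anyway
assert fail_safe_licence(None) is False       # process stops
```

In the fail-safe version, a lost letter costs time while the reply is chased up; in the fail-dangerous version, it costs safety.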
The letter also states that if the GP doesn’t object to the licence, they must record in the patient’s notes that the patient holds one. This puts the onus on the GP to report to the police if they later develop concerns about the patient.
Designing around "human factors"
In this scenario, we are relying on a GP surgery to act perfectly. We’re relying on the GP to never get distracted. We’re relying on the mail never to lose a letter. We’re relying on the GP never to overlook something.
You may well be thinking that this is a really important issue so the GP should just pull their socks up. This is an important issue, but so are many other things that a GP has to deal with every day.
GPs are human. Humans are frail. We all make mistakes. And just as we have designed our lifts to take account of the cable snapping, isn't it about time that we designed our safety systems to take account of GPs making mistakes?
We'd love to hear your thoughts on this. Let us know in the comments.