CHRISTOPHER GALE

Yes, so as the introduction said, I'm Chris Gale and I'm not a biomedical engineer at all. I'm an avionics technician by background, but engineering is engineering. And I have a passion for this human factors thing. Now, I really hope that everyone here has an underpinning knowledge of human factors. If you don't, I do find that quite worrying, so please come and find me at stand E12 and we can discuss training in human factors. One thing I've noticed with human factors training, particularly through aviation, which is where it was taken from and carried across into healthcare, is that it's a very reactive thing. If we look at human factors, what we tend to find is we're told that no matter how right we are, we're probably wrong.

And it's part of the human condition to err. Human factors loves a model. It loves a picture and a diagram. Unfortunately, the training you receive is now delivered very much on a mandatory-training model, by videos, which is not an effective way to train at all, and which actually builds human factors issues into the human factors training itself. Everything we do is affected by something, and this is highlighted in the SHELL model: you in the middle, the person, are directly affected by everything you interact with. Be it the software, the hardware, the actual physical item you're working with, the environment you're working in, or the people you work with. Within human factors we're also shown other models. We're shown things like the error chain, which tells us that when an incident occurs it's never a single event; there's always a build-up, always a chain of events leading to that error.

From that error chain, which is an error management model and is always applied after an incident, we build the barrier model. The barrier model is there, after an investigation, to let us build defences against that error happening again. But by virtue of the fact that humans build those barriers, there will be inherent flaws within them. That leads us to the next model. If you're getting the theme here, human factors does love a pictorial diagram. It's the Swiss cheese model. Has anyone here heard of the Swiss cheese model? It's quite famous in human factors. It states that because humans build these defensive barriers, there will be flaws, holes, in those barriers, and if the holes line up, the accident will still occur. And ultimately that's what all of us are here for, isn't it: to engineer equipment to make it as safe as possible, because the people using the equipment you work on are normally at their most vulnerable.

So the job you have is very critical. Now, in order to continue our work, we want to report things. Errors actually are, or can be, quite a good thing, because we can learn from them, and by reporting them we fill the holes in that Swiss cheese model and rebuild the barriers. When we talk about human reliability, it's stated that a human is only 80% reliable at best. I'm not insinuating that one day in five everything you do is going to go wrong. What I'm saying is that at any given moment, because of the way the human mind works, there is a chance we will induce an error. Now, human memory splits into two parts: episodic memory and semantic memory. The episodic memory is exactly that: it's an episode.

Now, experience is considered a protector. Most of us here are quite experienced in our roles; we're quite confident in our ability. But it isn't always a protector: the episodic memory plays tricks on us. The human brain is inherently lazy. It wants to protect us. Technology has advanced a thousand times over in the last few years alone, but how much has the human advanced? We are still the same base-level humans we were hundreds of years ago. This episodic memory is there as a protector. It's there to make us think back to the times before we had this modern world: if we saw a pair of yellow eyes glowing from a dark cave, don't go near it, it's probably going to eat you. The episodic memory would overlay that warning. But in the modern world, with the advancement of technology, think about some of the equipment you use to test the medical devices you work on.

If you've done a task hundreds and hundreds of times and always had the exact same positive outcome, then should that test one day not go the way it was supposed to, your episodic memory could overlay a positive memory and you could actually miss a failure. Another model that human factors likes is the iceberg model, the Heinrich model. Has anyone come across this one before in human factors training? Classic example: we only see the tip of the iceberg. That's the patient fatality; that's the serious incident. And it's stated in the Heinrich ratio that for every one fatal accident there are 300 near misses. That's a worked-out average. This is taken directly from aviation, where a near miss might mean an engine almost failed. That's not too bad. But think about the situations some of you are in. Think about a patient on a ventilator. A near miss could mean a few moments without oxygen to that patient. That's a devastating effect.

So my point is: why do we have this human factors training? Because it's telling us things are going to go wrong. And why are all the models reactive? Why do we have to wait until an error occurs to do an accident investigation, to build an error chain, to build a barrier model which we know will be inherently flawed because we are humans? And then wait for the near misses to happen so we can report them and rebuild that Swiss cheese model into an effective barrier model? I would much rather we took a proactive stance. And one of the most effective proactive things we can do against human factors error is engage in training. But that training needs to be correct. A lot of the training we have nowadays is mandatory training, videos, annual training, those kinds of things. Lip service is paid to them, which introduces its own human factors error.

Before we move on to this slide, I just want to pose something to you. As medical engineers, do you have to be competent? Would you all agree that you have to be competent? Competence is defined as the correct theoretical knowledge plus practical ability. So let's take it away from medical engineering and think about a professional footballer. What do footballers have to do? They have to train. But as soon as that footballer gets his first start in the Premier League, does he stop training? I know: he watches a 20-minute video once a year and then emails HR to tell them he's watched it, yeah? Or he goes off with a friend he plays football with and they have a little kick-about in the park, showing each other what to do. That wouldn't be acceptable for a professional footballer, would it? So why is it considered acceptable for such a critical job as medical engineering?

So let's take a step further and look at this slide. This is called the Dirty Dozen. Has anyone come across the Dirty Dozen before? It's the twelve human factors most likely to cause error, and I'm going to look at a few of them now. First, complacency. As I mentioned earlier, we have the episodic memory, and this is where complacency comes in. It doesn't matter how well trained you are. It doesn't matter how long you've been in the industry. You will always lose focus at times, by virtue of the fact that you're human. Complacency is often misunderstood; people think it means someone is lazy. But because the human brain is inherently lazy, it will try to find the easiest way to do things, and so complacency and a lack of focus will creep in. I said earlier that experience is often considered a protector, but it can have the adverse effect: experience can actually make us complacent. Not bad at our jobs, but we lose sight of why we are physically doing that task, and ultimately that is to produce safe equipment to save lives.

Lack of knowledge. Now, this can be quite a controversial slide. I am by no means saying that anyone here has a lack of knowledge, but what you will have, as we all have, is gaps within your knowledge. As well as the episodic memory we also have the semantic memory, which is just facts. It is said that your semantic memory can hold an effectively infinite amount of information, as long as you deem it relevant. These gaps in knowledge, rather than a lack of knowledge, can occur simply because the training you carried out early in your career was a long time ago. Has anyone ever been unable to remember something and then, all of a sudden, sat bolt upright at three o'clock in the morning and remembered it? That's your mind re-establishing the neural pathways to that piece of information. So it is in there somewhere. But can we afford that delay when we're trying to service a piece of equipment?

Also, with the advancements in technology, specific training to keep that knowledge current will protect against human factors errors. Stress and pressure, then. I'm going to ask a question, and please do shout out a response: is stress a good thing?

DELEGATE

Yes.

CHRISTOPHER GALE

The answer is no. Stress is not a good thing at all, and there is a massive misconception about stress out there. Stress is the body's or mind's adverse reaction to pressure. So the answer to the question is: stress is bad, pressure is good. If we look at this slide, this is the pressure-performance graph. On the right-hand side is what everyone presumes stress is: too much pressure applied to you, giving high stress, anxiety, task fixation, an inability to carry out tasks, that kind of thing. But if we look at the left-hand side of the graph, we see the other end of stress, called hypostress, where the pressure exerted on an individual is not enough for them to work in the area of best performance. This leads to complacency and boredom, and people in this state are more likely to cut corners because of the lack of pressure on them.

Think back to the start of our careers, when we were out there receiving training: we had pressure exerted on us. We had a drive, we had a passion, we were in the area of best performance; it was great. But think of the jobs that come up that are repetitive in nature, that we've done time and time again. You can fall back into this area of hypostress, then move back into the area of best performance, and then back into hypostress throughout the day. However, when we engage in training, it applies pressure to us. It reinforces our semantic memory, it reinforces our episodic memory, and it puts us in the area of best performance. It rejuvenates our minds to realise the importance of the task we are carrying out.

The next one I want to go into a little is lack of awareness. Again, everyone here is a professional, everyone understands, but I'm talking about a desensitisation to the actual task you're doing, and that can cause a problem, mostly because we need that desire, that want, that sense of why we are doing things. Through lack of awareness of the actual situation, a piece of equipment becomes just a piece of equipment. But if we can keep in mind what it does, how it works and why it matters, and keep the patient in mind as well, we'll have an overwhelming awareness of the greater good we're doing, and therefore we will be better engineers and increase patient safety. Because that's everyone's concern.

And the last one to look at is norms. Norms are just 'the way we do it around here'. So I'm going to put a little example out there. It's five PM at a very busy train station in India. The inside of the train is absolutely full; you could not fit another sharp-suited, briefcase-carrying businessman inside that carriage if you tried. Does someone want to shout out? What do people do in order to catch that train?

DELEGATE

Climb on the top.

CHRISTOPHER GALE

Climb on top of the train. Yes, they do. So let's flip that. It's five PM at London Waterloo. There is not a single bit of room left inside the carriage at all. What does absolutely no one do?

DELEGATE

Climb on top.

CHRISTOPHER GALE

Climb on top. Why? Why don't they climb on top? Anyone? Because it's illegal and it's really dangerous. So think about that: in one place, an act which is very dangerous is considered completely normal, that's great, that's fine; in another place it's considered very dangerous. If we take that into an organisation, we can have unsafe practices normalised simply because that's how they do it round there. And this effect is escalated by training carried out within an organisation, because that passes on the normalisation of unsafe behaviour. Being trained by someone we work with day in, day out also doesn't apply a great deal of pressure to us.

The last little bit is something called situational awareness. Situational awareness is basically knowing what's going on around you, knowing the bigger picture. We say that quite a lot; it's a phrase that gets thrown out, thrown out by management all the time: let's see the bigger picture. But PCP, perception, comprehension and projection, is how it actually works. What we need to be aware of, the cornerstone of this pyramid, is perception. Whenever we deliver training to an individual, we are using our perceptual set of what we think they should know and trying to communicate it to a person with their own perceptual set of what they think they should glean from it. Our comprehension of what we are trying to get across is not always the comprehension the individual takes away. Therefore the projection, of either the importance of the task or what will happen, can be skewed from individual to individual. This is further highlighted when training is carried out by very experienced individuals, because it brings their human factors into the training, which passes those human factors errors across the entire staff.

So my last point: training, in my opinion, is one of the most important things we can do to reduce human error and increase patient safety. But what kind of training should we do? It needs to be measured, and it needs to be the correct training. It needs to be done in such a way that the correct amount of pressure is applied to people. It needs to involve peer-to-peer assessment. It needs to be done at a level where people are empowered to do things. Now, there is a massive appetite for continual professional development and for climbing the numerical academic system. However, the higher up the numerical academic system we go, the less practical ability is involved and the more theoretical knowledge. So when you think about your staff, and when you think about yourselves, do not be afraid to drop back down the numerical levels of education. Do not be afraid to recertify in things you did many years ago, in order to make yourselves and your staff the most effective engineers you can be.

 

Christopher Gale's presentation at the EBME Expo: 'Human Factors' Training to Reduce Errors

 
