It was a trip to Birmingham along the M40 and then onto the M42. As always, we relied on Google Maps. This has worked fine in the past.
There was a steady stream of lorries to pass. Until, that is, one of the lorries decided it was going to overtake another. Fair enough, you might say, until one realises that lorries these days have speed limiters capping their top speed at 60 miles per hour. For a lorry limited to 60 to pass another lorry doing 60 is a very slow process, even if it is possible at all. Naturally this produced a long tailback, all travelling at about 60 miles per hour.
The impact was that two lanes were now blocked, and I had my first lesson in fluid dynamics. Yes, fluid dynamics, as motorway traffic on a three-lane road can be considered just that. If everybody is doing about the same speed, with no sudden braking, then traffic flows freely. If, however, the traffic is close together and one person applies the brakes a little too heavily, a shock wave travels back down the line and, some way behind, the whole lot comes to a complete halt! This happens even if there is no stoppage up ahead.
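For the curious, the shock-wave effect can be sketched in a few lines of Python. This is a crude toy model of my own devising, not a proper traffic simulation: each driver simply brakes when the gap to the car ahead gets too small and speeds back up otherwise. A single hard brake at the front is enough to bring cars far behind to a standstill.

```python
# Toy car-following model: one hard brake at the front of a queue
# propagates backwards as a "shock wave" and stops cars far behind.
N = 10            # number of cars; index 0 is the front of the queue
DT = 0.5          # time step in seconds
V_MAX = 27.0      # about 60 mph, in metres per second
SAFE_GAP = 30.0   # gap (metres) below which a driver brakes
BRAKE = 5.0       # braking rate, m/s^2
ACCEL = 2.0       # acceleration rate, m/s^2

x = [-35.0 * i for i in range(N)]   # evenly spaced starting positions
v = [V_MAX] * N                     # everyone at the speed limit

min_rear_speed = V_MAX
for step in range(400):
    t = step * DT
    new_v = list(v)
    # the front car brakes hard between t=10s and t=15s, then recovers
    if 10 <= t < 15:
        new_v[0] = max(0.0, v[0] - BRAKE * DT)
    else:
        new_v[0] = min(V_MAX, v[0] + ACCEL * DT)
    # each following car reacts only to the gap to the car ahead of it
    for i in range(1, N):
        gap = x[i - 1] - x[i]
        if gap < SAFE_GAP:
            new_v[i] = max(0.0, v[i] - BRAKE * DT)   # brake
        else:
            new_v[i] = min(V_MAX, v[i] + ACCEL * DT) # speed back up
    v = new_v
    x = [x[i] + v[i] * DT for i in range(N)]
    min_rear_speed = min(min_rear_speed, v[-1])

print(f"slowest speed reached by the last car: {min_rear_speed:.1f} m/s")
```

In this crude model the disturbance grows as it travels back: the front car never fully stops, yet the cars at the rear are brought to a halt, which is exactly the motorway experience of stopping dead with no stoppage in sight.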
But back to the journey in hand. When one gets to the outskirts of Birmingham there is a junction where the M40 splits into the M42 North and the M42 South. We had always gone South, but on this journey Google Maps made no clear announcement. We had driven, as we later found, into a Google black hole, so to speak.
It was only at the next junction, when we were advised to leave the M42 and take the 5th exit to rejoin the M42 going North, that we discovered we had taken the wrong branch. We had assumed there were no errors in Google Maps! (Assume: famously, making an Ass out of U and Me.)
Luckily this was not a serious error, and we got to our destination, a sports club, fairly soon afterwards. As it happened the delay was helpful, as it meant that we missed the Aqua Sports session that was occupying the pool and deafening all present. So a few minutes' delay arising from the programming of Google Maps actually helped. The simple lesson: although the Google computer made no comment, it could still be wrong.
Such a small programming issue might not have been so simple had we been travelling on the M1 a little further north, at Kegworth, on 8th January 1989.
On that night a new British Midland Boeing 737-400 crashed, killing 47 people; 74 more sustained serious injuries. As was shown at the later inquiry, it was a small design change that led to the disaster.
At the time this was among the newest of the Boeing 737s, one of the 400 series, with the earliest models of the series having entered service only four months before. This particular plane had been in service just 85 days and had flown only 521 hours.
On a flight from London to Belfast, and despite its newness, a fan blade in the left engine became detached. This led to vibration, with smoke and fumes entering the cabin. The plane was diverted to East Midlands Airport at Kegworth.
Then a crucial error: the Boeing 737-400 had been redesigned so that air to the cockpit was a mixed supply from both engines. This was new, and it led the captain to conclude, incorrectly, that the still normally functioning right engine was the one to blame. According to the account on Wikipedia, the pilots were also interrupted while checking the status of both engines by a radio message from East Midlands Airport.
They then shut down the undamaged right engine and increased fuel to the damaged left one, leading to a large fire. The plane crashed just short of the runway at Kegworth, on the embankment of the M1 motorway.
The pilots were held responsible, despite the fact that the alteration to the air feeds from the two engines had not been disclosed to them.
So where does the Swiss Cheese come in? This is a well-known model in safety analysis, first proposed by James Reason. It assumes that all control systems have flaws, and that a problem occurs only when several of these flaws occur together, pictured as the holes in successive slices of Swiss cheese all lining up. The model can be applied to all types of processes, but it is most critical where failure creates a safety issue. It accepts that human errors will occur no matter what systems and training are used. The controlling feature is that other steps in the process are set up to detect and stop these errors and prevent disaster. The accident occurs only when all the errors, i.e. the holes in the Swiss cheese, line up.
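The arithmetic behind the model is striking, and can be sketched with a few lines of Python. This is a simplified illustration that assumes the layers fail independently of one another, which real defences often do not:

```python
# Swiss Cheese sketch: an accident needs the holes in EVERY slice
# (defence layer) to line up at once, which is rare even when each
# individual layer is quite leaky.
import random

random.seed(1)  # fixed seed so the run is repeatable

def accident_occurs(hole_probabilities):
    """One trial: the accident happens only if every layer fails."""
    return all(random.random() < p for p in hole_probabilities)

# four defences, each failing 10% of the time on its own
layers = [0.1, 0.1, 0.1, 0.1]

trials = 100_000
accidents = sum(accident_occurs(layers) for _ in range(trials))
rate = accidents / trials

print("each layer alone fails 10% of the time")
print(f"accident rate with four layers: {rate:.4%} (theory: {0.1**4:.4%})")
```

Four defences that each fail one time in ten combine, in this idealised picture, into an accident roughly one time in ten thousand; the danger in real systems is that the holes are often not independent at all.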
So what links all this together? Well, we had a minor navigation issue with Google Maps; however, we were guided back onto the correct route soon enough, and actually avoided a minor annoyance by arriving that little bit later.
In other areas of our lives, however, there are many more routes and therefore many more possible issues. This matters more as more processes are guided by computer algorithms that are not visible to us, which means it may not be easy to ensure that a dangerous combination of circumstances does not come together and lead to a potential disaster. Think of the multiple layers of process and programming on which so much of our lives depend. How often have you left the house and left a key item behind? Or worse, started to pay somebody online and put in the wrong details? Trying to test all the possible combinations becomes ever more difficult, and the law of unfortunate consequences ensures that the one combination not tested is the one that proves critical on the fateful day, usually Friday the 13th.
It can come to feel quite close and personal. I have been very conscious of this on our recent flights with TUI and Ryanair.
In each instance, the plane concerned was a Boeing 737-800. Following the crash of two planes of the similar-sounding Boeing 737 MAX 8, apparently due to design issues, that model had been grounded worldwide ever since, so I was not clear why our 737-800s were still flying.
As a traveller I was in no position to question why the 737 MAX 8 was grounded and yet we were flying in an apparently similar 737-800; the two are in fact different variants, and the design issue lay with the MAX.
At such a time one can only put one's trust in higher authorities, both human and religious. I did, however, pay that bit more attention to the safety demonstration.
So far all has been well but it just signifies that one should always live life to the max as one cannot be sure what the future has in store for one.
If you have time I would recommend reading more about the Swiss Cheese Model in James Reason's book, Human Error. Reason also set out the 12 Principles of Error Management.
Although the classic high-risk industries are far removed from our everyday actions, the principles and practice of this approach are still very relevant to everyday life.
In business, politics and economic affairs too, a problem can occur when one small error starts off a chain of issues. It is often said that the next economic recession occurs when all those who were there last time have retired, setting the scene for the problems to be repeated.
I hope you have the good luck to enjoy a happy retirement.