The SS Torrey Canyon was a massive supertanker, almost 300 meters in length and 40 meters wide, first launched in 1958. In early 1967, the ship left Kuwait loaded with oil, headed for Milford Haven, a port in Wales. The route took her past the Isles of Scilly, located around 24 miles west of the coast of Cornwall. When sailing towards Milford Haven, ships can choose to go west of the Isles of Scilly, into the deeper waters of the Atlantic Ocean, or to the east, squeezing between Cornwall and the Isles. The eastern route is much faster, as it is a straighter line between the two points, but the narrow channel holds a variety of navigational hazards, the most famous of which is the Seven Stones reef. At nearly two miles long and one mile wide, the reef is a well-known danger, responsible for over 200 wrecks in its history.
Running late, and in danger of missing their docking time at Milford Haven, Torrey Canyon’s shipmaster Pastrengo Rugiati made the decision to take the shorter route, hoping to shave valuable time off the voyage. As Rugiati and his crew began their preparations, they realized they didn’t have the right charts; Rugiati was aware of the danger of the rocks, and aware that his charts were not sufficiently accurate, but decided to continue. As they drew closer to the channel, Rugiati and his crew noticed it was full of smaller fishing boats, but still they continued. As they entered the channel, there was confusion as to the ship’s exact location, and maneuvering was made difficult by both the proximity of other vessels and Torrey Canyon’s huge size. Eventually, and perhaps inevitably, the ship struck the Seven Stones reef, spilling its cargo of oil and causing a huge environmental disaster.
Sticking to the plan
As profiled by Tim Harford in his excellent podcast series Cautionary Tales, Rugiati was a victim of plan continuation bias: an unconscious cognitive bias to continue with the original plan in spite of changing conditions and updated information. In this case, that meant sticking with the shorter route despite more and more signs that it was not the optimal one.
It’s a well-established issue in the aviation industry, where it is often termed get-there-itis. Pilots who can see the target airport in front of them will often make relatively poor judgements about the safety of the weather in support of their decision to land; they have a plan, and they want to stick to it. Whilst the safer alternative would be to abort the landing or divert to a different airport, that process can be a hassle. According to a NASA study, plan continuation bias was a causative factor in 75% of the airline accidents studied between 1990 and 2000.
The godfather of human error research, Sidney Dekker, writes in his book that plan continuation bias occurs because an individual’s understanding of a given situation gradually diverges from the situation as it actually unfolds. There are two key drivers of this: early information suggests the plan is safe, and this information is clear and unambiguous; later information, which suggests the situation is changing and potentially becoming unsafe, is weaker, more ambiguous, and harder to interpret.
Plan continuation bias in sport
It’s easy to see how this can play out in sport. If you’re a team sport player in a speed/power-orientated position, you have to undertake high-speed running regularly in your training. Sprinting is generally safe, but it is also when the majority of non-contact hamstring injuries occur. Most coaches are aware of this, and so try to program sprint training when the athlete is fresh. There are plenty of early pieces of information suggesting that the plan is safe: experience, which tells us both that we need to sprint and that sprinting is generally safe, along with the coach’s visual perception that the athlete is moving well and appears to have good levels of energy. As the session gets closer, more information becomes available; the athlete’s load-monitoring values suggest they are perhaps moving towards a “red zone.” This is ambiguous information (most athletes in the red zone don’t suffer an injury), so how much weight should the coach place on it? As the session progresses, the athlete starts to mention general leg soreness, but not in a vocal manner; is this just normal complaining, or something more? The coach processes all these signals, but decides that (a) the original plan called for sprinting in this session, and (b) the player needs to improve their speed. As a result, the training progresses as planned, and the player pulls their hamstring.
Preventing plan continuation bias is really difficult, primarily because “disaster” will not always happen, and so, in sport, we have to accept some risk. It’s also difficult because of the lack of clear cues that an increased risk of disaster might be present; if an athlete told us that their hamstring was very sore, we’d obviously stop the session, but a slight deviation in monitoring scores and general comments about soreness can be hard to interpret. One possible method is to use multiple modes of information and look for consistency between them. This can even include people: if a number of people are involved in a decision, each has their own perception, which we can (hopefully) use to calibrate our thinking.
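The multi-signal idea can be sketched as a simple decision rule. The signal names and the two-of-three threshold below are illustrative assumptions for the hamstring scenario above, not a validated protocol:

```python
# Hypothetical sketch: combine several weak, ambiguous signals and act on
# their agreement rather than on any single one. The specific signals and
# thresholds here are invented for illustration.

def sprint_session_check(load_in_red_zone: bool,
                         athlete_reports_soreness: bool,
                         coach_concerned: bool) -> str:
    """Return a session recommendation based on how many risk signals agree."""
    flags = sum([load_in_red_zone, athlete_reports_soreness, coach_concerned])
    if flags >= 2:   # two or more independent signals agree: change the plan
        return "modify"
    if flags == 1:   # one ambiguous signal on its own: proceed, but watch closely
        return "proceed with caution"
    return "proceed" # no signals: run the session as planned

# Red-zone load plus reported soreness is enough agreement to modify the plan,
# even though neither signal alone would be.
print(sprint_session_check(True, True, False))   # -> modify
print(sprint_session_check(True, False, False))  # -> proceed with caution
```

The point of the sketch is that no single ambiguous cue forces a decision; it is the consistency across independent sources of information that triggers a change to the plan.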
Finally, as with all cognitive biases, just being aware of it lets us ask “is this decision the right one, or am I falling victim to plan continuation bias?” A simple question, but one which, if asked, might stimulate some deeper thinking, and hopefully allow us to avert our own disasters.