When Evidence Hides in Plain Sight
Three Ways Organisations Can Outsmart Confirmation Bias
At 03:45 on Thursday 8 November 2018, an officer entered the bridge of the frigate HNoMS Helge Ingstad to take over as officer of the watch (OOW) for the 04-08 watch[1]. At this time, the frigate was proceeding south through the Hjeltefjord on Norway’s west coast, approximately 7 nautical miles (nm) north of the Sture terminal. The frigate’s participation in the NATO exercise Trident Juncture 2018 had ended the previous day, and the voyage was being used for crew training in inshore navigation. It was a starlit night, the sea was calm, and visibility was good.
Between 03:45 and 03:53, the oncoming OOW and the OOW being relieved carried out the handover procedure on the bridge. The outgoing OOW briefed his successor on what had been done during the watch, the plans for the hours ahead and the forecast weather conditions. Together they reviewed the bridge systems, radar and communication settings, and discussed the traffic in the fairway. On the port side of the frigate’s course line, three northbound vessels were approaching. These were acquired on the frigate’s radar[2]. The OOWs also noted a brightly lit stationary object at or near the Sture terminal, to starboard of the frigate’s course line. They discussed whether the object could be the terminal’s quay, or possibly a fish farm or a platform. The object transmitted AIS signals[3] but no speed vector, so the OOWs assumed it was stationary. Consequently, the object was not tracked on the frigate’s radar. As we now know, the object was the oil tanker Sola TS, departing from the Sture terminal. In the minutes that followed, the OOWs’ mental model - that the object was stationary - proved firmly established, shaping how the oncoming OOW and his team interpreted all subsequent information.
After the handover was completed at 03:53, the OOW focused on the three vessels approaching from ahead on the port side. The OOW, along with the other six watchstanding personnel on the bridge, visually observed the yellow floodlights from the object on the starboard side. The only crew member who realised that the floodlights belonged to a vessel was the helmsman (HM). However, the HM assumed that the OOW was aware of this and could see the vessel on the AIS. He also believed that there was sufficient passing distance to the vessel on the starboard side.
At around 03:58 the OOW noticed that the object on the starboard side appeared to be closer to the frigate’s course line than first assumed. At 03:59, the OOW ordered a slight adjustment of course to port. At 03:59:56, HNoMS Helge Ingstad received a VHF call from the pilot on board Sola TS, instructing the frigate to turn to starboard immediately. However, the OOW interpreted the call as coming from one of the three northbound vessels, which he believed wanted the frigate to turn further to starboard to increase the passing distance. He replied that they would turn a few degrees to starboard after passing the platform on their starboard side. A minute later, HNoMS Helge Ingstad received another VHF call, this time from the Fedje Vessel Traffic Service (VTS)[4]: “Helge Ingstad, you must do something. You are getting very close”. At this point, the distance between the two vessels was 250 metres. The OOW, standing next to the VHF radio handset on the starboard side of the bridge, suddenly realised that the object was moving and that they were on a direct collision course. At 04:01:03, the VTS operator made another call: “Helge Ingstad, there will be a collision”. The OOW ordered rudder 20° to port, understanding that it was too late to turn to starboard. Despite this action, at 04:01:15 the two vessels collided outside the Sture terminal. Fortunately, no lives were lost. However, the salvage operation cost over 700 million Norwegian kroner (NOK), and the frigate, which had cost 4.3 billion NOK to build, was ultimately scrapped.
The accident investigation identified a diverse set of factors contributing to the accident. One of these was the OOWs’ firm mental model that the object was stationary. In hindsight, it is easy to see that information indicating the object was a vessel existed, but it was neither noticed nor correctly interpreted at the time. This phenomenon - information hiding in plain sight - has been a feature of many accidents, including fatal ones. For example, in the hours before the Macondo blowout in 2010, the crew failed to perceive warning signs such as pressure increases and anomalous readings[5]. Similarly, in the 1994 accidental shootdown of U.S. Black Hawks over northern Iraq, the pilots of two U.S. Air Force F‑15 fighters did not perceive information contradicting their assumption that the helicopters were hostile[6].
How can information be hiding in plain sight?
We are all continuously trying to make sense of our surroundings as we go about our daily lives. Scholars refer to this human quest for meaning as sensemaking[7]. Sensemaking is not about perceiving all the information available in our environment. Instead, we pick up cues that seem relevant, build a plausible mental model of what is happening, and move on. The cues we notice are largely shaped by what we expect to find, our prior understanding of the situation and our previous experiences. Often, we explain away cues that contradict our understanding, while paying attention only to those that support it. This is known as confirmation bias[8]. Our brains constantly filter and direct attention toward information that seems relevant to what we already know or the decisions we have already made. In doing so, we frequently overlook a vast amount of information that does not fit our current view - even when that information might be crucial.
What can your organisation do to outsmart confirmation bias in day-to-day decision-making?
Vysus Group has extensive experience with risk management across diverse sectors. We are also inspired by research on high-reliability organisations (HROs)[9]: organisations that work in complex, high-risk environments and consistently manage to operate safely and effectively over long periods. These organisations are described as being constantly aware of the possibility of failure. Against this background, we recommend building and maintaining cognitive redundancy - the capacity for multiple, overlapping perspectives - as a practical and effective defence against confirmation bias. This approach helps teams detect errors and surface alternative perspectives: when one individual’s or team’s sensemaking is biased or limited, others can identify discrepancies and offer different interpretations, reducing the influence of confirmation bias on decision-making.
Cognitive redundancy - how people think - is enabled by the organisation around them. It depends on structural design, such as overlapping roles, cross-disciplinary collaboration, and built-in checks, as well as on social design - the ways people communicate, share, and challenge each other’s interpretations.
To build cognitive redundancy in practice, organisations must create conditions that allow multiple interpretations to coexist and interact. The following three practical strategies outline how this can be done: by encouraging the flow of uncomfortable information, integrating overlapping perspectives, and sustaining continuous learning. How these strategies are implemented should be adapted to each organisation’s specific needs. Vysus Group has the expertise to support your organisation in tailoring and embedding these principles in everyday practice.
Strategy 1: Encourage bad news
Goal: Create an environment where concerns, anomalies, and “bad news” can be raised openly and without fear of blame. Simply put: Bad news is good news!
When people hold back uncomfortable information, early warning signs disappear, and flawed assumptions may persist unnoticed. Encouraging bad news means valuing the signals that challenge the prevailing view. Leaders play a decisive role here: by asking “What could we be missing?” or “What would make us wrong?”, they signal that uncertainty and dissent are essential to good decision-making, not threats to it.
How to make it happen:
- Establish safe reporting channels. Make it easy to share concerns - formally through a reporting system, or informally - without fear of repercussions.
- Assign formal challenge roles. Appoint a devil’s advocate or red team in key meetings to question dominant interpretations and explore alternative explanations.
- Model openness. Leaders actively invite doubts, alternative views, and inconvenient facts.
- Acknowledge and reward speaking up. Recognise individuals or teams who surface weak signals early; this reinforces psychological safety and learning.
When bad news is openly shared, weak signals are not filtered out but examined, enabling the organisation to detect emerging problems early.
Strategy 2: Build redundant/overlapping perspectives
Goal: Strengthen decisions by incorporating multiple, independent viewpoints into both planning and operational monitoring.
Redundancy in interpretation may seem inefficient, but it is one of the most effective safeguards against collective blind spots. When different people or teams view the same situation through slightly different lenses, inconsistencies emerge early - before they become shared certainties.
How to make it happen:
- Conduct cross-functional reviews. Bring together people from different disciplines to examine the same data, plan, or event.
- Use independent verification. Peer checks and double verification of critical tasks reduce reliance on a single perspective.
- Implement parallel monitoring. Allow different teams to track operations independently and compare interpretations.
- Encourage rotation and shadowing. Job rotation and shadowing build understanding of how others perceive the same event or data, while also promoting multidisciplinary communication.
- Model curiosity. Leaders ask, “What else could this mean?” and invite dissenting views.
Overlapping perspectives create what can be called productive friction: respectful disagreement that slows premature closure, tests assumptions, and keeps the organisation alert to weak signals.
Strategy 3: Continuous learning
Goal: Treat sensemaking as an ongoing process, not a one-off activity that happens after incidents.
Organisations are constantly confronted with new situations that challenge existing assumptions. Continuous learning keeps those assumptions under review. After-action reviews, debriefs, and “lessons learned” sessions are most effective when they examine not just what happened, but also how people’s interpretations evolved during the event. This reflective focus helps teams recognise how cognitive biases influence perception and judgment.
How to make it happen:
- Use structured learning loops. Regularly review both successes and failures, emphasizing how decisions were made and interpretations formed, rather than only the results.
- Share lessons widely. Use internal briefings and cross-team knowledge sharing to ensure that learning spreads across the organisation.
- Integrate bias awareness in training. Scenario-based exercises are effective in helping teams recognize cognitive traps.
- Promote “what if…” thinking. Encourage leaders and operators to routinely ask questions that explore alternative explanations and trigger analytical thinking during training, planning, and operations.
Through continuous learning, mental models remain flexible and open to revision. This adaptability enables faster recognition when assumptions no longer fit the facts, allowing teams to adjust their understanding and their actions before small deviations turn into major failures.
Concluding remarks
Could these principles have made a difference in the HNoMS Helge Ingstad case? Hypothetically, yes. Encouraging bad news might have prompted the crew to question the unidentified object. Independent perspectives could have revealed conflicting observations, and “what if…” thinking might have uncovered alternative explanations before assumptions solidified. Together, these strategies illustrate how cognitive redundancy and a culture of inquiry help organisations detect weak signals and adapt before small uncertainties escalate into critical errors.
We recognise that putting these three strategies into practice can be challenging. Vysus Group’s expert teams are ready to provide the guidance and practical support your organisation needs to succeed.
References
Accident Investigation Board Norway (2019). Report on the collision on 8 November 2018 between the frigate HNoMS Helge Ingstad and the oil tanker Sola TS outside the Sture terminal in the Hjeltefjord in Hordaland County.
Nickerson, R.S. (1998). Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology, 2(2), 175-220.
Snook, S. A. (2000). Friendly fire: The accidental shootdown of U.S. Black Hawks over Northern Iraq. Princeton University Press.
U.S. Chemical Safety and Hazard Investigation Board (2010). Investigation Report Volume 3: Drilling Rig Explosion and Fire at the Macondo Well.
Weick, K. E. (1995). Sensemaking in organizations. Sage.
Weick, K. E. & Sutcliffe, K. M. (2015). Managing the Unexpected: Sustained Performance in a Complex World. John Wiley & Sons.
[1] Accident Investigation Board Norway, 2019
[2] Tracking requires manually placing a marker on the object shown in the radar display and pressing the ACQ (acquire) button. Tracking is also a precondition for the navigation system to generate alarms according to set limit values for the closest point of approach (CPA) and the time until CPA (TCPA).
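To illustrate what the CPA and TCPA limit values refer to, the following is a minimal sketch - not the frigate’s actual navigation software, and the function name and units are our own illustrative choices - of how CPA and TCPA can be computed for a tracked target from its position and velocity relative to own ship:

```python
import math

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach (CPA) distance and time until CPA (TCPA).

    Positions are (east, north) in nautical miles, velocities in knots,
    so TCPA comes out in hours. A negative TCPA means the closest point
    of approach already lies in the past.
    """
    # Position and velocity of the target relative to own ship
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]

    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        # No relative motion: the current distance never changes
        return math.hypot(rx, ry), 0.0

    # Time at which the relative distance is minimised
    tcpa = -(rx * vx + ry * vy) / v2
    # Distance between the vessels at that time
    cpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return cpa, tcpa
```

An alarm of the kind the footnote describes would fire when a tracked target’s CPA and TCPA both fall below the set limits (for example, hypothetical limits of 0.5 nm and 10 minutes). Because Sola TS was never acquired as a radar track, no such alarm could be generated for it.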
[3] AIS (Automatic Identification System): a maritime system that automatically transmits a vessel’s identity, position, course, and speed to nearby ships and shore stations.
[4] The Vessel Traffic Service (VTS) centres monitor and organise maritime traffic 24/7 in defined service areas along the Norwegian coast.
[5] U.S. Chemical Safety and Hazard Investigation Board, 2010
[6] Snook, 2000
[7] Weick, 1995
[8] Nickerson, 1998
[9] Weick & Sutcliffe, 2015