Mindfulness-based leadership is an increasingly sought-after approach that draws on high levels of emotional intelligence, focus and attention, professional engagement and effective decision-making, culminating in greater leadership effectiveness, joy and overall job and life satisfaction.
Last week, a close friend’s father passed away in his early seventies from the complications of a stroke. The family had to make the difficult decision to shut off life support. The doctors had assured them that recovery and rehabilitation were highly improbable.
Yet, even as the machines were disconnected, I could sense my friend's lingering uncertainty and doubt. As humans, we simply do not always know what to do.
When we face uncertainty about our next steps because the information we need to gain clarity is unavailable or incomplete, neuroscience suggests that we default to an interesting yet often unexamined process: we proceed by relying on our intuition.
Intuition can best be described as an act of cognitive "short-circuiting": arriving at a decision while the reasons for that decision remain unclear or cannot easily be articulated. In some ways, it's familiar territory for healthcare providers, who use it to quickly form a working hypothesis early in the diagnostic process.
Intuition is a double-edged sword. A strong, well-developed sense of it helps us move forward in situations where the time to formulate an analytical decision is neither available nor practical. On the other hand, studies suggest that intuition can lead to an overreliance on heuristics (mental shortcuts) and biases (pre-selected responses) that can cloud our decision-making. (For examples of how these look in a diagnostic context, see David Birdsall's post, "High Reliability Organizations: Cognitive Pause to Improve Patient Safety.")
Put more simply, when working from incomplete information, we can be influenced to do what feels right rather than doing what is right.
For healthcare providers, the implications are real. The influence of heuristics and biases can lead to less-than-optimal medical decision-making. In one common scenario, second-guessing (fueled by the ever-present threat of litigation) taxes the healthcare system with expensive, unnecessary diagnostics, procedures and prescriptions. And occasionally, unbridled intuition can place the health and well-being of patients at risk.
Healthcare training programs should be encouraged to explore the uses and limitations of heuristics. Practicing providers would likewise benefit from knowing the common "cognitive shortcuts" clinicians tend to fall into, so they can better recognize these biases in themselves.
On the other hand, awareness doesn’t equal implementation, especially when physicians operate in environments that foster bounded rationality.
Research in cognitive psychology and behavioral economics strongly suggests that people are more prone to errors in judgment under constrained conditions. Time limitations, information overload and unfamiliar or unpredictable dynamics in the workplace are just some of the factors that can give rise to cognitive biases. Under these challenging circumstances, heuristics and biases "feel" right because they lighten our cognitive load.
So given these realities, what can we do as leaders to mitigate our own thinking errors? Here are some practical ways to reduce your own exposure to limiting heuristics and unconscious biases in your healthcare leadership:
Develop an awareness practice. Mindfulness practices like meditation, sensory awareness and emotional intelligence have been shown to alleviate cognitive load and return the mind to a state of clarity and lucidity. For practical suggestions on getting started, see a previous post: "How Mindfulness Can Improve Doctor-Patient Relationships."
Have a strong sounding board. Two pairs of eyes see more than one. When in doubt, consult a colleague or a more experienced professional. There is always more to learn, and your patients, clients and employees deserve your curiosity and humility.
Keep a beginner's mind. No two situations are ever exactly the same. As the British philosopher Alan Watts once said: "If you insist that your present is best informed by your past, you are like a person driving your car down the road looking always in the rear-view mirror to know where you are going."
In addition, healthcare research has identified a large number of cognitive "debiasing" strategies that can help physicians critically evaluate their own decision-making. A great example is the cognitive pause, discussed on this blog a few weeks ago.
None of these strategies is difficult to implement if providers keep in mind that their capacity for clarity and accuracy has inherent limitations. Of course, this is easier said than done. Some professional cultures tend to embrace damaging infallibility myths. In addition, the current breakneck pace of change in this industry naturally breeds resistance, including perceived threats and defensive reactions.
While recognizing our own fallibility as clinicians and leaders can be humbling, it's essential to patient safety. The more aware we are of our biases and thinking errors, the better equipped we will be to mitigate them through protective strategies.
In the end, my friend trusted his understanding and intuition and allowed his father to be removed from life support. Uncertainty made it a difficult decision, but his father passed peacefully within hours. In the medical and scientific sense, there are no absolutes, yet the balance of the evidence suggested that the possibility of recovery was extremely remote. His loving act honored his father's dignity and was, as all family members attested, what his father would have chosen for himself.
[Image credit: "The Thinker Musee Rodin" by inoxiuss licensed under CC BY 2.0]