Why smart organizations ignore warning signs and how they get away with it
When Paolo Macchiarini was recruited to Karolinska Institutet in 2010, the excitement was palpable. He was a surgeon who could make medical history; his arrival promised a potential Nobel Prize for the institution, fame for those around him, and, above all, hope for patients with no other options.
What followed over the next five years was far darker. Repeated warnings about his clinical and scientific malpractice were raised, escalated, and systematically ignored. Patients suffered, and some even died. An SVT documentary eventually brought the story to light, and Macchiarini was dismissed in 2016.
The question Essén and her co-author, Mats Alvesson, set out to answer was not simply what went wrong, but how this could continue for so long. Five levels of managers at Karolinska received warnings. None acted on them in any meaningful way. How is that possible?
The art of ignoring
Essén’s research project, which she calls the “art of ignoring,” starts from a simple observation: some problems that should receive attention do not. It’s not because people are lazy, incompetent, or uninformed, but because they actively avoid engaging with those problems. In her framing, ignoring is a form of work. Organizations construct reasons not to know.
In the Macchiarini case, that work took a specific and revealing shape. The study identified two complementary moves that managers made, often simultaneously, whenever warnings arrived.
First, they made certain things seem very clear: Macchiarini was a genius, and the critics were jealous or unable to understand his methods.
Second, they made other things seem impossibly complex. They claimed the science was too advanced to judge, the consequences of investigating were too unpredictable, and responsibility was too diffuse to assign to any one person.
Essén calls this pairing “ambiguity juggling.” It’s the simultaneous reduction of ambiguity in some areas and amplification of it in others. Together, these moves created a justification for inaction that felt reasonable to those involved, even as patients continued to deteriorate.
From informal warnings to formal alarms
One of the study’s key insights is that ignoring is not a stable state, but one that must be actively maintained and updated as pressure to respond increases. In the early phase, when warnings were informal and came from people with relatively little authority, they were easy to dismiss. Managers could wave concerns away with vague references to the resistance that pioneers often face.
When the warnings escalated, with formal reports, external reviewers who found misconduct, and regulatory agencies that filed police reports, the earlier justifications no longer held.
The genius narrative then gave way to a more procedural defense: each warning was assigned to whichever formal category seemed least threatening, delegated to an ethics committee, for example, or reframed as a philosophical debate.
No one felt responsible for the whole picture because it had been broken into pieces too small for any single person to own.
A complex environment that rewards innovation
The authors are careful not to reduce the story to simple villainy. A hospital's environment is genuinely complex, and the science Macchiarini pursued was at the frontier of what was knowable. In these cases, caution is rarely rewarded in the same way as innovation. And investigations require time and resources.
Many of those who remained silent were not cynics. They had convinced themselves, through a process Essén describes as both reactive and strategic, that looking away was the sensible course of action.
“The ignoring work doesn’t always feel like ignoring,” Essén explained. “It feels like pragmatism, like trusting the right people, or like not overstepping one’s role.”
By reaching for the vocabulary of responsibility, such as “that’s not my area,” “we can’t know for certain,” or “he is a world-leading expert,” a manager could avoid the issue just that little while longer.
What this means for organizations
Essén and Alvesson’s research does not offer a simple checklist for preventing future cases like Macchiarini’s. It does, however, provide a vocabulary (ambiguity juggling, sustained ignoring work, organized ignorance) that helps explain what is happening when warning signs are dismissed, which could be a first step toward disrupting it.
For students and practitioners, the lesson may be this: when someone in an organization says a concern is “too complex to judge,” or that the person raising it “doesn’t understand,” it is worth paying attention.
Not every uncomfortable warning signals a crisis. But the mechanisms for dismissing them tend to look the same, whether the warning matters or not.
Read the full study here: "Juggling Ambiguity in Sustained Ignoring Work: The persistent dismissal of warnings at a university hospital."