Irrespective of party affiliation, I’m willing to bet that most Americans are irked, at the very least, by the failure of the ACA website to launch effectively after the government invested hundreds of millions of dollars to create it. In the weeks and months ahead, many software autopsies will be performed (excuse the healthcare metaphor) to find out how this portal could have been launched without being properly structured and thoroughly tested.
What caused this to happen? Many are examining this deficiency as a technical failure. Surely, engineers will uncover and identify design, coordination, and testing flaws. But I suspect that, ultimately, we’ll learn a key defect also involved an all-too-familiar workplace cultural and organizational process glitch: a human bug, so to speak.
It’s hard to believe that no one raised warnings about obvious design and operability concerns. The problems were too profound and widespread for that not to have happened. My guess is that multiple developmental flaws were evident to many involved in the project, and that they knew a site failure lurked on the horizon. But they either did not speak up, or perhaps their voices were not heard. Does this sound familiar? It should. We rely on humans to sound alarms, and this is the same defective mechanism that failed to prevent disasters in nuclear operations (Fukushima), space shuttle flights (Columbia and Challenger), offshore drilling (Deepwater Horizon), and financial services (Madoff and the broad collapse of the subprime lending market).
Unfortunately, these are only relatively recent entries in a long list of human “stiflings” that spans our history. In each, the pattern is the same. It works like this: people recognize serious problems before disaster strikes, but one way or another their concerns do not lead to constructive action. As we continue to learn about the ACA website development, my guess is we’ll find that: (1) there was great pressure to launch on a deadline; (2) people believed the greater risk to the project lay in missing the launch date rather than in addressing underlying issues; (3) certain individuals in powerful positions did not want to hear about problems involving their work, their responsibilities, or their ability to deliver on time; and (4) many “insiders” were not surprised the launch proved as problematic as it did, and among themselves had predicted problems on the same scale as the ones we witnessed.
We’ll likely learn that the individuals who raised concerns about the website design and workability encountered a variety of responses, including:
- being ignored entirely, with no one investigating their concerns.
- having their issues receive only minimal attention.
- receiving the message, either through direct comments or subsequent treatment, that they’d be removed from the project if they kept complaining.
- being removed from the project or experiencing negative job actions following their complaints.
Additionally, with a project of this scope, there are likely many who knew there were major problems but chose to keep quiet rather than point out the obvious flaws they witnessed.
It’s a common human reaction to resist concerns, particularly in the middle of high-stakes situations. This occurs in large public and private organizations as well as in smaller enterprises.
If we want to ferret out the bugs that cause avoidable blunders, we need to encourage, rely upon, and listen to the individuals who raise concerns. And this cultural fix requires as much time, attention, inspection, and ongoing review as building a complex website. The payoff, though, is fewer of the disasters that, in retrospect, could have been avoided or minimized. Whether we’re talking about events today or in a distant future when technology will presumably be even more powerful, that is what it will take to prevent them.