Many years ago I witnessed my employer’s brutal reaction to a very public product failure. The product, a climate control heat exchanger, was based on long-proven technology. The design had been rigorously tested. It fully complied with the prevailing company design, manufacturing and client standards. And, after pre-production testing, it had been “passed off” by the OEM customer for volume production.
And yet when a failure occurred some five years later, with much public embarrassment to the end client, my employer tracked down the lead project design engineer, by then a well-respected Division General Manager, and fired him for being “responsible” for the alleged “fault”.
Yes, my employer was embarrassed. Its customer was being sued. But none of my colleagues felt this was a fair or justifiable dismissal. Everyone knew about the firing; hardly anyone knew of the critical learning points captured in several design manual updates.
I was reminded of this tale and its outcome by the ongoing saga of the VW emissions scandal. It is widely reported that a team of engineers developed a “solution” to US emissions regulations by changing critical engine performance parameters whenever the vehicle was on an official test. The solution was undoubtedly rigorously tested before release and implementation. It beat the emissions tests and, in so doing, misled the consumer. VW is facing $17bn of fines and settlement costs.
But what about the culture that made such behaviour at VW possible? Did no one speak out? Or will we find that those who tried could not get their colleagues’ attention to make them reconsider the ethical merits of the fix?
Perhaps we could all learn lessons from aviation. The 1978 crash of United Flight 173, in which ten people lost their lives, changed the way the industry handled the human factors inherent in communication.
Flight 173, a DC-8, ran out of fuel circling Portland whilst the crew tried to solve a landing gear fault. The investigation showed that whilst the Flight Engineer regularly drew the Captain’s attention to the dwindling fuel supply, the Captain’s focus was elsewhere, on understanding and solving the landing gear issue. In the hierarchy of command prevalent back then, the Flight Engineer had been trained to be subordinate to the Captain and could not bring himself to challenge his boss directly.
This accident changed the aviation industry. In the subsequent operations bulletin, carriers were urged to “ensure that their flight crews are indoctrinated in the principles of flight deck resource management with particular emphasis on the merits of participative management for Captains and assertiveness training for other cockpit crew members”.
Aviation is a young industry. Its lessons have frequently been learnt through loss of life:
“Everything we know in aviation, every rule in every book, every procedure we have, we know because someone somewhere died. We have purchased at great cost, lessons literally bought with blood, that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them”.
Captain Sullenberger, US Airways Flight 1549
All businesses can learn from failure. And bending the rules should never be an acceptable route to success.
Management at all levels must encourage their reports and associates to push back and probe whenever they feel something is not quite right.
Easier said than done, but had my employer focused on the learning points, I would have remembered those rather than just my colleague’s shock termination.