What Was Volkswagen Thinking?

On the origins of corporate evil—and idiocy

One day in 1979, James Burke, the chief executive of Johnson & Johnson, summoned more than 20 of his key people into a room, jabbed his finger at an internal document, and proposed destroying it.

The document was hardly incriminating. Entitled “Our Credo,” its plainspoken list of principles—including a higher duty to “mothers, and all others who use our products”—had been a fixture on company walls since 1943. But Burke was worried that managers had come to regard it as something like the Magna Carta: an important historical document, but hardly a tool for modern decision making. “If we’re not going to live by it, let’s tear it off the wall,” Burke told the group, using the weight of his office to force a debate. And that is what he got: a room full of managers debating the role of moral duties in daily business, and then choosing to resuscitate the credo as a living document.

Three years later, after reports emerged of a deadly poisoning of Tylenol capsules in Chicago-area stores, Johnson & Johnson’s reaction became the gold standard of corporate crisis response. But the company’s swift decisions—to remove every bottle of Tylenol capsules from store shelves nationwide, publicly warn people not to consume its product, and take a $100 million loss—weren’t really decisions. They flowed more or less automatically from the signal sent three years earlier. Burke, in fact, was on a plane when news of the poisoning broke. By the time he landed, employees were already ordering Tylenol off store shelves.

On the face of it, you’d be hard-pressed to find an episode less relevant to the emissions-cheating scandal at Volkswagen—a company that, by contrast, seems intent on poisoning its own product, name, and future. But although the details behind VW’s installation of “defeat devices” in its vehicles are only beginning to trickle out, the decision process is very likely to resemble a bizarro version of Johnson & Johnson’s, with opposite choices every step of the way.

The sociologist Diane Vaughan coined the phrase the normalization of deviance to describe a cultural drift in which circumstances classified as “not okay” are slowly reclassified as “okay.” In the case of the Challenger space-shuttle disaster—the subject of a landmark study by Vaughan—damage to the crucial O‑rings had been observed after previous shuttle launches. Each observed instance of damage, she found, was followed by a sequence “in which the technical deviation of the [O‑rings] from performance predictions was redefined as an acceptable risk.” Repeated over time, this behavior became routinized into what organizational psychologists call a “script.” Engineers and managers “developed a definition of the situation that allowed them to carry on as if nothing was wrong.” To clarify: They were not merely acting as if nothing was wrong. They believed it, bringing to mind Orwell’s concept of doublethink, the method by which a bureaucracy conceals evil not only from the public but from itself.

If that comparison sounds overwrought, consider the words of Denny Gioia, a management professor at Penn State who, in the early 1970s, was the coordinator of product recalls at Ford. At the time, the Ford Pinto was showing a tendency to explode when hit from behind, incinerating passengers. Twice, Gioia and his team elected not to recall the car—a fact that, when revealed to his M.B.A. students, goes off like a bomb. “Before I went to Ford I would have argued strongly that Ford had an ethical obligation to recall,” he wrote in the Journal of Business Ethics some 17 years after he’d left the company. “I now argue and teach that Ford had an ethical obligation to recall. But, while I was there, I perceived no strong obligation to recall and I remember no strong ethical overtones to the case whatsoever.”

What, Gioia the professor belatedly asked, had Gioia the auto executive been thinking? The best answer, he concluded, is that he hadn’t been. Executives are bombarded with information. To ease the cognitive load, they rely on a set of unwritten scripts imported from the organization around them. You could even define corporate culture as a collection of scripts. Scripts are undoubtedly efficient. Managers don’t have to muddle through each new problem afresh, Gioia wrote, because “the mode of handling such problems has already been worked out in advance.” But therein lies the danger. Scripts can be flawed, and grow more so over time, yet they discourage active analysis. Based on the information Gioia had at the time, the Pinto didn’t fit the criteria for recall that his team had already agreed upon (a clearly documentable pattern of failure of a specific part). No further thought necessary.

Sometimes a jarring piece of evidence does intrude, forcing a conscious reassessment. For Gioia, it was the moment he saw the charred hulk of a Pinto at a company depot known internally as “The Chamber of Horrors.” The revulsion it evoked gave him pause. He called a meeting. But nothing changed. “After the usual round of discussion about criteria and justification for recall, everyone voted against recommending recall—including me.”

The most troubling thing, says Vaughan, is the way scripts “expand like an elastic waistband” to accommodate more and more divergence. Morton Thiokol, the NASA contractor charged with engineering the O-rings, requested a teleconference on the eve of the fatal Challenger launch. After a previous launch, its engineers had noticed O-ring damage that looked different from damage they’d seen before. Suspecting that cold was a factor, the engineers saw the near-freezing forecast and made a “no launch” recommendation—something they had never done before. But the data they faxed to NASA to buttress their case were the same data they had earlier used to argue that the space shuttle was safe to fly. NASA pounced on the inconsistency. Embarrassed and unable to overturn the script they themselves had built in the preceding years, Morton Thiokol’s brass buckled. The “no launch” recommendation was reversed to “launch.”

“It’s like losing your virginity,” a NASA teleconference participant later told Vaughan. “Once you’ve done it, you can’t go back.” If you try, you face a credibility spiral: Were you lying then or are you lying now?

But back to Volkswagen. You cannot unconsciously install a “defeat device” into hundreds of thousands of cars. You need to be sneaky, and thus deliberate. To understand that behavior, we have to turn to a more select subset of examples, such as the Air Force brake scandal of 1968, when B. F. Goodrich built an aircraft brake that many employees knew would fail. When it was tested at Edwards Air Force Base, the brake melted. As in, became molten.

Like Volkswagen’s actions, this would seem an act of madness, pure and simple. (“It’s almost like they painted a bull’s-eye on themselves,” Joseph Badaracco, an ethics professor at Harvard Business School, says of VW.) But the final decision to deceive was, on an individual level, rational—the logical end to a long sequence.

It started, as Volkswagen’s problems apparently did, with a promise that should not have been made. Goodrich, which was desperate to regain an Air Force contractor’s favor as a supplier after a previous delivery of shoddy brakes, promised a brake that was ultracheap and ultralight. Too light, in fact. When first tried out in a simulation at the company’s test lab, the prototype glowed cherry red and spewed incendiary bits of metal. But by the time a young engineer discovered that the source of the problem was the design itself (a more senior engineer had gotten his math wrong), the wheels were quite literally in motion. Brake components from other suppliers were arriving. The required redesign would wreck the promised timetable. The young engineer was told to keep testing.

The brake kept failing. During the 13th set of tests, an all-out effort was made to nurse the brake through the required 50 simulated landings. According to Kermit Vandivier, a data analyst at the test lab who later testified at a Senate hearing, fans were brought in for cooling. Warped components were machined back into shape between stops. Test instrumentation was deliberately miscalibrated. But even these cheats weren’t enough. In one simulation, the wheel rolled some three miles before coming to a stop. Nevertheless, Vandivier and several colleagues were told to prepare a report showing that the brake qualified.

“The only question left for me to decide,” Vandivier later wrote, “was whether or not I would become a party to the fraud.” Refusal would mean losing his job. He’d be 42, with seven children, a new house, and a clear conscience. “But,” he wrote, “bills aren’t paid with personal satisfaction, nor house payments with ethical principles.” He spent nearly a month crafting the falsified report. (The Air Force eventually asked to see the raw test data. Vandivier resigned and became a newspaper reporter.)

This sequence of events fits a pattern that appears and reappears in corporate-misconduct cases, beginning with the fantastic commitments made from on high. NASA officials had promised a “routine and economical” shuttle program that would launch 60 times a year—a target that proved hopelessly optimistic. Ford’s president, Lee Iacocca, had wanted a car weighing no more than 2,000 pounds and costing no more than $2,000 to be ready for production in 25 months. To hasten the process, production equipment was developed concurrently with the car itself; repositioning the Pinto’s gas tank would have required a redesign of the factory equipment, too. All of which placed personnel in a position of extreme strain.

We know what strain does to people. Even without it, they tend to underestimate the probability of future bad events. Put them under emotional stress, some research suggests, and this tendency gets amplified. People will favor decisions that preempt short-term social discomfort even at the cost of heightened long-term risk. Faced with the immediate certainty of a boss’s wrath or the distant possibility of blowback from a faceless agency, many will focus mostly on the former.

This reaction isn’t excusable. But it is predictable. What James Burke, Johnson & Johnson’s CEO, did was anticipate the possible results of these pressures well before they built up. He shared Henry James’s “imagination of disaster,” and that is why he introduced, if you will, a set of counterscripts: a conscious effort to tinker with the unconscious criteria by which decisions at his company were made. The result was an incremental descent into integrity, a slide toward soundness, and the normalization of referencing “Our Credo” in situations that might otherwise have seemed devoid of ethical content.

What we know of Ferdinand Piëch, Volkswagen’s chairman before the scandal, is that he was no James Burke. At a 2008 corruption trial that sent one VW executive to jail, Piëch referred to alleged widespread use of VW funds on prostitutes as mere “irregularities,” and chided a lawyer for mispronouncing Lamborghini. (“Those who can’t afford one should say it properly” were his precise words.) This was around the time the emissions cheating began.

“Culture starts at the top,” a businessman recently said in an interview with the Association of Certified Fraud Examiners. “But it doesn’t start at the top with pretty statements. Employees will see through empty rhetoric and will emulate the nature of top-management decision making … A robust ‘code of conduct’ can be emasculated by one action of the CEO or CFO.” The speaker was Andrew Fastow, the former CFO of Enron, who spent more than five years in federal prison. He got one thing right: Decisions may be the product of culture. But culture is the product of decisions.

Jerry Useem is a contributing writer at The Atlantic and has covered business and economics for The New York Times, Fortune, and other publications.