Toyota Unintended Acceleration and the Big Bowl of 'Spaghetti' Code (2013)
Source: Hacker News
November 7, 2013
Background
Last month, Toyota settled an Unintended Acceleration (UA) lawsuit hours after an Oklahoma jury found the automaker acted with “reckless disregard” and returned a $3 million verdict for the plaintiffs. The jury had not yet ruled on punitive damages.
The case stemmed from a September 2007 UA event involving a 2005 Toyota Camry. Jean Bookout and her passenger Barbara Schwarz were exiting Interstate 69 in Oklahoma when the throttle became uncontrollable. The service brakes failed to stop the vehicle; Bookout applied the parking brake, leaving a 150‑foot skid mark from the right rear tire and a 25‑foot skid mark from the left. The Camry continued down the ramp, struck an embankment, and Schwarz died. Bookout survived with head and back injuries after a five‑month recovery.
Attorney Graham Esdale (Beasley Allen) noted that the black skid marks were a key element of the verdict:
“Toyota just couldn’t explain those away. The skid marks showed that she was braking.”
Trial and Jury Reaction
The jury remained engaged despite the technical nature of the testimony. After learning of the settlement, jurors asked Judge Patricia Parrish if they could stay to discuss the trial. A dozen jurors, the judge, and the plaintiffs’ lawyers convened, and Esdale observed that the conversation made it clear the jury was prepared to punish Toyota for its conduct and alleged cover‑up.
Expert Testimony
Two software experts—Phillip Koopman and Michael Barr—provided detailed analysis of Toyota’s software development process and source code. Their findings highlighted numerous deficiencies, including possible bit flips, task deaths disabling failsafes, memory corruption, single‑point failures, inadequate protection against stack and buffer overflows, and the use of thousands of global variables.
Michael Barr
Barr, an embedded‑software specialist, spent over 20 months reviewing Toyota’s source code in a secure, guarded environment. He produced an 800‑page report and testified about specific code problems. Key excerpts from his testimony:
“There are a large number of functions that are overly complex. By the standard industry metrics some of them are untestable, meaning that it is so complicated a recipe that there is no way to develop a reliable test suite or test methodology to test all the possible things that can happen in it. Some of them are even so complex that they are what is called unmaintainable, which means that if you go in to fix a bug or to make a change, you’re likely to create a new bug in the process.
…
The failsafes that they have contain defects or gaps. But on the whole, the safety architecture is a house of cards. It is possible for a large percentage of the failsafes to be disabled at the same time that the throttle control is lost.”
Barr also noted that a Toyota programmer described the engine‑control application as “spaghetti‑like” in an October 2007 document.
Phillip Koopman
Koopman, a Carnegie Mellon professor and safety‑critical systems specialist, criticized Toyota’s engineering process. He explained that the industry‑wide coding standards set by the Motor Industry Software Reliability Association (MISRA) in 1995 link rule violations to software bugs: roughly three minor bugs and one major bug per 30 violations.
- NASA engineers, reviewing parts of Toyota’s code for NHTSA in 2010, found 7,134 MISRA‑C violations in the sections they could access.
- Barr’s own check against the 2004 MISRA edition uncovered 81,514 violations.
Toyota had rejected the MISRA standards in favor of its own in-house process, which overlapped little with the industry norm. Even within its own guidelines, programmers frequently broke rules without documenting the departures, although recording such deviations is standard safety-engineering practice.
Koopman emphasized that safety must be built into the development “recipe” from the start; it cannot be added later.
“You have to exercise great care when you’re doing safety‑critical software. You can’t just wing it. And Toyota exercised some care, but they did not reach the level of accepted practice in how you need to design safety‑critical systems.”
Specific Software Deficiencies
Single‑Point Failures
Koopman highlighted that Toyota allowed single‑point failures—components whose failure can render the entire system unsafe. He testified:
“If there is a single point of failure, by every safety standard I have ever seen, it is by definition unsafe, and no amount of countermeasures, no amount of failsafes will fix that. They will reduce how often it happens, but it won’t completely fix it. Because we have millions of vehicles out there, it will find a way to fail that you didn’t think of, and it will fail.”
Excessive Global Variables
The academic standard for safety‑critical code is zero global variables. Toyota’s code contained more than 10,000 global variables. Koopman remarked:
“And in practice, five, ten, okay, fine. 10,000, no, we’re done. It is not safe, and I don’t need to see all 10,000 global variables to know that that is a problem.”
Lack of Peer Review and Missing Source Checks
Both experts noted the absence of peer code reviews, and that Toyota never inspected the source code running on its second CPU, which was supplied by Denso. This was despite executive assurances to Congress and NHTSA that the engine software could not be the cause of UA.
Task Deaths and “Kitchen‑Sink” Task
Barr testified that the death of a critical task (referred to at trial as Task X) likely caused Bookout’s UA event. He described Task X as the “kitchen‑sink” task because it controlled many vehicle functions, including throttle and cruise control.
Conclusion
The testimony of Phillip Koopman and Michael Barr painted a picture of a software development process riddled with violations of established safety standards, excessive complexity, and inadequate safeguards. Their findings suggest that the root cause of the 2005 Camry’s unintended acceleration was embedded deeply in the vehicle’s software architecture, rather than isolated hardware faults.