Making critical software safer
Fail-safe: June Andronick uses mathematical analysis to create crash-proof software.
When a computer controls critical systems in vehicles and medical devices, software bugs can be disastrous: “unnecessarily risky” programs could put lives in danger, says June Andronick, a researcher at NICTA, Australia’s national IT research center. As a result of one recently discovered software vulnerability, she notes by way of example, “a car could be controlled by an attack on its stereo system.” She is trying to reduce these risks by building the most important part of an operating system—the core, or kernel—in such a way that she can prove it will never crash.
The currently favored approach to creating reliable software is to test it under as many conditions as time and imagination allow. Andronick instead is adapting a technique known as formal verification, which microchip designers use to check their designs before making an integrated circuit: they create a mathematical representation of the chip’s subsystems that can be used to prove that the chip will behave as intended for all possible inputs. Until now, formal verification had been considered impractical for large programs such as operating systems, because analyzing a representation of the program code would be too complicated. But in a computing and mathematical tour de force, Andronick and her colleagues, working in Gerwin Klein’s lab at NICTA, were able to formally verify the code that makes up most of the kernel of an operating system designed for processors often found embedded in smart phones, cars, and electronic devices such as portable medical equipment. Because this code is what ultimately passes software instructions from other parts of the system to hardware for execution, bulletproofing it has a major impact on the reliability of the entire system.
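The difference between testing and verification can be shown with a toy example. (The team’s actual proof was carried out in the Isabelle/HOL proof assistant; the sketch below uses Lean 4 for brevity, and the function and theorem names are illustrative, not part of the verified kernel.) A test suite checks a function on a handful of inputs; a machine-checked proof establishes a property for every possible input:

```lean
-- A trivial function: return the larger of two natural numbers.
def myMax (a b : Nat) : Nat := if a ≤ b then b else a

-- Testing checks particular cases, e.g. myMax 3 5 = 5.
-- Verification proves the property for ALL a and b:
theorem myMax_ge_left (a b : Nat) : a ≤ myMax a b := by
  unfold myMax; split <;> omega

theorem myMax_ge_right (a b : Nat) : b ≤ myMax a b := by
  unfold myMax; split <;> omega
```

Once the prover accepts these theorems, no input can violate them; scaling this style of argument from a two-line function to thousands of lines of kernel code is what made the project a tour de force.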
“The work is hugely significant,” says Lawrence Paulson, a computer science professor at the University of Cambridge. Beyond showing that there’s no bug in the kernel that could cause it to crash, he says, the verification guarantees that the kernel will perform, without error, every function it was programmed to perform.
The task was made a little easier by the choice to develop a so-called microkernel. Microkernels delegate as many functions as possible—such as handling input and output—to software modules outside the kernel. Consequently, they are relatively small—in this case, about 7,500 lines of C code and 600 lines of assembler. “That’s really small for a kernel, but really large for formal verification,” Andronick says. The analysis was targeted at the thousands of lines of C code; new software and mathematical tools had to be developed for the task. The kernel was released in February, and the team is working on another version designed for the popular line of x86 processor chips.
Andronick doesn’t expect that the technique will scale to much larger software, but she doesn’t think it has to. Using verified code in key subsystems would allow developers to make sure that bugs in less rigorously examined programs—such as those used to interface with a car stereo—can’t affect critical hardware. It could also prevent a computer from locking up if it encounters a problem. Andronick wants more software developers to embrace formal verification “in fields where safety and security really matter,” she says. “We show that it is possible.”