Securing the Hardware

We talk a lot, with good reason, about seemingly ubiquitous software security threats, whether from malicious code, like viruses and worms, or from mistakes and flaws in legitimate software.  The “Danger Room” blog at Wired has an article about a potentially more severe and insidious problem: hacked computer hardware.

Last week, retired Gen. Michael Hayden, the former CIA and NSA chief, called the hazard of hacked hardware “the problem from hell.”

This is not an entirely new concern.  The US Defense Department is a very large purchaser of digital electronics, to supply its “smart” weapons, pilotless aircraft, and “stealth” technologies.  Most of these components are manufactured outside the US, which has been a source of concern for some time.  (I wrote about “Trojan Horse 2.0” back in 2009.)  The Defense Advanced Research Projects Agency [DARPA] has been working on protecting against this threat.

Over the past two months, Darpa has awarded nine contracts totaling $49 million for its Integrity and Reliability of Integrated Circuits (IRIS) program to check for compromised chips.

A key aim of the IRIS program is to develop techniques for reverse-engineering components to discover everything they can do, not just what is in the specification.  This is a challenging problem that becomes more difficult as the complexity of the components, and the systems that use them, increases.  (The Center for Technology Innovation at the Brookings Institution has published a report, Ensuring Hardware Cybersecurity [PDF], discussing some of the issues involved.)
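To see why discovering “everything they can do” is such a tall order, consider a toy model in C.  The “device” below is supposed to be a 16-bit adder but misbehaves on one planted trigger input; the device, the trigger values, and the test are all invented here purely for illustration, and have nothing to do with the actual IRIS work.

#include <stdint.h>
#include <stdio.h>

/* Hypothetical part: specified as a 16-bit adder. */
static uint16_t device(uint16_t a, uint16_t b)
{
    if (a == 0xCAFE && b == 0xBEEF)   /* planted trigger */
        return 0;                     /* hidden misbehavior */
    return a + b;                     /* specified behavior */
}

int main(void)
{
    /* Exhaustive black-box check against the spec: 2^32 cases
     * for two 16-bit inputs -- already about 4 billion tests. */
    for (uint32_t a = 0; a <= 0xFFFF; a++)
        for (uint32_t b = 0; b <= 0xFFFF; b++)
            if (device((uint16_t)a, (uint16_t)b) != (uint16_t)(a + b))
                printf("spec violation at a=%04x b=%04x\n",
                       (unsigned)a, (unsigned)b);
    return 0;
}

Even this trivial two-input, stateless part takes billions of tests to sweep exhaustively.  Real chips have hundreds of pins and internal state, so their behavior can depend on input history, and a backdoor keyed to a rare trigger sequence will never be hit by any practical black-box test.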

Software flaws are difficult to find, and hardware defects are even more elusive.  In general, the lower the level of a bug (the closer to the “bare metal”), the harder it is to detect, a point Ken Thompson made many years ago in “Reflections on Trusting Trust”, his 1983 Turing Award lecture to the Association for Computing Machinery:

No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.
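To make the mechanism concrete, here is a deliberately simplified sketch in C (the language Thompson picked on).  The “compiler” below is just a pass-through text filter with two pattern-matched injections; every name and pattern in it is invented for illustration, and where a real attack would carry a quine-style, self-reproducing copy of its own injection logic, a comment stands in.

#include <stdio.h>
#include <string.h>

/* "Compile" one line of source: really just echo it,
 * plus Thompson's two injections. */
static void compile_line(const char *line, FILE *out)
{
    fputs(line, out);

    /* Injection 1: when compiling the (hypothetical) login code,
     * add a backdoor.  It never appears in login's source, so
     * reviewing that source finds nothing. */
    if (strstr(line, "int check_password("))
        fputs("if (strcmp(pw, \"backdoor\") == 0) return 1;\n", out);

    /* Injection 2: when compiling the compiler itself, re-insert
     * the injection logic, so the attack survives even after the
     * compiler's own source is cleaned and recompiled. */
    if (strstr(line, "static void compile_line("))
        fputs("/* quine-style copy of both injections goes here */\n", out);
}

int main(int argc, char **argv)
{
    FILE *in  = (argc > 1) ? fopen(argv[1], "r") : stdin;
    FILE *out = (argc > 2) ? fopen(argv[2], "w") : stdout;
    if (!in || !out) { perror("fopen"); return 1; }

    char line[1024];
    while (fgets(line, sizeof line, in))
        compile_line(line, out);
    return 0;
}

Once a binary of such a compiler exists, the malicious source can be deleted: every compiler subsequently built with it inherits the trap.  That is why Thompson says source-level scrutiny cannot save you, and why the same argument, one level down, applies to microcode and silicon.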

The risk does not just apply to military equipment, of course.  Equipment used in industrial control systems (like those attacked by the Stuxnet worm) is potentially just as vulnerable, and in general subject to much less scrutiny.
