I’m sure most readers are familiar with the story of the Trojan Horse, told most notably in Virgil’s Latin epic, The Aeneid, in which the Greeks used a clever trick (a giant horse with soldiers hidden inside) to take the city of Troy. In today’s computing world, a Trojan Horse is a malicious program disguised as something useful.
The New York Times has an interesting article on what may be another, even more dangerous, contemporary manifestation of the Trojan Horse. Most of us have seen some of the graphic videos, first shown widely during the first Gulf War, of “precision” munitions (smart bombs, for example) being used to attack targets selectively; and we have heard about “stealth” aircraft and other technological gizmos.
All of these sensors and weapons are made possible by technology, of course, and in particular by the same advances in solid-state electronics (Moore’s Law, and all that) that have given us smart phones, portable GPS receivers, and netbooks. However, if you take a close look at one of those consumer products “under the covers”, you’ll see that not very much of it was made here in the USA; and the same is true for many defense systems:
Despite a six-year effort to build trusted computer chips for military systems, the Pentagon now manufactures in secure facilities run by American companies only about 2 percent of the more than $3.5 billion of integrated circuits bought annually for use in military gear.
We talk a lot, for good reason, about the security threats posed by malicious computer software; but malicious hardware components, although probably beyond the ability of the average computer Bad Guy to make, can be just as dangerous, and in general are harder to detect. According to the article, those who have studied the problem do not regard the threat as purely hypothetical:
Counterfeit computer hardware, largely manufactured in Asian factories, is viewed as a significant problem by private corporations and military planners. A recent White House review noted that there had been several “unambiguous, deliberate subversions” of computer hardware.
There have been at least a couple of incidents in which the US, or one of its allies, is alleged or suspected to have used “booby-trapped” hardware or software.
In 2004, Thomas C. Reed, an Air Force secretary in the Reagan administration, wrote that the United States had successfully inserted a software Trojan horse into computing equipment that the Soviet Union had bought from Canadian suppliers. Used to control a Trans-Siberian gas pipeline, the doctored software failed, leading to a spectacular explosion in 1982.
As the equipment used becomes more and more complex, it becomes correspondingly harder to verify that it works as it should, and only as it should. The general problem was described in Ken Thompson’s address, “Reflections on Trusting Trust”, given at his acceptance of the Turing Award from the Association for Computing Machinery in 1983:
No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect. [emphasis added]
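To make the idea concrete, here is a deliberately simplified sketch of the two-part trick Thompson’s lecture describes. It is a toy in Python, nothing like his actual C code, and every name in it (including the “letmein” master password) is made up for illustration: a subverted compiler plants a backdoor whenever it compiles a password check, and re-plants its own subversion whenever it compiles a compiler, so that reading the compiler’s source code tells you nothing.

```python
# Toy illustration of the "Trusting Trust" attack -- NOT Thompson's real code.
# Here "compiling" is just a text-to-text transformation, which is enough to
# show the two-part trick.

BACKDOOR = 'if password == "letmein": return True  # planted backdoor\n    '

def subverted_compile(source: str) -> str:
    """Pretend to compile `source`, quietly subverting certain programs."""
    if "def check_password(" in source:
        # Part 1: when compiling a program that checks passwords, insert a
        # secret master password. The program's own source stays clean.
        return source.replace(
            "def check_password(password):\n    ",
            "def check_password(password):\n    " + BACKDOOR,
        )
    if "def subverted_compile(" in source or "def compile(" in source:
        # Part 2: when compiling a compiler, emit a compiler that contains
        # this same subversion, even if the source being compiled is clean.
        # (Sketched only: a real attack would splice the trojan back in here.)
        return source
    return source

if __name__ == "__main__":
    clean_login = (
        "def check_password(password):\n"
        "    return password == stored_password\n"
    )
    print(subverted_compile(clean_login))
    # The output contains the backdoor, though clean_login's source never did:
    #   def check_password(password):
    #       if password == "letmein": return True  # planted backdoor
    #       return password == stored_password
```

The point of the lecture, of course, is that Part 2 means the malicious logic need never appear in any source you can read: it lives only in the compiled compiler, and the same trick works one level further down, in microcode or in the silicon itself, where it is even harder to find.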
All of this is fairly worrying. There is no reason that I know of to expect that the hardware design process will be any less subject to security problems than the software design process; when you add the possibility of malevolent actors into the mix, the resulting picture is far from reassuring.