Homeland Insecurity

September 11, 2010

Since it was established in 2002, partly in reaction to the attacks of September 11, 2001, the US Department of Homeland Security has absorbed the functions of many previously distinct government agencies.  One of its roles is acting as lead agency in protecting the government’s civilian IT infrastructure, and coordinating cybersecurity efforts with the private sector.  This job is the responsibility of the National Cyber Security Division [NCSD] within DHS; its operational unit is US-CERT, the US Computer Emergency Readiness Team.

According to an article in Wired this week, a report just released by the DHS’s Inspector General says that its recent security audit of US-CERT revealed some serious deficiencies in the agency’s own security implementation.  (There is also an article in Government Computer News about this audit.)  Some of the systems used by US-CERT, including its public Web site, its Web portal, and its intrusion detection system, called Einstein, received high marks for security.  However, the audit found that the core system, the Mission Operating Environment [MOE], used by US-CERT for the storage, analysis, and dissemination of vulnerability data, had some serious deficiencies.  The auditors performed a security scan of the network, using the Nessus security scanner.  They found over 1000 instances of 202 different high-risk security vulnerabilities, and several hundred lower-risk flaws.  Most of these problems resulted from systems that were missing security patches, or had patches incorrectly applied; most of the patch problems were related to applications, such as Adobe Reader and Acrobat, Microsoft Office, and Oracle/Sun’s Java.
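To make the auditors’ numbers concrete, here is a toy sketch of the kind of tally they would have produced: counting finding instances and distinct high-risk issues in a Nessus “.nessus” (v2 XML) export.  The sample data and plugin names below are invented for illustration, not taken from the actual audit; a real export simply has one ReportItem element per finding, with a numeric severity attribute and a pluginID.

```python
# Sketch: tallying findings from a Nessus ".nessus" (v2 XML) export.
# The sample report below is entirely made up; in Nessus's scale,
# severity 3 and above is "high risk".
import xml.etree.ElementTree as ET
from collections import Counter

SAMPLE = """\
<NessusClientData_v2>
  <Report name="demo">
    <ReportHost name="10.0.0.5">
      <ReportItem pluginID="35821" severity="3" pluginName="Adobe Reader Multiple Vulnerabilities"/>
      <ReportItem pluginID="35821" severity="3" pluginName="Adobe Reader Multiple Vulnerabilities"/>
      <ReportItem pluginID="40887" severity="2" pluginName="Sun Java JRE Multiple Vulnerabilities"/>
    </ReportHost>
    <ReportHost name="10.0.0.6">
      <ReportItem pluginID="35821" severity="3" pluginName="Adobe Reader Multiple Vulnerabilities"/>
    </ReportHost>
  </Report>
</NessusClientData_v2>
"""

def tally_findings(nessus_xml):
    """Return (high-risk instances, distinct high-risk issues, lower-risk instances)."""
    root = ET.fromstring(nessus_xml)
    high = Counter()   # pluginID -> number of instances with severity >= 3
    lower = 0          # findings with severity 1 or 2
    for item in root.iter("ReportItem"):
        severity = int(item.get("severity", "0"))
        if severity >= 3:
            high[item.get("pluginID")] += 1
        elif severity >= 1:
            lower += 1
    return sum(high.values()), len(high), lower

print(tally_findings(SAMPLE))  # (3, 1, 1)
```

The same distinction the report draws — over 1000 *instances* of 202 *different* vulnerabilities — falls out of counting instances per pluginID versus counting distinct pluginIDs.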

The report sums up the needed fixes this way:

To ensure the confidentiality, integrity, and availability of its cybersecurity information, NCSD needs to focus on deploying timely system-security patches to mitigate risks to its cybersecurity program systems, finalizing system security documentation, and ensuring adherence to departmental security policies and procedures.

The complete report [PDF, 35 pages] is available for download.

This situation will probably not come as a complete surprise to security professionals elsewhere.  Anyone who has had the job of keeping security patches up to date on even a moderate-sized network knows what a thankless task it is.  Even with the more structured patch process now used by Microsoft and other vendors, obtaining, testing, and scheduling the installation of patches is a difficult problem.  Bruce Schneier has a good essay on his “Schneier on Security” blog about Microsoft’s patch schedule, in which he writes:

Patching is essentially an impossible problem. A patch needs to be incredibly well-tested. It has to work, without tweaking, on every configuration of the software out there. And for security reasons, it needs to be pushed out to users within days — hours, if possible.

One fairly predictable type of response to the DHS report will be to poke fun at those dumb government computer guys who can’t get anything right.  I have worked in a few large, bureaucratic organizations (in the private sector, by the way), and I know that there will be some truth in the descriptions of buck-passing and generally sclerotic decision making.  But we shouldn’t lose sight of the larger point: the way we deal with software security today, while better than it used to be, is still fundamentally broken.  It relies on a reactive, patch-after-the-fact model: the horse will usually have been stolen before anyone fixes the lock on the stable door.
