Fixing Forensic Science

February 18, 2013

If you are a fan of television shows like CSI or NCIS, you know that, at least in that world, forensic science always produces conclusive evidence that helps catch the bad guys.  The reality, as is so often and tediously the case, is a bit messier.  Many of the forensic techniques that are used were developed originally to aid investigation; collecting rigorous evidence of their validity was a distinctly secondary concern.  Many crime labs are controlled by law enforcement agencies, hardly a motivating force for impartial science.  I’ve written here before about some of the problems with fingerprint evidence, with biometrics in general, and even with DNA evidence, regarded in both the TV and real worlds as the “gold standard” of forensic science.

Many of these problems stem from two basic causes:

  • The validity of the evidence in question ultimately rests on a statistical analysis; that is, we may be able to say that the odds are 100 to 1 against an unrelated person matching a given DNA sample.  That underlying analysis is sometimes not as sound as it should be, and is often not disclosed completely.  It should be obvious that it is no more possible to prove that fingerprints are unique than it is to prove that no two snowflakes are alike.
  • Even if the basic analysis is sound, the evidence has to be collected and analyzed by people.  Often, what is collected is imperfect; smeared or partial fingerprints from a crime scene are not as easily classified as the illustrations in the textbooks.  Ordinary blunders can occur, too: evidence may be contaminated, mislabeled, or lost.
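The first point is worth making concrete, because match statistics are easy to misread (the so-called prosecutor's fallacy).  A toy calculation, using the hypothetical "100 to 1" figure above and a made-up population size, shows why a match probability is not the same as the probability of guilt:

```python
# Toy illustration of why a match statistic can mislead.
# Both numbers below are hypothetical, chosen only for the arithmetic.
random_match_prob = 1 / 100   # chance an unrelated person matches
population = 100_000          # plausible pool of alternative sources

# Expected number of innocent people who would also match:
expected_coincidental_matches = random_match_prob * population  # 1000.0

# If the suspect was identified only because they matched, the chance
# they are the true source is roughly 1 in (matches + 1):
prob_source = 1 / (expected_coincidental_matches + 1)
print(expected_coincidental_matches)  # 1000.0
print(round(prob_source, 6))          # 0.000999
```

A "1 in 100" match sounds damning, but in a pool of 100,000 people it is expected to implicate about a thousand innocent matches; presenting the raw statistic without that context is exactly the kind of disclosure problem mentioned above.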

Though some suggestions have been made for improving the underlying statistical analysis (as I mentioned in some of those earlier posts), progress on them has been incomplete at best.  In any case, the propensity of people to make mistakes is not likely to disappear.

Thus I think it is good news that, as the Washington Post reported in an article this weekend, the federal government will set up a new National Commission on Forensic Science to guide improvements in forensic science practice, with technical assistance provided by the National Institute of Standards and Technology (NIST).

The new 30-member commission will be co-chaired by Justice Department and NIST officials. It will include forensic scientists, researchers, prosecutors, defense attorneys and judges, and will meet several times a year as a federal advisory committee subject to open government requirements.

The initiative may also lead to replacement or reorganization of some of the ad hoc groups of practitioners that act as informal governing bodies for forensic work.

This step is one that should be welcomed by anyone who wants the criminal justice system to be as fair as possible.  Back in 2009, the National Research Council published a report critical of the current state of forensic science in the US.

It is clear that change and advancements, both systematic and scientific, are needed in a number of forensic science disciplines to ensure the reliability of work, establish enforceable standards, and promote best practices with consistent application.

As the report says, there are many talented, dedicated people doing excellent work in forensic science.  They, and the others affected by this work, deserve to have adequate resources and research to draw upon.


Virus-Infested Hospitals

October 20, 2012

Most readers, I suspect, will have run across news stories or other reports of nasty infections sometimes acquired by hospital patients.  According to a report at Technology Review, there is another worrying category of infection proliferating in hospital environments: computer virus infections of medical equipment.

Computerized hospital equipment is increasingly vulnerable to malware infections, according to participants in a recent government panel. These infections can clog patient-monitoring equipment and other software systems, at times rendering the devices temporarily inoperable.

The advent of the microprocessor and Moore’s Law has meant the introduction of digital technology, often replacing electro-mechanical control systems, in everything from toasters to “fly-by-wire” aircraft.  It should come as no surprise that many medical devices are now controlled by software as well.  This of course means that all the problems of software, including program bugs, security vulnerabilities, and malware, are part of the package.  Also, as with industrial control [SCADA] systems, the undoubted convenience of linking these devices to a network provides a convenient vector for malware infections.  (The direct connection may be to an internal network, but there is often a path to the Internet lurking somewhere in the background.)  In addition, hospital personnel, like workers in other fields, bring in personal laptops, USB memory sticks, and other devices, sometimes with some undesirable extras.

Another difficulty with medical equipment is also reminiscent of the SCADA case.  For obvious reasons, the vendors and users of these devices place a high value on availability — the machine should be ready for use whenever it is needed.  This means that scheduling downtime for, say, installing software patches is not popular.  In addition, some manufacturers do not allow any modifications to their equipment or its software, even to install security fixes.  This stems in part from the requirement that the devices have to be approved by the FDA; rightly or wrongly, some vendors believe that installing such fixes might require the device to be re-certified.

In a typical example, at Beth Israel Deaconess Medical Center in Boston, 664 pieces of medical equipment are running on older Windows operating systems that manufacturers will not modify or allow the hospital to change—even to add antivirus software—because of disagreements over whether modifications could run afoul of U.S. Food and Drug Administration regulatory reviews, Fu says.  [Prof. Kevin Fu, associate professor of computer science at the University of Massachusetts, Amherst]

These security issues were the focus of a meeting last week of the Information Security & Privacy Advisory Board at the National Institute of Standards and Technology [NIST].   Prof. Fu was one of the attendees, as was Mark Olson, Chief Information Security Officer at Beth Israel Deaconess Medical Center in Boston, MA.

At the meeting, Olson also said similar problems threatened a wide variety of devices, ranging from compounders, which prepare intravenous drugs and intravenous nutrition, to picture-archiving systems associated with diagnostic equipment, including massive $500,000 magnetic resonance imaging devices.

Hospitals have not, historically, had to focus very much on computer security.  With today’s equipment, though, they have become security administrators whether they like it or not.  As with SCADA systems and many others, there is some catching up to do.


BIOS Security for Servers

August 24, 2012

Back in 2011, the US National Institute of Standards and Technology (NIST) published a set of guidelines for achieving better security in the PC BIOS, the initial firmware executed when the PC begins its boot sequence.  An article at Ars Technica reports that NIST has now issued, in draft form, a similar set of guidelines specifically for servers, BIOS Protection Guidelines for Servers [SP 800-147B] [PDF].

The new guidelines mostly parallel those laid out in the 2011 report; the key components are:

  • Authenticated Update: Modifications to the BIOS code and data areas must be done through a controlled mechanism, which verifies authentic updates by cryptographic signatures.
  • Optional Local Update: An optional update mechanism may be provided that allows any update (signed or not) to be installed, provided that the administrator is physically present at the server (this might employ a keyed switch, for example).
  • Firmware Integrity: The system must protect its firmware from modification other than by an approved update process.
  • No Bypassing Security: It should not be possible to bypass any of the protection mechanisms.

The document goes on to discuss examples of how these rules might be implemented.
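To give a feel for the first rule, here is a deliberately simplified sketch of an authenticated-update check.  Real BIOS update schemes use asymmetric signatures (with a public key anchored in protected flash); the HMAC with a shared key below stands in only to show the shape of the check, and all names and values are illustrative, not taken from the NIST document:

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture; a real scheme would
# embed a *public* key and verify an asymmetric signature instead.
UPDATE_KEY = b"provisioned-at-manufacture"

def sign_image(image: bytes) -> bytes:
    """Produce the authentication tag for a firmware image."""
    return hmac.new(UPDATE_KEY, image, hashlib.sha256).digest()

def apply_update(image: bytes, signature: bytes) -> bool:
    """Refuse to write flash unless the signature verifies."""
    # Constant-time comparison, to avoid leaking tag bytes via timing.
    if not hmac.compare_digest(sign_image(image), signature):
        return False
    # ... write image to the protected BIOS flash region here ...
    return True

good_image = b"\x55\xaa BIOS v2.01"
sig = sign_image(good_image)
print(apply_update(good_image, sig))        # True: authentic update
print(apply_update(b"tampered", sig))       # False: rejected
```

The essential property is that the decision to write flash is made by the firmware itself, behind the protections of the other three rules, so an attacker cannot simply skip the check.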

NIST’s Computer Security Division is requesting comments on the draft guidelines:

 NIST requests comments on draft NIST SP 800-147B by September 14th, 2012. Please submit all comments to 800-147comments@nist.gov.

As I’ve noted before, attacks that modify the BIOS, although requiring some skill to pull off, are potentially very dangerous, since they can give the attacker complete control of the machine.  Even a complete re-installation of the machine’s operating system will typically not remove them.  So more attention to security in this area is definitely a good thing.



NIST Requests Comments on Digital Signature Standard

April 23, 2012

As the good folks over at the SANS Internet Storm Center have noted in a diary post, the National Institute of Standards and Technology [NIST] is requesting comments on changes it is proposing to make to the Digital Signature Standard, FIPS 186-3 [PDF], and has published a notice in the Federal Register.   The key changes are clarified implementation instructions for the approved algorithms, and approval of additional pseudo-random number generators for key generation.  You can get the complete summary of proposed changes here [PDF].
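The attention to pseudo-random number generators is not bureaucratic fussiness: the signature algorithms in the standard are only as strong as the randomness behind their keys and per-message secret values.  As a small illustration (the group order below is a made-up value, chosen only to show the API), Python's general-purpose `random` module is predictable from its output and must never be used for key material, while the `secrets` module draws from the OS cryptographic generator:

```python
import random
import secrets

# Hypothetical group order, for illustration only.
q = (1 << 255) - 19

# UNSAFE: Mersenne Twister output can be reconstructed by an observer,
# which would let an attacker recover private keys generated this way.
predictable = random.randrange(1, q)

# Suitable: OS-backed CSPRNG, uniform over [1, q-1].
k = secrets.randbelow(q - 1) + 1

print(1 <= k < q)   # True
```

Predictable or reused per-message values are a classic way DSA-style signatures fail in practice, which is presumably why the approved generators are spelled out in the standard rather than left to implementers.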

Comments can be submitted up until May 25, 2012.   The Federal Register notice page has a comment submission link; comments can also be submitted by E-mail to the address in the notice.

