We simply don't know enough about the accuracy of a number of forensic techniques.
Last year, the US Department of Justice released a report that involved some painful self-examination. The DOJ looked at its own performance when it came to the analysis of hair samples—these were once used to identify potential suspects, but the FBI discontinued the practice in 1996. In looking over past cases, however, the feds discovered that agents had systematically overstated the method's accuracy in court, including in at least 35 death penalty cases.

Now, in response to this and other reports on problems with forensic analysis, the President's Council of Advisors on Science and Technology (PCAST) has issued an analysis covering seven forensic techniques, including fingerprinting. The report finds that none of these techniques currently operates on a firm scientific footing, so PCAST makes strong recommendations for how to get forensic science to take its name seriously.

The history that got us here is more than a bit ironic. DNA testing, which we now consider nearly foolproof, started appearing in court in the 1980s, before the people doing it had a strong grip on how to use it effectively. The ensuing chaos eventually led to it being ruled inadmissible in a case in New York. This prompted reforms and analysis that eventually put the field on firm scientific footing. Its use to reanalyze older cases, however, revealed problems in many convictions based on other forensic techniques.

These revelations, however, did not prompt equivalent scientific reckonings among the practitioners of those other techniques, which continued to be used unchanged. The situation ultimately prompted a 2009 report on forensics from the National Academies of Sciences that was critical of a number of evidentiary approaches. Problems continued to be uncovered, ultimately resulting in President Obama asking PCAST to determine what could be done to improve matters.

The results of that request are what have now been released. The report looks at seven different forensic approaches, finding potential or serious problems with all of them. In no case does it find the techniques live up to the sort of language that has been used to describe them in court: “100 percent certain” or having an error rate that's “zero,” “essentially zero,” or “negligible.”

Hair analysis: Here, the new report simply evaluates the science behind the DOJ's report. Even though this document candidly admitted problems, PCAST felt it didn't go far enough in recognizing the technique's flaws. In the future, it suggests, these sorts of evaluations are best left to a scientific agency, rather than a legal one.

DNA analysis: The basic technique was found to be scientifically sound, but the role of operator error is often downplayed during testimony. As the PCAST authors put it, "Although the probability that two samples from different sources have the same DNA profile is tiny, the chance of human error is much higher." Investigators are now attempting to identify suspects using samples in which the DNA of several individuals has become mixed. Early work has established that this can be effective for a mixture from up to three people, but its effectiveness in practice hasn't been tested.

Fingerprints: This is a subjective process, and studies have shown that investigators can be subject to confirmation bias: once they think they have a match, they can be swayed by other evidence from the case. There are currently no good estimates of the method's error rate, and proficiency testing tends to be lacking. PCAST was optimistic about the prospects for computerizing the analysis, which would convert it from a subjective to an objective method.

Firearm markings: Different guns are thought to leave distinctive marks on cartridges fired in them. PCAST finds that only a single study has been rigorous enough to define this technique's error rate, which may be as high as 1 in 46. That's a far cry from practitioners' claims that the technique "has near-perfect accuracy." More studies are needed to define an accuracy rate that can be used in testimony, and again, computerized image analysis may convert this from a subjective to an objective technique.

Footwear analysis: The PCAST authors didn't question whether a footprint could lead investigators to the type and size of shoe that left the impression. Instead, they looked for studies that showed distinctive marks from well-worn shoes could be transferred to prints. There are none. That, obviously, needs to be corrected.

Bitemark analysis: This approach fares the worst of all those examined. It's a subjective analysis, and one where the examiners can't even agree: "Available scientific evidence strongly suggests that examiners not only cannot identify the source of bitemark with reasonable accuracy, they cannot even consistently agree on whether an injury is a human bitemark." PCAST's conclusions are worth quoting in full:

We note that some practitioners have expressed concern that the exclusion of bitemarks in court could hamper efforts to convict defendants in some cases. If so, the correct solution, from a scientific perspective, would not be to admit expert testimony based on invalid and unreliable methods but rather to attempt to develop scientifically valid methods. But, PCAST considers the prospects of developing bitemark analysis into a scientifically valid method to be low. We advise against devoting significant resources to such efforts.

Overall, the PCAST report finds that most of the forensic techniques it looked at need to be put on a firmer scientific foundation. For subjective techniques, this would involve testing trained practitioners to determine their error rates; for objective ones, we'd need studies showing that the underlying principles behind the technique actually hold. This work, PCAST suggests, is best carried out by the National Institute of Standards and Technology. Separately, it calls on the FBI Laboratory to improve the design of black box (blinded) studies of proficiency, as well as ongoing proficiency testing of practitioners. The FBI Lab should also pioneer the development of objective identification methods.

In the meantime, PCAST recommends steps that can be taken while waiting for that to happen. The group calls on the attorney general to instruct the Justice Department's expert witnesses to make sure their testimony reflects the current state of the science. Federal judges, meanwhile, should be made aware of how well these techniques actually perform and should be ready to block testimony that doesn't accurately reflect it.

Unfortunately, these recommendations focus on federal agencies. While it will take some time to see any of them implemented at that level, it will take even longer for the recommendations to filter down to state and local courts.