By David McAleer, ACLR Featured Blogger
Over 250 innocent people have been exonerated through DNA evidence since it first appeared in criminal courts in the 1980s. Using this popular hook, David Harris engages more broadly with forensics in his book Failed Evidence, exploring why law enforcement and prosecutors have shown such marked reluctance to incorporate a modern understanding of the scientific method.
DNA evidence is not 100 percent certain. However, it was developed through extensive laboratory testing, with peer-reviewed findings and repeatable testing techniques. DNA technicians match certain markers from two different DNA samples, and, given acceptable statistical and population norms, an error rate can be calculated in specific cases of testing. Although the error rate would be vanishingly small if enough markers were positively compared, in practice standard techniques compare about thirteen markers. This puts the accuracy somewhere in the neighborhood of 99.9 percent, an error rate of roughly 0.1 percent, meaning that in a population of 1 million, a positive outcome could be produced by any of about 1,000 people. Poor laboratory technique, faulty equipment, or human error can increase the error rate. Nonetheless, DNA evidence derives its usefulness and validity from peer-reviewed repeatable testing, calculable error rates, the development of rigorous standards to reduce or eliminate human subjectivity and cognitive bias, and the fact that DNA matching incorporates scientifically accepted statistical analysis. By contrast, ballistics, shoeprint and tire marks, hair and fiber analysis, and odontology (bite marks) do not live up to any of these standards.
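The arithmetic behind that population figure can be sketched in a few lines. The numbers below are illustrative only, chosen to match the review's example rather than drawn from Harris's book or from real forensic data: a random-match probability of 0.1 percent applied to a population of one million, and a back-of-the-envelope per-marker rate implied by thirteen independent markers.

```python
# Illustrative sketch of the error-rate arithmetic in the text.
# These figures are hypothetical, chosen only to mirror the example
# of ~1,000 possible matches in a population of 1 million.

random_match_probability = 0.001   # 0.1 percent, i.e. 99.9 percent accuracy
population = 1_000_000

# Expected number of people who could produce a coincidental match.
expected_matches = random_match_probability * population
print(expected_matches)  # 1000.0

# If thirteen independent markers each match a random person with the
# same probability q, the combined probability is q ** 13. Solving
# q ** 13 = 0.001 gives the per-marker rate these numbers would imply.
per_marker_rate = random_match_probability ** (1 / 13)
print(round(per_marker_rate, 3))  # 0.588
```

The key point the review makes survives the simplification: a test that sounds near-certain ("99.9 percent accurate") still leaves many possible coincidental matches in a large population, which is why the statistical framing of DNA testimony matters.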
Among other areas of “forensics,” fingerprint analysis is the closest to being scientifically founded. However, fingerprinting necessarily incorporates significant subjectivity. Because it originated from fieldwork experience rather than the laboratory, it does not incorporate rigorous agreed-upon standards or repeatable results. Experts cannot accurately explain the error rate. Disturbingly, the fingerprint examiners' professional organization actually forbids testimony that acknowledges the probabilistic and subjective nature of the results. Its guidelines permit an examiner to state only that there is a match (the defendant is the only possible source of the prints), that there is no match, or that the test is inconclusive.
However, Harris is more interested in exploring the reasons that modern science has met resistance from law enforcement. He also explores changes that could be made to forensic processes to increase scientific certainty. His main suggestions for fingerprint analysis are blind verification by a second examiner (who is not informed about the initial examiner’s results), incorporation of more rigorous procedures eliminating subjectivity, and an acknowledgment of the inherent probabilistic nature of the results during testimony. He does not offer similar suggestions for other forensic disciplines, instead suggesting that they must first undergo further scientific investigation to confirm their validity. Harris also tackles the cognitive biases that have been demonstrated in standard eyewitness identification and police interrogation techniques, and discusses what should be done to eliminate them. Citing well-documented and verifiable studies, Harris asserts that traditional techniques in both areas have been shown to create significant bias which could be greatly reduced without significant hardship to law enforcement.
Harris argues that the traditional line-up, as still performed in most jurisdictions, causes significant bias in two ways. First, it is usually performed by an officer who knows the actual identity of the suspect. Social scientists, and indeed scientists in all areas, have long understood a basic human tendency to project subtle signals, either telegraphing the “expected” answer or reinforcing the choice of a witness who chooses “correctly” (thus making them feel and appear more certain during later testimony). This issue could be resolved by ensuring that the person running the lineup is not the investigating officer. Second, the traditional lineup, where the witness chooses from five or six “suspects” simultaneously, encourages a subtle comparison flaw. It implicitly asks the witness to choose which of the available choices is closest to their memory, which has been shown to contribute to false identifications. This can be reduced significantly by showing the witness the same people (or pictures of them) sequentially rather than all at once. This way, the witness will ask, “Is this the person I saw?” rather than, “Which one of these people should I pick?”
Standard police interrogation is based on the nine-step Reid Technique, created in the 1940s and 1950s by two lawyers, John Reid and Fred Inbau, in their book Criminal Interrogation and Confessions (4th ed. 2001). The basic tenet of this technique is the belief that the suspect is guilty, and that denials of that guilt should be overcome by aggressively breaking down the suspect’s resistance. Fabricating evidence (including nonexistent DNA or eyewitness evidence) and making statements implying that the suspect really has no choice but to confess are accepted parts of the technique. Harris discusses several studies which indicate that this aggressive questioning can lead to a significant number of false confessions, even for very serious crimes. In fact, 25 percent of the DNA exonerations on record involved “innocent defendants who made incriminating statements, delivered outright confessions, or pled guilty.” In addition to his fundamental criticism of the Reid Technique, Harris recommends several methods to improve the questioning process. These include videotaping interrogations (now done by a growing, but still limited, number of jurisdictions), limiting interrogation to no longer than two to four hours (because grueling questioning tends to psychologically wear out even the innocent), and not allowing police to fabricate evidence.
Harris provides a thoughtful analysis of the scientific bases underlying forensics, current evidentiary and investigatory problems, and possible solutions. His suggestions are particularly well thought-out because they consider the problems faced by law enforcement when implementing ideal solutions in the real world. Harris considers the economic, cultural, and institutional implications of introducing change within the insular police and prosecution communities.