by David McAleer, ACLR Featured Blogger
Over 250 innocent people have been exonerated through DNA evidence since it first appeared in criminal courts in the late 1980s. Using this popular hook, David Harris engages more broadly with forensics in his book Failed Evidence, exploring why law enforcement and prosecutors have shown such marked reluctance to incorporate a modern understanding of the scientific method.
DNA evidence is not 100 percent certain. It was, however, developed through extensive laboratory testing, with peer-reviewed findings and repeatable techniques. DNA technicians match certain markers between two samples, and, given accepted statistical and population data, an error rate can be calculated for any specific test. Although that rate would be vanishingly small if enough markers were compared, in practice standard techniques compare about thirteen. This puts the accuracy somewhere in the neighborhood of 99.9 percent, a random match probability of roughly 1 in 1,000, meaning that in a population of one million, a positive match could implicate any of about 1,000 people. Poor laboratory technique, faulty equipment, or human error can push the error rate higher.

Nonetheless, DNA evidence derives its usefulness and validity from peer-reviewed, repeatable testing; calculable error rates; rigorous standards developed to reduce or eliminate human subjectivity and cognitive bias; and scientifically accepted statistical analysis. By contrast, ballistics, shoeprint and tire-mark comparison, hair and fiber analysis, and odontology (bite marks) live up to none of these standards.
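Because the numbers matter here, a quick sketch of the random-match arithmetic may help. This is a toy illustration only: the per-marker probability below is a made-up value, chosen so that thirteen independent markers combine to roughly the 1-in-1,000 ballpark above, and real forensic marker frequencies differ and are not fully independent.

```python
from math import prod

def random_match_probability(per_marker_probs):
    """Chance that an unrelated person matches on every compared marker,
    assuming each marker varies independently in the population
    (a simplifying assumption for illustration)."""
    return prod(per_marker_probs)

# Hypothetical per-marker match probability, picked only so that thirteen
# markers combine to roughly the 1-in-1,000 figure discussed above.
per_marker = [0.588] * 13

rmp = random_match_probability(per_marker)
population = 1_000_000
expected_matches = population * rmp  # people matching purely by coincidence

print(f"Combined random-match probability: {rmp:.6f}")           # ~0.001
print(f"Expected coincidental matches: {expected_matches:.0f}")  # ~1000
```

Even this toy calculation makes the larger point: a reported "match" is meaningful only alongside its statistical context, the very discipline Harris finds missing from the other forensic techniques.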