Promoters of forensic DNA testing have done a good job selling the public, and even many criminal defense lawyers, on the idea that DNA tests provide a unique and infallible identification. DNA evidence has sent tens of thousands of people to prison and, in recent years, has played a vital role in exonerating men who were falsely convicted. Even former critics of DNA testing, like Barry Scheck, are widely quoted attesting to the reliability of the DNA evidence in their cases. It is easy to assume that any past problems with DNA evidence have been worked out and that the tests are now unassailable.
The problem with this assumption is that it ignores case-to-case variation in the nature and quality of DNA evidence. Although DNA technology has improved dramatically since it was first used just 15 years ago, and the tests have the potential to produce powerful and convincing results, that potential is not realized in every case. Even when the reliability and admissibility of the underlying test are well established, there is no guarantee that the test will produce reliable results each time it is used. Case-specific issues and problems often greatly affect the quality and relevance of DNA test results. In such situations, DNA evidence is far less probative than it might initially appear.
When DNA evidence was first introduced, a number of experts testified that false positives are impossible in DNA testing45. This claim is now broadly recognized as wrong in principle46 and has repeatedly been proven wrong in practice47. Yet it has frequently been repeated, without skepticism, in appellate court opinions48.
Why did experts offer this questionable testimony? One commentator has suggested that avid proponents of DNA evidence sought to allay judicial concerns about the potential for error by engaging in “a sinister semantic game”49. They could deny that a DNA test produces errors only by excluding from consideration human error in administering or interpreting the test. Sinister or not, that exclusion is misleading, because humans are necessarily involved in the administration and interpretation of DNA tests. For those who must evaluate DNA evidence, it makes little difference what causes a false match; what matters is how often false matches might be expected.
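The practical import of this point can be shown with a simple calculation (a hedged illustration using hypothetical numbers, not figures drawn from any case discussed here). If a reported match can arise either from a coincidental match (at the random match probability, RMP) or from a false positive (at probability FPP), the strength of the evidence can be expressed as a likelihood ratio:

```latex
% Likelihood ratio for a reported DNA match, allowing for error.
% RMP = random match probability; FPP = false positive probability.
\[
LR \;=\; \frac{P(\text{reported match} \mid \text{suspect is source})}
             {P(\text{reported match} \mid \text{suspect is not source})}
\;\approx\; \frac{1}{RMP + FPP}
\]
% Hypothetical numbers: with RMP = 10^{-9} but FPP = 10^{-3},
% LR \approx 1/(10^{-9} + 10^{-3}) \approx 1{,}000.
\]
```

On these assumed numbers the match is about a thousand times, not a billion times, more probable if the suspect is the source: even a modest false positive probability, rather than the vanishingly small random match probability, dominates the value of the evidence.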
False positives have occurred in proficiency tests50 and in actual cases51. For example, the Philadelphia City Crime Laboratory recently admitted that it had accidentally switched the reference samples of the defendant and the victim in a rape case. The switch led the laboratory to issue a report mistakenly stating that the defendant was a potential contributor to what the analysts took to be “seminal stains” on the victim’s clothing52. The report also stated that the defendant’s profile was “included” in a mixed sample taken from vaginal swabs. After the sample switch came to light, the laboratory reassessed the evidence and concluded that the “seminal stains” were actually bloodstains matching the victim’s DNA profile and that the defendant was excluded as a potential contributor to the vaginal sample53.
In 1995, Cellmark Diagnostics made a similar error when it incorrectly reported that a rape defendant’s DNA profile had been found in what was characterized as a semen stain. In fact, Cellmark had found the victim’s own profile in the stain (which obviously was not semen) but had misinterpreted its own results by mixing up the defendant’s and victim’s profiles while recording the test results. The error went undetected when a second analyst at Cellmark reviewed the first analyst’s work; it came to light only after a Cellmark witness had presented erroneous testimony about the false match at a pretrial hearing in the case. Cellmark then issued a revised report stating that the evidentiary sample matched the victim’s own DNA profile and that the defendant was excluded as a potential donor54.
False positives can also arise from misinterpretation of test results. One such error led to the false conviction of Timothy Durham55. In 1993 a Tulsa, Oklahoma jury convicted Durham of the rape of an 11-year-old girl; he was sentenced to 3,000 years in prison. The prosecution presented three pieces of evidence against him: the young victim’s eyewitness identification, testimony that Durham’s hair was microscopically similar to hair found at the crime scene, and a DNA test (DQ-alpha) that reportedly showed that Durham’s genotype matched that of the semen donor. Durham presented eleven witnesses who placed him in another state at the time of the crime, but the jury rejected his alibi defense. Fortunately for Durham, post-conviction DNA testing showed that he did not share the DQ-alpha genotype found in the semen; he was also excluded at several other genetic loci in multiple tests. The initial DNA test result that helped convict him was thus proven to have been a false positive. The error arose from misinterpretation: the laboratory had failed to completely separate male from female DNA during differential extraction of the semen stain, and the victim’s alleles, combined with those of the true rapist, produced an apparent genotype that matched Durham’s. The laboratory mistook this mixed profile for a single-source result and thereby falsely incriminated an innocent man. Durham was released from prison in 1997.
In 2003 another DNA false positive came to light. Josiah Sutton, a 16-year-old from Houston, was falsely convicted of rape in 1996 and sentenced to 25 years in prison on the basis of a misinterpreted DNA test. The error came to light when one of the authors of this chapter reviewed casework from the Houston Police Department DNA/Serology laboratory at the request of a Houston television station. Retesting using STRs proved conclusively that Sutton was innocent56.
Although experience has shown that false positives can occur, the rate at which they occur is difficult to estimate from existing data. Most laboratories participate in periodic proficiency tests, which can shed some light on the potential for error. European forensic laboratories have carried out collaborative exercises involving analysis of stains from known sources, but this work is designed more to test the uniformity of DNA test results among laboratories using the same protocol than to determine the rate of errors. In the United States, TWGDAM guidelines call for each analyst to take two proficiency tests each year57, and proficiency testing is a requirement for laboratory accreditation under the program administered by ASCLD/LAB58. These tests, however, are generally not well designed for estimating the rate of false positives: they typically are not blind (i.e., the analysts know they are being tested), they involve limited numbers of samples, and the samples may be easier to analyze than those encountered in routine casework.
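Why limited proficiency testing cannot establish a low error rate can be quantified with a standard statistical sketch (our illustration, not drawn from the sources cited above): even a laboratory that makes zero errors in n proficiency trials has demonstrated, at 95% confidence, only that its error rate is below roughly 3/n, the so-called "rule of three."

```python
# Sketch: upper 95% confidence bound on a false positive rate when
# zero errors are observed in n proficiency trials.
# Exact bound: the largest p with (1 - p)^n >= 0.05,
# i.e. p = 1 - 0.05**(1/n). The "rule of three" approximates this as 3/n.

def upper_bound_zero_errors(n, alpha=0.05):
    """Exact one-sided upper confidence bound for p given 0 errors in n trials."""
    return 1 - alpha ** (1 / n)

def rule_of_three(n):
    """Common approximation to the same bound."""
    return 3 / n

# e.g. n = 100: exact bound is about 0.0295; rule of three gives 0.03.
for n in (50, 100, 500):
    print(n, round(upper_bound_zero_errors(n), 4), round(rule_of_three(n), 4))
```

On this reasoning, a laboratory that passes 100 non-blind trials without error has shown only that its false positive rate is probably below about 3 percent, which is orders of magnitude larger than the random match probabilities routinely reported in court.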
45 See, Thompson WC. Forensic DNA Evidence. In: Black B, Lee P, editors. Expert Evidence: A Practitioner's Guide to Law, Science and the FJC Manual. St. Paul, Minn.: West Group, 1997;195-266; Koehler JJ. Error and exaggeration in the presentation of DNA evidence. Jurimetrics 1993;34:21-39.
46 See, National Research Council. DNA Technology in Forensic Science. Washington, D.C.: National Academy Press, 1992; Kaye D. DNA evidence: probability, population genetics, and the courts. Harv J Law Technol 1993;7:101-72; Jonakait RN. Stories, forensic science and improved verdicts. Cardozo L Rev 1991;13:343-52; Koehler JJ. DNA matches and statistics: important questions, surprising answers. Judicature 1993;76:222-9; Thompson WC. Comment. In Roeder K., DNA fingerprinting: a review of the controversy. Stat Sci 1994;9:263-6.
47 See, Thompson WC. Subjective interpretation, laboratory error and the value of forensic DNA evidence: three case studies. Genetica 1995;96:153-68; Koehler JJ. The random match probability in DNA evidence: irrelevant and prejudicial? Jurimetrics 1995;35:201-19; Thompson WC. Accepting Lower Standards: The National Research Council's Second Report on Forensic DNA Evidence. Jurimetrics 1997;37(4):405-24.
48 See Kaye, supra; Thompson (1997) supra.
49 Koehler, 1993, supra.
50 See, Thompson WC, Ford S. The meaning of a match: Sources of ambiguity in the interpretation of DNA prints. In: Farley J, Harrington J, editors. Forensic DNA Technology. New York: CRC Press, Inc, 1991; Thompson WC. Subjective interpretation, laboratory error and the value of forensic DNA evidence: three case studies. Genetica 1995;96:153-68; Koehler JJ. DNA matches and statistics: important questions, surprising answers. Judicature 1993;76:222-9; Thompson WC. Comment. In Roeder K., DNA fingerprinting: a review of the controversy. Stat Sci 1994;9:263-6; Koehler JJ. The random match probability in DNA evidence: irrelevant and prejudicial? Jurimetrics 1995;35:201-19; Thompson WC. Accepting Lower Standards: The National Research Council's Second Report on Forensic DNA Evidence. Jurimetrics 1997;37(4):405-24; Mueller L. The use of DNA typing in forensic science. Acct in Res 1993;3:1-13; Roeder K. DNA fingerprinting: a review of the controversy. Stat Sci 1994;9:222-47.
51 Thompson, 1997, supra; Scheck B, Neufeld P, Dwyer F. Actual Innocence. New York: Doubleday, 2000.
52 Brenner L, Pfleeger B. Investigation of the Sexual Assault of Danah H. Philadelphia (PA): Philadelphia Police Department DNA Identification Laboratory; 1999 Sept. 24. Lab No.: 97-70826.
53 Brenner L, Pfleeger B. Amended Report: Investigation of the Sexual Assault of Danah H. Philadelphia (PA): Philadelphia Police Department DNA Identification Laboratory; 2000 Feb. 7. Lab No.: 97-70826.
54 Cotton RW, Word C. Amended Report of Laboratory Examination. Germantown (MD): Cellmark Diagnostics; 1995 Nov 20. Case No.: F951078. A transcript of testimony in this case, in which a Cellmark expert admits to the error, can be found at www.scientific.org.
55 Thompson, 1997, supra; Scheck, Neufeld & Dwyer, supra.
56 Several articles about this case can be found at www.scientific.org.
57 Technical Working Group on DNA Analysis Methods (TWGDAM). Established guidelines for a quality assurance program for DNA testing laboratories; including RFLP and PCR technologies. Crime Lab Dig 1995;18:44-75.
58 National Research Council. The Evaluation of Forensic DNA Evidence. Washington, D.C.: National Academy Press, 1996.