Brief of Amici Curiae, State v. McPhaul, No. 421PA17

Published: 2018
Primary Author: Brandon Garrett

As the field of forensic science grows, new techniques are developed, and our justice system becomes more dependent on the proper application of these techniques, courts play an increasingly crucial role in ensuring that only valid, reliable expert testimony is admitted as evidence. As a result, courts have an obligation to exclude unscientific forensic testimony lest it undermine the integrity of the proceedings and, more broadly, the justice system as a whole. Even for forensic techniques that are widely recognized as valid, it is incumbent on courts to ensure that those techniques were performed reliably before admitting the resulting conclusions as evidence against a defendant in a criminal case. As such, we write to emphasize the correctness of the Court of Appeals' ruling concerning the inadmissibility of the latent fingerprint evidence as applied in this case, which correctly enforced this crucial principle. State v. McPhaul, 2017 N.C. App. LEXIS 924, 808 S.E.2d 294 (2017).

Importantly, we do not address the reliability or admissibility of latent fingerprint evidence in general. Indeed, a leading scientific body has recently described latent fingerprint analysis as a foundationally valid technique, but only when applied in a reliable fashion. For such a technique, it becomes even more critical to ensure that the underlying methods are properly applied. The potential for unreliability and error, which can have devastating effects, lies in the improper application of the technique by particular analysts and in particular cases. In this case, defendant Juan McPhaul was convicted of, inter alia, attempted murder, assault, and conspiracy to commit robbery based, in part, on the testimony of a forensic analyst who described evidence from latent fingerprints.
However, the analyst could not provide even a modicum of an explanation of how the fingerprints were compared, for how long, based on what features or criteria, or with what documentation, and she therefore could not provide any objective reason to support her conclusion. The trial judge pressed the analyst for more details concerning the analysis, but none were forthcoming. Despite repeated questioning from counsel and from the trial judge, the analyst could not provide a verbal or documentary explanation of: which features of the prints were examined; which features matched; what standard was followed or threshold was used to determine that matches were significant; what steps were taken to safeguard against known biases in subjective analysis; what training or research supported the analysis; or what authority was cited for her conclusions, including the scientifically indefensible assertion that the latent fingerprints in fact came from the defendant, terminology that has been definitively rejected in the field of latent fingerprint analysis for years. As a result, the Court of Appeals correctly concluded under Rule 702 of the North Carolina Rules of Evidence that reliable methods and principles were not applied to the evidence in this particular case.

Related Resources

How do Labs Ensure Quality? A Nationwide Review of SOPs for Latent Print Examination

This presentation is from the 108th International Association for Identification (IAI) Annual Educational Conference, Reno, Nevada, August 11-17, 2024. Posted with permission of CSAFE.
Statistics and its Applications in Forensic Science and the Criminal Justice System

This presentation is from the 2024 Joint Statistical Meetings (JSM), Portland, Oregon, August 3-8, 2024.
Silencing the Defense Expert

In the wake of the 2009 NRC and 2016 PCAST Reports, the Firearms and Toolmark (FATM) discipline has come under increasing scrutiny. Validation studies like AMES I, Keisler, AMES II,…
Demonstrative Evidence and the Use of Algorithms in Jury Trials

We investigate how the use of bullet comparison algorithms and demonstrative evidence may affect juror perceptions of reliability, credibility, and understanding of expert witnesses and presented evidence. The use of…