How do latent print examiners perceive proficiency testing? An analysis of examiner perceptions, performance, and print quality

Journal: Science & Justice
Published: 2020
Primary Author: Sharon Kelley
Secondary Authors: Brett Gardner, Daniel C. Murrie, Karen D. H. Pan, Karen Kafadar
Research Area: Latent Print

Proficiency testing has the potential to serve several important purposes for crime laboratories and forensic science disciplines. Scholars and other stakeholders, however, have criticized standard proficiency testing procedures since their implementation in laboratories across the United States. Specifically, many experts label current proficiency tests as non-representative of actual casework, at least in part because they are not sufficiently challenging (e.g., [1], [2], [3], [4]). In the current study, we surveyed latent print examiners (n = 322) about their perceptions of test items after they completed a Collaborative Testing Services proficiency test. We also evaluated respondents’ test performance and used a quality metric algorithm (LQMetrics) to obtain objective indicators of print quality on the test. Results were generally consistent with experts’ concerns about proficiency testing. The low observed error rate, examiner perceptions of relative ease, and high objective print quality metrics together suggest that latent print proficiency testing is not especially challenging. Further, examiners indicated that the test items that most closely resembled real-world casework were also the most difficult and contained prints of the lowest quality. Study findings suggest that including prints of lower quality may increase both the difficulty and representativeness of proficiency testing in latent print examination.
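The study's core comparison, relating examiners' subjective difficulty ratings to objective print-quality scores, can be pictured with a small sketch. The snippet below is purely illustrative: the item labels, ratings, and quality values are invented, and it does not use the actual LQMetrics software or the authors' data. It simply shows one way to check whether items that examiners rate as harder also tend to carry lower objective quality scores, which would mirror the pattern reported in the paper.

# Hypothetical illustration only: items, ratings, and scores are invented,
# and this is not the authors' analysis code or the LQMetrics API.
from statistics import mean

# Simulated per-item data: examiner-perceived difficulty (1 = easy, 5 = hard)
# averaged across respondents, and an objective print-quality score on a
# 0-100 scale (higher = clearer print), standing in for an LQMetrics-style metric.
items = {
    "item_01": {"perceived_difficulty": 1.4, "quality_score": 88},
    "item_02": {"perceived_difficulty": 2.1, "quality_score": 74},
    "item_03": {"perceived_difficulty": 3.8, "quality_score": 41},
    "item_04": {"perceived_difficulty": 4.3, "quality_score": 32},
}

def pearson_r(xs, ys):
    """Plain Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

difficulty = [v["perceived_difficulty"] for v in items.values()]
quality = [v["quality_score"] for v in items.values()]

# A strongly negative correlation would indicate that the items rated hardest
# by examiners also contained the lowest-quality prints.
print(f"r(difficulty, quality) = {pearson_r(difficulty, quality):.2f}")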

Related Resources

What’s in a Name? Consistency in Latent Print Examiners’ Naming Conventions and Perceptions of Minutiae Frequency

Fingerprint minutia types influence LPEs’ decision-making processes during analysis and evaluation, with features perceived to be rarer generally given more weight. However, no large-scale studies comparing examiner perceptions of minutiae…
An alternative statistical framework for measuring proficiency

Item Response Theory, a class of statistical methods used prominently in educational testing, can be used to measure LPE proficiency in annual tests or research studies, while simultaneously accounting for…
Examiner variability in pattern evidence: proficiency, inconclusive tendency, and reporting styles

The current approach to characterizing uncertainty in pattern evidence disciplines has focused on error rate studies, which provide aggregated error rates over many examiners and pieces of evidence. However, decisions…
Statistical Interpretation and Reporting of Fingerprint Evidence: FRStat Introduction and Overview

The FRStat is a tool designed to help quantify the strength of fingerprint evidence. Following lengthy development and validation with assistance from CSAFE and NIST, in 2017 the FRStat was…