
Creating fingerprint databases and a Bayesian approach to quantify dependencies in evidence

Journal: Online repository of theses and dissertations in the University of Virginia Libraries
Published: 2018
Primary Author: Maria Tackett
Secondary Authors: Dan Spitzner (advisor)
Research Area: Forensic Statistics

In 2009, the National Research Council issued “Strengthening Forensic Science in the United States: A Path Forward,” a report calling for greater scientific rigor in forensic science. Since then, there has been an effort to make the methods used to analyze forensic evidence more objective, in part through the use of statistics to interpret the evidence. With Lindley (1977) as a guide, this research focuses on two aspects of statistics in forensic science. The first is the creation of large databases that can be used for the development and implementation of statistical methods. We propose a theoretical framework for fully-resourced databases, which contain sufficient information to serve these purposes, and demonstrate their use in statistical inference, specifically showing how such databases can be used to systematically obtain prior information in the Bayesian framework. Recommendations are provided for the type of information to include in such databases in the context of fingerprint evidence.
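
To make the idea concrete, here is a minimal sketch of how database frequencies might seed a Bayesian analysis. The feature, the counts, and the conjugate Beta-Binomial model are all illustrative assumptions, not the framework proposed in the dissertation.

```python
# Minimal sketch: turning fingerprint-database frequencies into a Bayesian
# prior. The feature, the counts, and the Beta-Binomial model are
# illustrative assumptions, not the dissertation's actual framework.

def beta_prior_from_database(feature_count: int, total_records: int):
    """Convert database frequencies into Beta(a, b) prior hyperparameters,
    with add-one smoothing so unseen outcomes keep positive probability."""
    return feature_count + 1, total_records - feature_count + 1

def posterior_mean(a: float, b: float, matches: int, trials: int) -> float:
    """Conjugate Beta-Binomial update: posterior mean of the match rate
    after observing `matches` successes in `trials` new comparisons."""
    return (a + matches) / (a + b + trials)

# Example: 34,000 of 100,000 database records show a (hypothetical) feature;
# the resulting prior is updated with 8 matches in 10 casework comparisons.
a, b = beta_prior_from_database(34_000, 100_000)
print(posterior_mean(a, b, matches=8, trials=10))  # about 0.34
```

The point of the sketch is that the prior's hyperparameters come directly and reproducibly from the database counts rather than from an examiner's subjective judgment, which is the sense in which such databases make prior elicitation systematic.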

The second aspect is quantifying and interpreting the weight of evidence when multiple candidates are examined as the source of a mark recovered from a crime scene. We propose accounting for the dependencies in the weight of evidence across multiple candidates by imposing a constraint on the set of plausible models, and we examine the properties that hold under this constraint. This research informs guidelines for the examination of multiple candidates identified by a fingerprint matching system such as the Automated Fingerprint Identification System (AFIS), as illustrated below.
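
For orientation, the weight of evidence in Lindley's sense is the log of the likelihood ratio. The sketch below states that standard two-hypothesis form and one natural coupling among candidates; the abstract does not specify the dissertation's actual constraint, so the normalization shown here is an illustrative assumption.

```latex
% Standard (Lindley-style) weight of evidence for a single candidate:
% E is the evidence, H_p the same-source hypothesis, H_d the
% different-source hypothesis.
\[
  W = \log \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
\]
% With candidates C_1, ..., C_k returned by an AFIS search, the posterior
% source probabilities are coupled: if at most one candidate left the mark,
\[
  \sum_{i=1}^{k} \Pr(S = C_i \mid E) \le 1,
\]
% so evidence that raises one candidate's source probability necessarily
% limits the posterior weight available to the others. (Illustrative
% normalization, not the dissertation's stated constraint.)
```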

Related Resources

Towards a likelihood ratio approach for bloodstain pattern analysis

In this work, we explore the application of the likelihood ratio as a forensic evidence assessment tool to evaluate the causal mechanism of a bloodstain pattern. It is assumed that there…
An Open-Source Implementation of the CMPS Algorithm for Assessing Similarity of Bullets

In this paper, we introduce the R package cmpsR, an open-source implementation of the Congruent Matching Profile Segments (CMPS) method developed at the National Institute of Standards and Technology (NIST)…
Error Rate Methods for Forensic Handwriting Identification

This presentation is from the 106th International Association for Identification (IAI) Annual Educational Conference.
Measuring Proficiency among Latent Print Examiners: A Statistical Approach from Standardized Testing

This presentation is from the 74th Annual Scientific Conference of the American Academy of Forensic Sciences.