Webinar: IRT for Forensics: A re-analysis of the FBI “Black Box” Study
Thursday, April 8, 2021 at 1:00 pm - 2:00 pm CDT
Free
This event took place on April 8, 2021. A recording of the event can be found below.
CSAFE invites researchers, collaborators, and members of the broader forensics and statistics communities to participate in our Spring 2021 Webinar Series on Thursday, April 8, 2021, from 1:00-2:00 pm CDT. The presentation will be “IRT for Forensics: A re-analysis of the FBI ‘Black Box’ Study.”
Presenter:
Amanda Luby
Assistant Professor in Statistics, Swarthmore College
Presentation Description:
In this webinar, Amanda Luby will explore how Item Response Theory (IRT), a class of statistical methods used prominently in educational testing, can be used to measure participant proficiency in error rate studies. Using the FBI “Black Box” data, Luby will illustrate the strengths of an IRT-based analysis over traditional “percent correct” scoring.
The FBI “Black Box” Study (Ulery, 2011) was designed to estimate casework error rates of latent print comparisons in the United States, and additional “Black Box” studies have been called for in other pattern evidence disciplines to estimate error rates. While such studies provide error rate estimates aggregated over all examiners, individual examiners’ error rates cannot be compared directly: each participant is typically asked to evaluate a different random subset of comparison tasks (items), and some items are more difficult than others.
IRT estimates proficiency among participants while simultaneously accounting for varying difficulty among items. Using an IRT-based analysis, we find that the largest variability in examiner decisions occurs in print quality assessments and inconclusive decisions. We also find that some participants were likely to over- or under-report difficulty even after accounting for their proficiency, item difficulty, and other participants’ reported difficulty, and that examiners who report items to be more difficult perform similarly to examiners who report items to be easier. These results underscore the importance of better understanding the cognitive factors involved in latent print examination decisions.
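To give a flavor of the idea, the sketch below simulates binary correct/incorrect decisions under the simplest IRT model (the Rasch model, where P(correct) depends on examiner proficiency minus item difficulty) and recovers both parameter sets by joint maximum likelihood. This is an illustrative toy, not the analysis presented in the webinar: the simulated data, sample sizes, and the plain gradient-ascent fit are all assumptions for demonstration, and the real study involves richer response categories (e.g. inconclusives) than a simple right/wrong outcome.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen only for illustration
n_examiners, n_items = 30, 40
theta_true = rng.normal(0, 1, n_examiners)  # latent proficiency per examiner
b_true = rng.normal(0, 1, n_items)          # latent difficulty per item

# Rasch model: P(correct) = logistic(theta_i - b_j)
p_true = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
y = rng.binomial(1, p_true)                 # observed correct (1) / incorrect (0)

# Joint maximum-likelihood fit by gradient ascent on the log-likelihood
theta = np.zeros(n_examiners)
b = np.zeros(n_items)
lr = 0.01
for _ in range(500):
    p_hat = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    resid = y - p_hat                       # gradient of log-lik in each cell
    theta += lr * resid.sum(axis=1)
    b -= lr * resid.sum(axis=0)
    theta -= theta.mean()                   # pin the location: mean proficiency = 0

# Naive scoring ignores which items an examiner happened to see
pct_correct = y.mean(axis=1)
print("corr(theta_hat, theta_true):", np.corrcoef(theta, theta_true)[0, 1])
print("corr(b_hat, b_true):        ", np.corrcoef(b, b_true)[0, 1])
```

In this toy setting every examiner sees every item, so percent-correct and the IRT estimate largely agree; the advantage of IRT described above appears when examiners see different item subsets, because the fitted difficulties adjust each examiner's score for how hard their particular items were.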