CSAFE’s October Webinar Will Feature New Black Box Study on Bloodstain Pattern Analysis

Figure 1 from the article shows examples of bloodstain patterns used in the study.

A new study assessing the accuracy and reproducibility of practicing bloodstain pattern analysts’ conclusions will be the focus of an upcoming Center for Statistics and Applications in Forensic Evidence (CSAFE) webinar.

The webinar, Bloodstain Pattern Analysis Black Box Study, will be held Thursday, Oct. 14, from 11 a.m. to noon CDT. It is free and open to the public.

During the webinar, Austin Hicklin, director at the Forensic Science Group; Paul Kish, a forensic consultant; and Kevin Winer, director at the Kansas City Police Crime Laboratory, will discuss their recently published article, Accuracy and Reproducibility of Conclusions by Forensic Bloodstain Pattern Analysts. The article was published in the August issue of Forensic Science International.

From the Abstract:

Although the analysis of bloodstain pattern evidence left at crime scenes relies on the expert opinions of bloodstain pattern analysts, the accuracy and reproducibility of these conclusions have never been rigorously evaluated at a large scale. We investigated conclusions made by 75 practicing bloodstain pattern analysts on 192 bloodstain patterns selected to be broadly representative of operational casework, resulting in 33,005 responses to prompts and 1760 short text responses. Our results show that conclusions were often erroneous and often contradicted other analysts. On samples with known causes, 11.2% of responses were erroneous. The results show limited reproducibility of conclusions: 7.8% of responses contradicted other analysts. The disagreements with respect to the meaning and usage of BPA terminology and classifications suggest a need for improved standards. Both semantic differences and contradictory interpretations contributed to errors and disagreements, which could have serious implications if they occurred in casework.

The study was supported by a grant from the U.S. National Institute of Justice. Kish and Winer are members of CSAFE’s Research and Technology Transfer Advisory Board.

To register for the October webinar, visit https://forensicstats.org/events.

The CSAFE Fall 2021 Webinar Series is sponsored by the National Institute of Standards and Technology (NIST) through cooperative agreement 70NANB20H019.

CSAFE researchers are also undertaking projects to develop objective analytic approaches to enhance the practice of bloodstain pattern analysis. Learn more about CSAFE’s BPA projects at forensicstats.org/blood-pattern-analysis.

Latent print quality in blind proficiency testing: Using quality metrics to examine laboratory performance

Calls for blind proficiency testing in forensic science disciplines intensified following the 2009 National Academy of Sciences report and were echoed in the 2016 report by the President’s Council of Advisors on Science and Technology. Both practitioners and scholars have noted that “open” proficiency tests, in which analysts know they are being tested, allow for test-taking behavior that is not representative of behavior in routine casework. This study reports the outcomes of one laboratory’s blind quality control (BQC) program. Specifically, we describe results from approximately 2.5 years of blind cases in the latent print section (N = 376 latent prints submitted as part of 144 cases). We also used a widely available quality metrics software (LQMetrics) to explore relationships between objective print quality and case outcomes. Results revealed that nearly all BQC prints (92.0%) were of sufficient quality to enter into AFIS. When prints had a source present in AFIS, 41.7% of print searches resulted in a candidate list containing the true source. Examiners committed no false positive errors but other types of errors were more common. Average print quality was in the midpoint of the range (53.4 on a 0-to-100 scale), though prints were evenly distributed across the Good, Bad, and Ugly categories. Quality metrics were significantly associated with sufficiency determinations, examiner conclusions, and examiner accuracy. Implications for blind testing and the use of quality metrics in routine casework as well as proficiency testing are discussed.

Judges and forensic science education: A national survey

In criminal cases, forensic science reports and expert testimony play an increasingly important role in adjudication. More states now follow a federal reliability standard, which calls upon judges to assess the reliability and validity of scientific evidence. Little is known about how judges view their own background in forensic scientific evidence, and what types of specialized training they receive on it. In this study, we surveyed 164 judges from 39 different U.S. states, who attended past trainings at the National Judicial College. We asked these judges about their background in forensic science, their views concerning the reliability of common forensic disciplines, and their needs to better evaluate forensic science evidence. We discovered that judges held views regarding the scientific support for different forensic science disciplines that were fairly consistent with available literature; their error rate estimates were more supported by research than many estimates by laypersons, who often assume forensic methods are nearly infallible. We did not find any association between how judges rate forensic reliability and prior training. We did, however, find that training corresponded with judges’ views that they should, and do in fact, take on a more active gatekeeping role regarding forensics. Regarding the tools judges need to vet forensic experts and properly evaluate forensic science evidence, they reported having very different backgrounds in relevant scientific concepts and having forensic science education needs. Judges reported needs in accessing better material concerning reliability of forensic science methods. These results support new efforts to expand scientific evidence education in the judiciary.

OSAC Public Update Meeting Set for Wednesday, Sept. 29


Plan to attend the Organization of Scientific Area Committees (OSAC) for Forensic Science Public Update Meeting on Sept. 29, 2021, from 1–4:30 p.m. EDT.

This virtual event will feature presentations from the chairs of OSAC’s Forensic Science Standards Board and seven Scientific Area Committees. Each presenter will describe the standards their unit is working on and discuss research gaps, challenges, and priorities for the coming year. Attendees will have the opportunity to ask questions and provide feedback. There is no fee to attend, but registration is required.

OSAC works to strengthen forensic science by facilitating the development of technically sound standards and promoting the use of those standards by the forensic science community. OSAC’s 800-plus members and affiliates draft and evaluate forensic science standards through a transparent, consensus-based process that allows for participation and comment by all stakeholders. For more information about OSAC and its programs, visit https://www.nist.gov/osac.

The meeting agenda and registration information are available on the OSAC website.

Science Bench Book for Judges: Section 4 Introduction to Statistical Thinking for Judges

Alicia Carriquiry, the director of the Center for Statistics and Applications in Forensic Evidence (CSAFE), and Eryn Blagg, a doctoral student in statistics at Iowa State University, wrote a section on statistics for the newly released second edition of the Science Bench Book for Judges.

The Science Bench Book for Judges was created by the Justice Speakers Institute and The National Judicial College with funding from the State Justice Institute. It is free and available online in a downloadable, searchable format. It is meant to guide and assist judges during pre-trial, trial and post-trial proceedings in both civil and criminal cases.

The bench book provides judges with an overview of legal procedure involving validity, reliability and admissibility of evidence. It introduces research terminology, concepts and scientific methods and includes case citations, relevant legal authority and evidentiary rulings. It was prepared under the guidance of, and written by, appellate justices, judges, legal scholars and lawyers.

Carriquiry and Blagg’s section, Introduction to Statistical Thinking for Judges, explains some of the common concepts used in statistical analysis. They start by defining the basics, including populations and samples, before moving on to the different types of data that might arise in legal proceedings. They also discuss approaches to collecting data, designing studies, summarizing sample information and drawing conclusions about a population from a sample. Carriquiry and Blagg finish the section by discussing how to assess the quality of data from a sample or a study.

To download the complete bench book, visit http://justicespeakersinstitute.com/science-bench-book/.

To download Carriquiry and Blagg’s section, go to http://justicespeakersinstitute.com/wp-content/uploads/2021/01/4-Statistics.pdf.

OSAC Registry Implementation Survey


The Organization of Scientific Area Committees for Forensic Science (OSAC) is asking forensic science service providers to complete an online survey designed to help OSAC understand how organizations are using standards on the OSAC Registry and what support they may need to improve standards implementation.

According to the OSAC Registry Implementation Survey webpage, “OSAC wants to better understand how the standards on the Registry are currently being used, the challenges around standards implementation, and what support is needed to improve it. The OSAC Registry Implementation Survey will be a tool we use to collect this information on an annual basis.”

After the survey closes on Aug. 31, OSAC will analyze the responses and publish the results in its fall newsletter at the end of October.

Forensic science service providers across the country are encouraged to complete this survey (one response per location). It will take approximately 15-45 minutes to complete and must be done in one sitting.

More information and a link to the survey are available at https://www.nist.gov/osac/osac-registry-implementation-survey.

GAO Releases a Second Report on Forensic Science Algorithms

From GAO Report 21-435
GAO-21-435 — Forensic Technology: Algorithms Strengthen Forensic Analysis, but Several Factors Can Affect Outcomes

In July, the U.S. Government Accountability Office (GAO) released the report, Forensic Technology: Algorithms Strengthen Forensic Analysis, but Several Factors Can Affect Outcomes.

This is the second report in a two-part series of technology assessments responding to a request to examine the use of forensic algorithms in law enforcement. The first report, Forensic Technology: Algorithms Used in Federal Law Enforcement (GAO-20-479SP), described forensic algorithms used by federal law enforcement agencies and how they work.

In this report, GAO conducted an in-depth analysis of three types of algorithms used by federal law enforcement agencies and selected state and local law enforcement agencies: latent print, facial recognition and probabilistic genotyping. The report discusses

  1. the key performance metrics for assessing latent print, facial recognition and probabilistic genotyping algorithms;
  2. the strengths of these algorithms compared to related forensic methods;
  3. the key challenges affecting the use of these algorithms and the associated social and ethical implications; and
  4. options policymakers could consider to address these challenges.

GAO developed three policy options that could help address challenges related to law enforcement use of forensic algorithms. The policy options identify possible actions by policymakers, which may include Congress, other elected officials, federal agencies, state and local governments and industry.

In conducting this assessment, GAO interviewed federal officials, select non-federal law enforcement agencies and crime laboratories, algorithm vendors, academic researchers and nonprofit groups. It also convened an interdisciplinary meeting of 16 experts with assistance from the National Academies of Sciences, Engineering, and Medicine; and reviewed relevant literature. CSAFE co-director Karen Kafadar, professor and chair of statistics at the University of Virginia, participated in the meeting, as well as Will Guthrie, a CSAFE Research and Technology Transfer Advisory Board member. Guthrie is chief of the Statistical Engineering Division at the National Institute of Standards and Technology.


CSAFE researchers are developing open-source software tools that give forensic scientists and researchers peer-reviewed, transparent software to apply to forensic evidence analysis. These automatic matching algorithms provide objective and reproducible scores as a foundation for a fair judicial process. Learn more about CSAFE’s open-source software tools.

NIST Extends Deadline for Comment on Draft Report on DNA Mixture Interpretation Methods

Credit: N. Hanacek/NIST

The National Institute of Standards and Technology (NIST) has extended the deadline for public comments on NIST Internal Report 8351-DRAFT (DNA Mixture Interpretation: A Scientific Foundation Review). The new deadline is Aug. 23, 2021.

This report, currently published in draft form, reviews the methods that forensic laboratories use to interpret evidence containing a mixture of DNA from two or more people. To read more about the report or to submit comments, visit https://www.nist.gov/dna-mixture-interpretation-nist-scientific-foundation-review.

In case you missed it, NIST hosted the webinar, DNA Mixtures: A NIST Scientific Foundation Review. The webinar reviewed the contents and findings of the draft NISTIR 8351 report, discussed feedback received up to that point in the public comment period, and provided an opportunity for interested parties and stakeholders to ask additional questions or seek clarification on the draft report. The recording of the webinar can be viewed at https://www.nist.gov/news-events/events/2021/07/webinar-dna-mixtures-nist-scientific-foundation-review.

DNA Mixtures: A Forensic Science Explainer
NIST has also published a webpage explaining DNA mixtures and why they are sometimes difficult to interpret. https://www.nist.gov/feature-stories/dna-mixtures-forensic-science-explainer

Handwriting Examiners in the Digital Age

Forensic handwriting examiners can only compare writing of the same type. In this case, only the second known sample can be compared to the questioned handwriting. Credit: NIST

From National Institute of Standards and Technology (NIST) News
Published June 3, 2021

As people write less by hand, will handwriting examination become irrelevant?

NIST considers the answer to that question in a recent news article. NIST suggests the answer is no, but only if the field of forensic handwriting examination changes to keep up with the times.

The article focuses on some of the recommendations from NIST’s updated report, Forensic Handwriting Examination and Human Factors: Improving the Practice Through a Systems Approach. The report recommends more research to estimate error rates for the field, which would allow juries and others to consider the potential for error when weighing an examiner’s testimony.

The report also recommends that experts avoid testifying in absolute terms or saying that an individual has written something to the exclusion of all other writers. Instead, experts should report their findings in terms of relative probabilities and degrees of certainty.

Melissa Taylor, the NIST human factors expert who led the group of authors, said that the report provides the forensic handwriting community with a road map for staying relevant. But the threat of irrelevance doesn’t come only from the decline in handwriting. Part of the challenge, she says, arises from the field of forensic science itself.

“There is a big push toward greater reliability and more rigorous research in forensic science,” said Taylor, whose research is aimed at reducing errors and improving job performance in handwriting examination and other forensic disciplines, including fingerprints and DNA. “To stay relevant, the field of handwriting examination will have to change with the times.”

To read the full article, visit https://www.nist.gov/news-events/news/2021/06/handwriting-examiners-digital-age.

For more information about CSAFE’s handwriting analysis research, visit https://forensicstats.org/handwriting-analysis/.

The CSAFE handwriting team has developed an open-source software tool called handwriter. This R package utilizes a variety of functions to identify letters and features from handwritten documents. Learn more at https://github.com/CSAFE-ISU/handwriter.

DNA Mixture Interpretations: A Q&A With NIST’s John Butler

John Butler with DNA mixture data. Credit: NIST

From Taking Measure, the official blog of the National Institute of Standards and Technology (NIST)
Published July 28, 2021

Whether from skin cells, saliva, semen or blood, DNA from a crime scene is often collected and tested in a lab to see if a suspect’s DNA is likely a contributor to that sample or not. But every DNA sample tells a different story, and some samples are easier to interpret than others. The simplest type of DNA profile to interpret is one where the sample includes hundreds of cells from only one person. When two or more people have contributed to a sample, it’s called a DNA mixture. Some mixtures are so complicated that their stories remain a mystery even to the best forensic DNA experts.

John Butler, a Fellow at the National Institute of Standards and Technology (NIST), and a team of authors have recently completed a draft scientific foundation review of the different methods forensic laboratories use to interpret DNA mixtures. The team urged more interlaboratory participation in studies to demonstrate consistency in the methods and technology used in DNA mixture interpretation, as well as greater public sharing of data. In this interview with NIST’s Christina Reed, Butler — who has over 30 years of experience with DNA profiling, is the author of five books on the subject, and has led training workshops on interpreting DNA mixtures — answers some basic questions about the importance of this fast-growing field of forensic science.

Read the full interview at https://www.nist.gov/blogs/taking-measure/dna-mixture-interpretations-qa-nists-john-butler.

Download DNA Mixture Interpretation: A Scientific Foundation Review at https://www.nist.gov/dna-mixture-interpretation-nist-scientific-foundation-review.

NIST held a webinar on DNA mixtures on July 21, 2021. The webinar was recorded and will be made available for on-demand viewing approximately 10 days after the event. For more information, visit https://www.nist.gov/news-events/events/2021/07/webinar-dna-mixtures-nist-scientific-foundation-review.