Could Your Lab Benefit from Standardized Latent Print Examination? OSAC Process Map Provides Insight


How does your latent print examination process compare to other laboratories? OSAC released a newly revised process map of friction ridge examination that could provide valuable insight.

Use this illustration of conventional processes to identify areas for improvement and discover where standards could benefit your program.

Are you interested in learning more about the steps of fingerprint examination? This tool walks readers through the process known as ACE-V:

  • Analysis
  • Comparison
  • Evaluation
  • Verification

The OSAC process map serves as a baseline for understanding the complexity of fingerprint examinations and is a tool for implementing standardized protocols across the nation.

Learn more from NIST news.

NIJ Final Report Explores Discrepancies in Quantifying the Weight of Evidence

Through the National Criminal Justice Reference Service, NIJ has made available a final technical report of the research project, “Foundational Research into the Quantification of the Value of Forensic Evidence for Complex Evidential Forms arising from Impression and Pattern Evidence.”

Funded by the National Institute of Justice and led by researchers Cedric Neumann and Christopher P. Saunders, the four-year project investigated the validity, accuracy and computational complexity of methods designed to quantify the weight of complex evidence forms, such as pattern and trace evidence.

Researchers took a closer look at source identification using two formal frameworks: the “common source” and the “specific source.” The goal was to uncover why scientists may reach differing opinions when quantifying the weight of forensic evidence.

The Project’s Four Phases:

  1. The validity, accuracy and computational complexity of several methods for quantifying uncertainty in forensic conclusions were studied using a toy glass example, where the ground truth weight of evidence was known;
  2. Two frameworks for using similarity measures (as a means to reduce the complexity of the problem) in the quantification of the weight of evidence were developed, and their reliability and accuracy were studied;
  3. The two frameworks were applied to a variety of complex forms of evidence, such as fingerprints and FTIR analyses of fibers, and are currently being implemented for handwriting and powder residues;
  4. The reliability of the proposed frameworks, when applied to these evidence types, was studied.

A Snapshot from the Report:

“The project relies on a mixture of analytical proofs and statistical simulations to develop and study numerical methods to assign the weight of forensic evidence. Instead of just considering the observations made on the trace and control material, we also explicitly consider the observations made on samples from the population of potential sources. This paradigm change allows for more formal development of numerical methods aimed at quantifying the weight of forensic evidence and for a more rigorous study of their convergence.”
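The report itself presents analytical proofs and simulation studies rather than code. As a very rough illustration of the “specific source” framing described above, the Python sketch below simulates a toy glass refractive-index example, similar in spirit to the toy example in the project’s first phase, and computes a simple plug-in likelihood ratio. It is not the project’s code: the normal model, the parameter values and every variable name are assumptions made only for this illustration.

```python
# Illustrative sketch only -- not the NIJ project's code. A plug-in
# "specific source" likelihood ratio for a toy glass refractive-index example,
# assuming normal within-source and between-source variation.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2019)

# Assumed (made-up) parameters for the simulation.
between_mu, between_sd = 1.5180, 4e-4   # variation across glass sources
within_sd = 4e-5                        # measurement variation within one source

control = rng.normal(1.5183, within_sd, size=10)            # fragments from the suspect source
trace = rng.normal(1.5183, within_sd, size=3)               # fragments recovered at the scene
population = rng.normal(between_mu, between_sd, size=200)   # background survey sample

# H_p: the trace came from the specific (suspect) source.
# H_d: the trace came from an unknown source in the background population.
log_numerator = norm.logpdf(trace, loc=control.mean(), scale=within_sd).sum()
log_denominator = norm.logpdf(
    trace,
    loc=population.mean(),
    scale=np.sqrt(population.var(ddof=1) + within_sd**2),
).sum()

log10_lr = (log_numerator - log_denominator) / np.log(10)
print(f"log10 likelihood ratio (toy example): {log10_lr:.2f}")
```

Because the data are simulated, the ground truth (here, a true same-source pairing) is known, which is what allows a study of this kind to check whether a method’s reported weight of evidence behaves as expected.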

Interested in reading the full report and its potential benefits to the criminal justice system? Learn more through the National Criminal Justice Reference Service. Discover how the National Institute of Justice continues to advance the field of forensic science on the NIJ website.

The Go-To Podcast for Everything Forensic Science

Your morning commute, lunch break or trip to the gym is a great time to check out the Just Science podcast. Designed for anyone interested in an inside look at crime laboratories, the show explains how improving accuracy and efficiency helps solve more crimes. This effort by RTI International’s Center for Forensic Sciences covers a wide range of topics spanning every forensic discipline. We recommend the 2019 season “Forensic Advancement,” which dives into challenges facing leaders in the forensic community.

CSAFE tackles several issues addressed in this season. View the following episodes to hear our colleagues’ perspectives:

  1. Just Cognitive Bias Awareness
  2. Just Blind Proficiency Testing

Find more seasons on new technologies and broader challenges for science and public security on the Just Science site.

How Much Information is Too Much? Explore the Role of Contextual Bias in Forensic Science


Is it possible for a forensic examiner to receive too many details about a piece of evidence? Contextual information such as where the evidence was collected and the environment at the scene may be helpful, or it could create unintentional bias.

Dr. William Thompson of the University of California, Irvine collaborated with the National Institute of Justice to examine contextual bias. In a report published in April 2019, Thompson explores strategies for ensuring forensic examiners have enough detail to perform a rigorous scientific examination while also shielding them from potentially biasing contextual information.

Thompson Explored Three Potential Solutions:

  • The Case Manager Model:
    • Functions in the laboratory are separated between case managers and examiners. Case managers are fully informed about context, while examiners are provided with only the information needed for specific analytical tasks.
  • Sequential Unmasking:
    • Ordering analytic tasks so that examiners make key analytical judgments before being exposed to potentially biasing information.
  • Blind Re-Examination:
    • Key judgments of an initial non-blind examiner are replicated by a second examiner who has not been exposed to potentially biasing information.

Learn more about these strategies and the role contextual information plays in an examiner’s decision-making process in “Developing Effective Methods for Addressing Contextual Bias in Forensic Science.”

For an inside look at bias, review our interview with CSAFE psychologists. The Innocence Project is also tackling the effects of bias in forensic science. Find more details in our guest post.

The Forensic Community Looks Forward to Faster, More Efficient Standards Evaluation in OSAC 2.0


The National Institute of Standards and Technology’s (NIST) Organization of Scientific Area Committees (OSAC) for Forensic Science is implementing new strategies to streamline the forensic standards evaluation process.

In September 2019, OSAC announced new measures to improve efficiency when evaluating standards, including reorganizing its committees and inviting public comment earlier in the process.

OSAC is the nation’s primary professional group dedicated to advancing technically sound standards in forensic science. Its members serve in forensic laboratories and other institutions around the country, contributing expertise in 25 forensic science disciplines, as well as scientific research, measurement science, statistics, law and policy. The transparent, consensus-based process helps define minimum requirements, best practices, standard protocols, and more to help ensure that the results of forensic analysis are reliable and reproducible.

Learn more about updates to organizational structure, membership and the standards approval process at OSAC, and explore the latest standards news on the OSAC website.

Proposed Legislation for Forensic Algorithms Aligns with CSAFE Mission of Transparency

How fair are forensic algorithms? These tools can help match evidence such as fingerprints or the marks left by a gun barrel, but without transparent access to the source code behind the software, concerns arise.

U.S. lawmaker Mark Takano introduced new legislation in September 2019 aimed at providing defendants facing federal criminal charges equal access to forensic algorithms. The bill also requires the makers of computational forensic software to meet minimum standards set forth by NIST.

Takano calls on NIST to test forensic software, gauge its limitations, assess what the science says about the underlying data and explain how these algorithms work. He highlights how critical it is that technology companies are transparent about their algorithms’ testing data and potential error rates. “Intellectual property rights should not be able to trump due process,” Takano said.

The CSAFE mission directly aligns with this new legislation. Our team is committed to open-source, repeatable and reproducible research with publicly available data.

“We believe in complete openness and transparency. CSAFE wants to have open data so other people can have access to the same kind of information. We are implementing new features so they are accessible to everyone, not just a selected group,” CSAFE researcher Heike Hofmann said.

Visit the CSAFE Data Portal to access datasets in a variety of forensic disciplines. CSAFE automatic matching algorithms are also freely available to the public on the CSAFE Tools page.

Review more details of Takano’s bill in Science Magazine.


Fixing the Field of Forensics: The Washington Post Asks the Experts

Since the landmark National Academy of Sciences report was published a decade ago, more people have been willing to raise questions about the reliability of forensics in the courtroom.

Washington Post contributor Radley Balko dug deeper into these concerns in a six-part series, interviewing a panel of experts in law, science and forensics. CSAFE researchers Brandon Garrett and Simon Cole were among the contributors.

What are these issues? A few include subjectivity in evidence analysis procedures, lack of standards for methods and cognitive bias.

Feasible solutions that fit within the context of the U.S. criminal justice system are not easy to find, but Balko explores new ideas.

He asked the following six questions of 14 panelists:

  1. Who should determine what expertise a jury will and won’t be allowed to hear at trial? We need some way of assessing the reliability of scientific and expert testimony. What would the ideal system look like?
  2. What, other than single-source DNA testing, can be used in a criminal trial? Are critics of modern forensics saying that other fields don’t have value in front of a jury? How do we ensure that juries are accurately accounting for the shortcomings in these fields?
  3. How do we ensure that the justice system operates on reliable information? Is it even possible to “fact check” our courts in a way that enforces accountability, or are we simply stuck hoping that appeals court judges will admit and correct their mistakes?
  4. Do you agree that the qualities and characteristics of a good scientist are contradictory to, or even incompatible with, the sorts of experts that juries tend to find persuasive? If so, what can be done to address this problem?
  5. How much interaction between law enforcement and a forensic analyst is appropriate, and what safeguards can be put in place to minimize cognitive bias?
  6. Can you suggest three reforms that would improve the quality of expert testimony in criminal cases?

Learn what the experts have to say by heading over to the Washington Post. Discover how CSAFE is working directly to solve these problems by exploring our research and training initiatives.

Countering Bad Science With Open Access

An international group of researchers investigated the openness of forensic science research by taking a closer look at 30 forensic science journals. What they found is concerning.

In a new paper published in the Journal of Law and the Biosciences, Dr. Jason Chin, a law and psychology lecturer at Sydney Law School, and his colleagues discovered that much of forensic science research operates behind closed doors, making verification of published results nearly impossible.

Researchers found that many journals do not require authors to post their data online for others to scrutinize. Despite research safeguards such as validation testing and blinding, individual biases can still taint published findings.

Yet prosecutors often rely on these results, and the accused frequently lacks an equal opportunity to review the information. Without open access, criminal verdicts are vulnerable to distortion. For example, The Innocence Project found that nearly half of the wrongful convictions in the U.S. overturned by DNA evidence involved invalidated or improper forensic science.

Chin states, “Openness is one fix for this. Removing journal paywalls, for example, can prompt more widespread verification of results.”

CSAFE echoes Chin’s suggestions. “Open source data for the forensic community is important because it allows solutions that use the data to be benchmarked by not only the team providing the solution but other community members. This enables the community to benefit from thorough testing, to find weaknesses and strengths of the tools and also of the data itself,” CSAFE researcher Jennifer Newman said.

Working to counteract the roadblocks to open-source data in forensic science, CSAFE released a Data Portal, providing public access to forensic science datasets for anyone to use in their own analyses. Read more in our recent news story, and learn about newly proposed U.S. legislation to increase transparency in forensic science.

Access the full paper by Dr. Chin: https://academic.oup.com/jlb/advance-

How Can Cognitive Bias Impact Forensic Evaluations? New Innocence Project Review Article Takes a Closer Look


This is a guest post from Innocence Project researchers.

Can cognitive biases, which are a common feature of human decision-making, affect the outcome of forensic evidence analysis? An April 2019 review article by researchers from the Innocence Project shows that yes, even well-trained and experienced forensic scientists may be susceptible to confirmation bias.

“Confirmation bias” is the tendency we all have to look for and remember information that matches our initial impressions or beliefs and to discount contradictory information. “Cognitive bias research in forensic science: A systematic review” by Glinda Cooper and Vanessa Meterko examines confirmation bias in the context of the evaluation of forensic evidence.

The review, published in Forensic Science International, encompasses 29 studies covering 14 different forensic disciplines. These studies explored, for example, whether case information irrelevant to the forensic testing influenced analysts’ conclusions. For instance, one study asked whether the type of clothes found with a skeleton could affect forensic anthropologists’ conclusions regarding the sex of the skeleton based on their analysis of the bones.

Other studies examined the process used to choose samples for comparison, like whether a crime scene hair sample is compared to hair from a single suspect or to a “line-up” of samples from several people, similar to a line-up used for eyewitness identification.

Study results indicate that laboratories can take preventative steps to avoid situations that can make analysts vulnerable to confirmation bias. Limiting access to unnecessary information, such as whether a suspect confessed, is one such strategy. Other measures include using multiple comparison samples, and blinding analysts to any previous evidence evaluation results.

One of the key recommendations of the 2009 National Academy of Sciences report, Strengthening Forensic Science in the United States: A Path Forward, was to encourage research on human observer bias and human error. Researchers have heeded this recommendation, as evidenced by this review article ranking among the journal’s most downloaded articles.

CSAFE researchers are also investigating techniques to limit the impact of human factors, and distinguish between task-relevant and irrelevant information for forensic scientists. We look forward to collaborating with the forensics community as we work together to promote increased accuracy in forensic evidence analysis.

Ensuring Accurate Analysis of Biological Evidence: The NIST Human DNA Standard


DNA profiles can play an important role in criminal investigations. However, ensuring accuracy when developing genetic profiles is key.

In July 2019, NIST researchers released the latest version of a human DNA standard to help crime labs get the technique right. With this standard, labs get not only the DNA itself but also an accurate DNA profile for comparison.

DNA labs use the NIST standard to double-check their instruments and methods. The goal of using a standard is to promote quality control and prevent wrongful convictions due to faulty processes. Researchers explain that, this way, if questions come up in court, experts can say, “We’ve properly calibrated our instruments using the NIST standard.”

NIST forensic DNA scientist Becky Steffen takes a closer look at the human DNA standard in her Q&A article.

CSAFE researchers utilize other NIST standards such as the standard bullet to calibrate our instruments and promote accuracy. Learn more about how CSAFE uses the NIST standard bullet in our firearms and ballistics analysis.