Proposed Legislation for Forensic Algorithms Aligns with CSAFE Mission of Transparency

How fair are forensic algorithms? These tools can help match evidence such as fingerprints or the marks a gun barrel leaves on a bullet, but without transparent access to the source code behind the software, concerns about fairness arise.

U.S. Representative Mark Takano introduced new legislation in September 2019 aimed at providing defendants facing federal criminal charges equal access to forensic algorithms. The bill would also require makers of computational forensic software to meet minimum standards set by the National Institute of Standards and Technology (NIST).

Takano calls on NIST to test forensic software: to gauge its limitations, assess what the science says about the underlying data, and explain how these algorithms work. He highlights how critical it is that technology companies be transparent about their algorithms' testing data and potential error rates. “Intellectual property rights should not be able to trump due process,” Takano said.

The CSAFE mission directly aligns with this new legislation. Our team is committed to open-source, repeatable and reproducible research with publicly available data.

“We believe in complete openness and transparency. CSAFE wants to have open data so other people can have access to the same kind of information. We are implementing new features so they are accessible to everyone, not just a selected group,” CSAFE researcher Heike Hofmann said.

Visit the CSAFE Data Portal to access datasets in a variety of forensic disciplines. CSAFE automatic matching algorithms are also freely available to the public on the CSAFE Tools page.

Review more details of Takano’s bill in Science Magazine.


Fixing the Field of Forensics: The Washington Post Asks the Experts

Since the landmark National Academy of Sciences report was published a decade ago, more people have been willing to raise questions about the problems with forensics in the courtroom.

Washington Post contributor Radley Balko dug deeper into these concerns in a six-part series, interviewing a panel of experts in law, science and forensics. CSAFE researchers Brandon Garrett and Simon Cole were among the contributors.

What are these issues? A few include subjectivity in evidence analysis procedures, a lack of standards for methods and cognitive bias.

Feasible solutions that fit within the context of the U.S. criminal justice system are not easy to find, but Balko explores new ideas.

He asked the following six questions of 14 panelists:

  1. Who should determine what expertise a jury will and won’t be allowed to hear at trial? We need some way of assessing the reliability of scientific and expert testimony. What would the ideal system look like?

  2. What, other than single-source DNA testing, can be used in a criminal trial? Are critics of modern forensics saying that other fields don’t have value in front of a jury? How do we ensure that juries are accurately accounting for the shortcomings in these fields?

  3. How do we ensure that the justice system operates on reliable information? Is it even possible to “fact check” our courts in a way that enforces accountability, or are we simply stuck hoping that appeals court judges will admit and correct their mistakes?

  4. Do you agree that the qualities and characteristics of a good scientist are contradictory to, or even incompatible with, the sorts of experts that juries tend to find persuasive? If so, what can be done to address this problem?

  5. How much interaction between law enforcement and a forensic analyst is appropriate, and what safeguards can be put in place to minimize cognitive bias?

  6. Can you suggest three reforms that would improve the quality of expert testimony in criminal cases?

Learn what the experts have to say by heading over to the Washington Post. Discover how CSAFE is working directly to solve these problems by exploring our research and training initiatives.

Insights: What do Forensic Analysts Consider Relevant to their Decision Making?


OVERVIEW

Forensic analysts make judgments that can play a crucial role in criminal investigations, so it is important that their decisions be as objective as possible. However, analysts often receive information that may not be relevant to their work and can subconsciously bias their analyses.
Researchers surveyed analysts from multiple forensic disciplines to learn what information they consider relevant to their tasks.

Lead Researchers

Brett O. Gardner
Sharon Kelley
Daniel C. Murrie
Itiel E. Dror

Journal

Science & Justice

Publication Date

September 2019

Publication Number

IN 101 IMPL

Goals

1. Discover what information analysts consider relevant.

2. Evaluate whether there is a general consensus across disciplines.

3. Determine whether these opinions match the National Commission on Forensic Science’s definition of task-relevance.

The Study

The National Commission on Forensic Science (NCFS) defines task-relevant information as:

“Necessary for drawing conclusions: 1) about the propositions in question, 2) from the physical evidence that has been designated for examination, [and] 3) through the correct application of an accepted analytic method by a competent analyst.”

The team surveyed 183 forensic analysts across four primary forensic disciplines: biology, pattern evidence, chemistry, and crime scene investigation. The survey presented 16 different types of information regarding a case, suspect, or victim.

The analysts rated the importance of each type of information to their specific tasks, labeling each as one of the following:

Essential

Irrelevant

Would Review If Available

Results

1. Across the four forensic science disciplines and 16 types of information (64 total ratings of task-relevance), analysts reached 100% consensus only three times. In fact, for 45 of the 64 items, analysts’ opinions directly contradicted one another.

2. However, for 36 of the ratings, analysts reached a near-consensus, with more than 75% in agreement. Pattern evidence analysts had the highest rate of consensus, and crime scene investigators had the most disagreement.

3. Most analysts, apart from crime scene investigators, agreed that personal information regarding a suspect or victim was irrelevant to their tasks. This is consistent with the NCFS’s guidelines for task-relevance.

4. The opinions of crime scene investigators were distinct from those of the other disciplines, as their task is to gather information rather than analyze it.

Focus on the Future

While the survey captures which types of information analysts consider relevant, it does not explain why they made these judgments.

It is important to remember that people do not always know the full reasoning behind their decision making.

Even within the same forensic disciplines, different laboratories may not have the same guidelines for what they consider relevant.

The forensic disciplines must reach a general consensus on what information is task-relevant.

Countering Bad Science With Open Access

An international group of researchers investigated the openness of forensic science research by taking a closer look at 30 forensic science journals. What they found is concerning.

In a new paper published in the Journal of Law and the Biosciences, Dr. Jason Chin, law and psychology lecturer at Sydney Law School, and his colleagues discovered that much of forensic science research operates behind closed doors, making verification of published results nearly impossible.

The researchers found that many journals do not require authors to post their data online for others to scrutinize. And despite research-protocol safeguards such as validation testing and blinding, individual biases can still taint research findings.

Yet prosecutors often rely on these results, and the accused frequently lacks an equal opportunity to review the information. Without open access, criminal verdicts are vulnerable to distortion. For example, the Innocence Project found that nearly half of the wrongful convictions in the U.S. overturned by DNA evidence involved invalidated or improper forensic science.

Chin states, “Openness is one fix for this. Removing journal paywalls, for example, can prompt more widespread verification of results.”

CSAFE echoes Chin’s recommendations. “Open source data for the forensic community is important because it allows solutions that use the data to be benchmarked by not only the team providing the solution but other community members. This enables the community to benefit from thorough testing, to find weaknesses and strengths of the tools and also of the data itself,” CSAFE researcher Jennifer Newman said.

Working to counteract the roadblocks to open-source data in forensic science, CSAFE released a Data Portal, providing public access to forensic science datasets for anyone to use in their own analysis techniques. Read more in our recent news story, and learn about newly proposed U.S. legislation to increase transparency in forensic science.

Access the full paper by Dr. Chin: https://academic.oup.com/jlb/advance-