All human activities carry a risk of error, and handwriting examination is no exception. To reduce errors in this field, NIST convened the Expert Working Group for Human Factors in Handwriting Examination. This expert panel, sponsored by NIJ and NIST, examined strategies to improve handwriting evaluation methods and to outline best practices.
The Group produced a new report, Forensic Handwriting Examination and Human Factors: Improving the Practice Through a Systems Approach. The document takes a closer look at how human factors affect all aspects of handwriting examination, from documenting discriminating features to reporting results and testifying in court.
In the report, you’ll also find a discussion of education, training, certification, and the role of quality assurance, quality control, and management in reducing errors.
CSAFE Resources for Improving Handwriting Evaluation
CSAFE researchers are also working to improve objectivity and reduce errors in handwriting analysis. Our work aims to rigorously assess the role of complexity in signature analysis and to relate complexity to examiner performance. We are also developing open-source software and publicly available statistical algorithms for writing comparison to help handwriting examiners integrate quantitative approaches into their work.
The CSAFE Handwriting Database is an interactive, public database designed for the development of statistical approaches to forensic handwriting evaluations.
CSAFE automatic matching algorithms provide objective and reproducible scores as a foundation for a fair judicial process. These algorithms are distributed as an R package that provides functions for identifying letters and extracting features from handwritten documents.
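To make the score-based comparison idea concrete, here is a minimal, hypothetical Python sketch of the general workflow: reduce each writing sample to a small feature vector, then compute a single similarity score. The feature choices, function names, and scoring rule are illustrative assumptions only and are not the CSAFE R package's actual implementation.

```python
# Hypothetical sketch of score-based handwriting comparison.
# Features and scoring rule are illustrative assumptions, not the
# CSAFE R package's actual method.
import numpy as np

def extract_features(mask: np.ndarray) -> np.ndarray:
    """Reduce a binary writing mask (1 = ink) to a small feature vector."""
    ys, xs = np.nonzero(mask)
    height = ys.max() - ys.min() + 1
    width = xs.max() - xs.min() + 1
    ink_density = mask.sum() / mask.size          # fraction of inked pixels
    aspect_ratio = width / height                 # bounding-box shape
    vertical_center = ys.mean() / mask.shape[0]   # where the ink sits vertically
    return np.array([ink_density, aspect_ratio, vertical_center])

def similarity_score(sample_a: np.ndarray, sample_b: np.ndarray) -> float:
    """Cosine similarity between feature vectors (1.0 = identical direction)."""
    fa, fb = extract_features(sample_a), extract_features(sample_b)
    return float(fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-ins for scanned writing samples: random binary ink masks.
    doc1 = (rng.random((200, 600)) < 0.05).astype(np.uint8)
    doc2 = (rng.random((200, 600)) < 0.07).astype(np.uint8)
    print(f"similarity score: {similarity_score(doc1, doc2):.3f}")
```

Because the score is computed the same way every time, two examiners running the comparison on the same documents get the same number, which is what makes this kind of output reproducible.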