Beyond the OMG Era: The Assessment of Forensic Genetic Genealogy as a Practical Investigative Tool

Forensic Genetic Genealogy (FGG) has been used over the last four years to solve hundreds of cold cases, many dating back decades. Each day the media reports yet another violent crime solved or a set of unidentified remains identified through FGG. As more cases reach successful resolution, the capabilities of FGG are becoming better characterized, tempering expectations and reducing the risk of overusing a technique once regarded as a miracle cure for the common cold case.

The first agencies to follow up on the initial success of FGG were those willing to risk time and money on a brand-new investigative technique that was clearly a game-changer but whose probability of success had not yet been established. As the catalog of FGG cases has expanded into the hundreds, however, a great deal more has been learned about why certain cases succeed, how long they take to solve, and why many cases remain intractable. While the demands that FGG processing places on the quantity and quality of a DNA sample can be studied parametrically and are usually screened with a pre-processing quality check, much of the uncertainty about a case’s solvability depends on whether the violent offender or the unidentified individual has relatives represented in genetic genealogy databases. This cannot be known a priori; however, general predictions about a case’s solvability can be made through statistical analyses of solve rates and solve times for cases that have been cleared, and by studying the characteristics of “FGG cold cases” that have yet to be resolved.
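The abstract does not specify the statistical method behind these solvability predictions. One common way to reason about solve times when many cases are still open is a survival analysis that treats unsolved cases as censored observations. The sketch below is purely illustrative, not the speaker's method: it applies a basic Kaplan-Meier estimator to a hypothetical case log, and the CaseRecord fields and all numbers are invented assumptions rather than data from the talk.

```python
# Illustrative sketch only: estimate the probability that an FGG case remains
# unsolved after a given number of months of genealogy work, treating
# still-open cases as right-censored. All data below are hypothetical.
from dataclasses import dataclass

@dataclass
class CaseRecord:
    months: float   # months of genealogy effort observed so far
    solved: bool    # True if resolved, False if still open (censored)

def kaplan_meier(cases: list[CaseRecord]) -> list[tuple[float, float]]:
    """Return (months, estimated probability a case is still unsolved) points."""
    survival = 1.0
    curve = []
    # Event times: months at which at least one case in the log was solved.
    for t in sorted({c.months for c in cases if c.solved}):
        at_risk = sum(1 for c in cases if c.months >= t)        # still being worked at t
        solved_at_t = sum(1 for c in cases if c.solved and c.months == t)
        survival *= 1.0 - solved_at_t / at_risk
        curve.append((t, survival))
    return curve

if __name__ == "__main__":
    # Hypothetical log: four solved cases and two still-open (censored) cases.
    log = [
        CaseRecord(1.0, True), CaseRecord(2.5, True), CaseRecord(6.0, True),
        CaseRecord(14.0, True), CaseRecord(9.0, False), CaseRecord(24.0, False),
    ]
    for months, p_unsolved in kaplan_meier(log):
        print(f"after {months:>4.1f} months: {p_unsolved:.2f} of cases still unsolved")
```

A real analysis would add stratification (for example, by database composition or ancestry of the closest matches) and confidence intervals; the point here is only to illustrate how censored solve-time data can be summarized.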

Such analysis is important for supporting both domestic and international casework. For example, developing more streamlined, and therefore more efficient, methods for handling cases with high levels of endogamy in the US could lead to best practices for using FGG on mass graves, where the surviving population and the victims exhibit high rates of intermarriage. The same can be said for developing policies on target testing of individuals who are not knowledgeable about DNA.

This talk will present statistics on FGG solve rates, solve times, and other performance metrics as a function of database composition, which can lead to more cost-effective and efficient use of FGG. It will provide insight into why and how cases have or have not been solved using FGG. The talk will end with suggestions for how information revealed by even the most challenging FGG cases can provide investigative leads valuable to conventional investigations.

Brought to you by

Worldwide Association of Women Forensic Experts
