In this insightful panel discussion at ISHI 34, experts Amy McGuire, Bruce Budowle, and David Gurney delve into the complex and evolving intersection of genetic privacy and crime solving.
Bruce, with over 40 years of experience including a significant tenure at the FBI, shares his insights on the crucial need to regulate the use of genetic technology in solving serious crimes.
Amy, the Director of the Center for Medical Ethics and Health Policy, brings to the table two decades of experience in biomedical research. She emphasizes the importance of public trust and ethical considerations in the use of consumer genetics for crime solving.
David, an assistant professor, provides a legal perspective, focusing on state laws and the necessity of limiting investigative genetic genealogy to individuals who have opted in.
This panel offers a comprehensive view of the challenges and potential of genetic technology in forensic science, underscoring the balance between technological advancements and the imperative of protecting genetic privacy. Join us for a deep dive into how these experts envision the future of genetics in law enforcement and the ethical implications that come with it.
Laura: Well, welcome, Amy. Bruce. David, thank you so much for joining us. We’re here at ISHI 34. This is our annual video series. You presented on a panel for us on genetic privacy. Let’s get this right, exactly. Maintaining genetic privacy while solving crimes. We really appreciated it. So nice to have you here. Before we get into details about your panel, why don’t we each go around the room and say a little bit about your background? Bruce, you want to start?
Bruce: Well, I’m going to start. Okay. You’re close. Okay, well, I’ve been in this field for over 40 years and was 26 years at the FBI. After I was paroled from the federal government, I went to the University of North Texas Health Science Center and ran the program there as the Director of the Center for Human Identification. About a year and a half or so ago, I decided to retire from that. And now my daughter tells me I don’t understand the concept of retirement, which is true. So, I’m building capacity in Africa right now and working on forensic genetic genealogy as well.
Laura: Fantastic. And, Amy, how about you?
Amy: So, I’m the director of the Center for Medical Ethics and Health Policy at Baylor College of Medicine, and I’ve been conducting research for about 20 years on emerging technologies and novel therapeutics in biomedicine. Much of my research is focused in the area of genetics, and I’ve done a lot of work in the area of direct-to-consumer genetics. So, I got very interested in the use of genetic technologies to help solve crimes about 5 or 6 years ago, when the use of investigative genetic genealogy kind of brought together the world of forensic science and direct-to-consumer genetic databases. And so that’s the research that we’ve been doing. And we have some funding to do some empirical research looking at perspectives and policies around investigative genetic genealogy.
Laura: Fantastic. Thank you.
David: Hi. So, my name is David Gurney, and I’m an assistant professor of law and society at Ramapo College. And I’m also the director of the Investigative Genetic Genealogy Center there. And we sort of have three goals at the center. We work on education. We have educational programs for people who want to learn IGG. We do casework entirely pro bono. And we’re really pushing towards a focus on wrongful convictions for our casework. And then we do some research and expansion of IGG as well.
Laura: Fantastic. So, Amy, you were the moderator. Maybe you can tell us a little bit about the panel and the topics and why they were chosen?
Amy: So, we wanted to do a panel that was focused on the forensic use of DNA in a variety of different circumstances, some of the privacy issues that that raises, and then how we can address some of those privacy issues. Our first panelist really kind of gave us a background of how DNA is used in investigative techniques. And then we had Bruce talking about informed consent as one of the levers we can use to try to address privacy considerations or concerns. David talked about a second lever, which is regulation, specifically state law and how that’s being used to address privacy issues. And then our last panelist, Christie Guarini, was reporting data from our study, looking at stakeholders’ and members of the public’s views about genetic privacy, particularly as it relates to investigative genetic genealogy.
Laura: And I’m sure that was fascinating. Yeah, I’d be very interested to see that study at some point. Your interests lie in ethics and the legal implications of human genetics. How has this evolved over the years? And can you tell us a little bit more about that?
Amy: Yeah, absolutely. So, over the last, you know, 20 years, the use of genetics has become so much more ubiquitous, right? As we’ve completed the Human Genome Project, the technology’s advanced so, so dramatically. And there’s been so many additional ways to use genetic information, some of which is good and some of which is not so good, right? And so, we’ve seen a lot of evolution, both in the ways in which genetic technologies are being used by various actors and in the policies being enacted to continually try to address issues around genetic privacy, as well as public attitudes about genetics and how we think about genetics and, you know, the personal nature of our genome and how it’s used. And I think, especially in this field, the recognition of just how much we can use our own genetic information to link us to genetic relatives has really been kind of game-changing in how we think about our identity as human beings, our connection to each other, those sorts of things.
Laura: It’s really remarkable watching the changes that have happened just over the last couple of years and what’s to come. Maybe this is asking too much, but could you paraphrase what the study found, very simply, about how people feel about it?
Amy: So there’s various components to the study that we’re currently working on, but in terms of the general public, we just completed a survey and we’re still in the process of sort of doing some of the analysis. Some of the key takeaways are that, I think, not surprisingly, there seems to be general support for the use of investigative genetic genealogy for a variety of use cases. But there is concern about genetic privacy, and there’s a desire for sort of guardrails to be put in place. And so, I think the real work is now figuring out what are the appropriate guardrails and how do we enact those?
Laura: Absolutely. Well, that segues nicely, David, you were talking about IGG, and there are definitely some huge benefits and also some concerns out there. It’s certainly been in the media almost constantly. So maybe you can tell us a little bit about what you spoke about and how those concerns might be addressed too.
David: Yeah. So, I was really looking at the sort of legal landscape. So how have state legislatures tried to regulate the practice of IGG? Have they tried to regulate it? What are the differences between the states? I talked a little bit about the four primary laws that exist right now that directly regulate the practice of IGG. Those are in Montana, Maryland, Utah, and Florida. And I talked about how really when we look at the legal landscape right now in some states, they seem to be focused on protecting the privacy not only of genetic relatives (people who are in the databases), but the suspect in a criminal case themselves. So, the person whose DNA is found at the crime scene. But other states are more interested in protecting just the rights of the relatives of that individual. So, there’s some discrepancies there. There’s differences in what states are interested in protecting. And I also talked about how there are some concerns with some of these laws. So one of the questions that Amy asked at the end, actually, I think is significant.
So, in Maryland, you have to have a warrant to start IGG in the first place. In Utah you don’t. Right? So, there are restrictions on how you can conduct IGG in Utah, but you don’t need a warrant. And I think one of the things that people think about when they consider, well, what’s the strongest way to protect people’s privacy? We think if there’s a warrant that does it. But really, when you’re talking about a warrant, when police try to get a warrant, they’re going to try to get as much information as possible. And I think there’s a risk that if a warrant is required for this, police are going to use that to try to access everyone in the genetic genealogy databases, even those people who are not opted in. Whereas if we had state laws that simply restrict the use of the technique to opted ins and have other provisions that restrict IGG practitioners, I think that’s a better framework. So, I’m curious to see how additional states are going to move forward with their laws.
Laura: That definitely adds a lot of complexity with the state-by-state regulation. What do you think the average person should consider as they’re, you know, deciding whether they want to opt in?
David: I mean, I think they should… I mean, that’s a really good question. I think that the benefits in opting in (I’m just going to give you my view here) are that you can help families whose relatives are missing or unidentified. You can help those people find closure. And you can also help bring really serious violent criminals to justice. We’re not talking here about people who shoplifted from Walmart. We’re not even talking about people who committed minor felonies. We’re talking about people who are violent offenders who commit horrible murders, rapes, and in many cases, have gotten away with this for decades. I mean, the Golden State Killer, of course, is the poster child for this. You know, it’s so good, I think, that he was the first one to be caught, because it showed the power of this. And as long as it remains limited to that category of crime, I think people don’t have anything to worry about. That (me personally) is the thing I think needs to be protected the most: that we restrict this technique only to those very serious crimes. And everyone is on board with that, as far as I know, everyone in the community and the public in general. So, I think people need to be aware of that. They need to feel comfortable that when they’re submitting their DNA, when they’re opting in, that that’s what it’s going to be used for. If you are deeply concerned that your DNA, and not even your DNA, but your genetic profile, which contains relatively limited amounts of information, is somehow going to be misused in whatever way once it’s submitted to these databases, then you shouldn’t do it.
But I think you should think really hard: what am I actually worried about? What am I actually concerned is going to happen to my genetic profile? Because I think when you do that, you will find that the concerns are maybe less than your initial reaction. I know that was true for me. When I got into this, my thought was, wow, this is problematic. This data could be used in all kinds of ways. But then as I spoke to more people, as I thought through it, the list of nefarious ways it could be used narrowed down to almost nothing in my mind. I know not everyone agrees with that or shares that view, but I would just encourage people to think about that. What are the concerns you really have, and how do they weigh against the benefits? Because really, this is important to say: one person can help solve a case. There are cases that languish for months or years because the matches are not good matches. We had a case like this, and one person takes a DNA test over the weekend. Solves the case. All right. So, you alone can make a difference.
Laura: I mean, it is remarkable. Every day now, we’re seeing cold cases that have languished for decades, as you said, being solved. And for that person who is concerned, what can law enforcement see? I mean, I like the way that you framed it, you know, that it’s for a certain level of case. But if we got into a bit more detail, is there a succinct way to explain, you know, what they are able to access?
David: So I think there’s a distinction between what the lab sees, what law enforcement sees, and what the IGG practitioner sees, assuming those are all separate entities. At Ramapo, when we’re working cases, what we see is the final upload to FamilyTreeDNA and GEDmatch. Right. So, we’re not involved in processing the DNA. We’re not involved in even the upload itself. We just see the match list. That’s all we have access to. Now, the data that is uploaded to those databases, in theory can be downloaded, and you can get sort of the raw file that includes the actual SNPs and how you code for those SNPs. So, I am a very big advocate of laws that restrict that, or at least industry practices that restrict that. My communication with people who do this work is that nobody is accessing the raw file, at least nobody is intentionally doing that. They’re not downloading it. I would like to see the databases themselves restrict that use for law enforcement practices so that IGG practitioners, if you had some rogue practitioner, they wouldn’t be able to do it even if they wanted to.
Laura: That makes a lot of sense. It probably would offer a measure of comfort to people who want to opt in. You did a great job of summarizing what’s happening on a state level, what’s happening outside of the US? What direction is that taking if there’s anything you can speak to there.
David: So, the US has certainly been the most proactive in this area. There are other countries that are working on regulation; Australia in particular is working really hard on this. Great Britain is trying to do this as well. Each country has its own challenges, and I think it’s going to be really interesting to see the differences that come to exist between the use of IGG in the United States and other countries. There will be some countries, France, for example, that I would predict will never allow the use of IGG in any case, because they’re so protective of genetic information in that country. But there will always be pressure on that, right? Because when you see more and more cases being solved, and there are, of course, unsolved cold cases in France, perhaps that will exert pressure on them. But, you know, it’s going to be up to those individual countries and the sort of propensities of the people who live in those countries; what do they value? How do they make that balance?
Laura: Thank you. That’s a great answer to, I think, a question that a lot of people are asking. And it’s hard to have that global picture when we’re very, you know, focused on what’s happening here now. Bruce, you spoke from a different perspective, talking about consent. Maybe you can tell us a little bit about that?
Bruce: Well, I think it builds on both what Amy and David said. Amy’s talking about developing guardrails; that means understanding what the risks may be. And so with the people we deal with, whether it’s the targeted people in IGG, or the analyst in the lab who donates their sample for studies, or whatever it may be, missing persons, reference samples, we have a responsibility to respect them, to explain things to them, and to ensure that they’re informed so they understand the risks. We think about benefits, but we often don’t think about risks. And if we think about the risks, we can build the guardrails Amy talked about to help us do better. If we understand them and then convey them, we get more feedback as well, and then I think it smooths over. Now, David doesn’t see many risks to this. There are people who are going to see risk in it, because of the world we live in; there are people on one side or the other. But education is always an important point, because I remember in the 1970s there was this invention that caused a public outcry and was called an abomination against God. And you know what it was? It was in vitro fertilization.
And today that’s just dinner conversation. So, educating people matters, so they can understand it, seeing it in action and seeing the benefits. There were some risks in the early days; most of them were perceived and not realized. These are the kinds of things we want to do here. And, you know, we have to look at perspectives. I actually take a different perspective than David did in one regard: we have parallels in CODIS and FIGG, although they’re managed by differing groups. When we started CODIS, it was violent crimes: sexual assaults, murders. Why was that? Because that’s what people would support no matter what. The other crimes were not there. But then people found out that 40, 50, 60% of the people who committed these violent crimes started at lesser crimes. So, if you really want to address violent crime, I don’t agree that you should only look at violent crimes now, because we have knowledge and history that people start at lesser crimes, and we really want to get them off the streets. We should be investing in this as early as possible as a viable tool, either as an adjunct or to supplant what we have now, if it meets the kinds of qualifications we would want for a tool in our laboratory system.
Laura: I really love a couple of things that you were saying. I mean, any game-changing technology, we could say the same thing about the conversations happening around AI right now. With anything, there’s going to be a lot of fear at first, and now in vitro is just common practice for everybody. What do you think in terms of privacy and opting in? What’s missing from consent practices right now that you would like to see?
Bruce: Well, I’m a strong advocate of opt in, and opt ins should be the only acceptable approach, and I think people should document that they understand what they’re opting in for. We have two database systems: one has an opt in, one an opt out. With an opt out, someone could have been in that database 15 years ago, could be deceased, and we don’t know. They certainly can’t make a decision at that point. Or they may have, you know, lost interest and they’re not even following up on it, yet they’re being used. So always informing people and letting them make a decision specific to the task at hand, or the purpose, is a much better way to go. And then we just have to ensure that we have… You know, it’s sort of like the days when recombinant DNA work first started. We couldn’t control it, but we developed ways to teach people what was acceptable behavior or practices. And I think it’s the same thing here. It’s not enough just to have regulations. We need to educate people to think about what is appropriate in this arena, as opposed to just trying to solve the case, because we have cases that don’t go to court. Some cases get thrown out of court, not because of guilt or innocence, but because of due process. These are values in our world. And so it’s really important that we educate all the players in here. And some of these players are very recent to the forensic and litigation world, or the judicial world, and so they may not be well prepared to appreciate the ramifications of what they do. So, we probably could do more in that way as well to improve the processes.
Laura: Now, what happens if consent practices are changed? Are there implications, and what are they, for past research, past cases, past anything?
Bruce: I think that’s a really tough question, because when we look at some things, there’s still this proportionality of benefit. There have been some samples collected prior to something going in place, and some have been removed from whatever they were collected for, and others where they say, well, there’s still this great benefit that outweighs the risk, or the infringement, we’ll say in that case. So, it’s sort of a case-by-case basis in some of these. But generally, I think the better approach is to try to look ahead, to try to at least predict what some of them might be. And I think we have good knowledge today of what those would be. We won’t get it all right, because things change. But just thinking about it, I think, can put a lot in place. If you ask people, I want to use this for this purpose, for FIGG, and they sign off on it, I think then you’re really in a safe place. Attitudes may change, but as long as they’ve signed off on it… You may want to revisit it some day, or not, and that’s another question we’ll have to deal with some time in the future.
Laura: Okay, well, along those lines, here’s a question for everybody, because we don’t want to miss anything. How do you see this area being explored in the future? Certainly, you’ve touched on some of the things, but we could go around and hear your best guesses, speculative as they are, on where this is going over the next couple of years.
Bruce: We have multiple things here. One is the technology of using DNA tools to get more information. I think that’s a positive thing for the forensic field. Being able to get more information from less sample, or from degraded samples, is fundamental for us to deal with the most challenging samples or evidence, as the case may be. We also have to have better, again, good governance and accountability, and I think that will be framed out. Some things have happened recently, and so people are bent and broken, but sometimes when you’re bent and broken, you get to really reshape better, because you’ve learned from that. So, I think we’re going to see better thinking about it and better conditions.
I expect that there’s going to be… To me, this is a real shift, and we’re going to see more of it. Because what we have is a world where, if we talk to people in CODIS, and I helped start CODIS, so I’m a proponent of it, they say, well, we can’t change anything because we’ve got 20 million profiles in it. That’s a very naive way to think, because most of those profiles are depreciating; as you get older, you don’t commit crimes the way you did when you were younger. So those you put in in 1998 aren’t the hits of 2023. We’re investing in something where most of the profiles are of no value. But the question is, well, 20 years from now there’s still going to be crime. Do you want to use a 40-year-old technology, or do you want these advancements? So, we have to start thinking about what the best tools are as we go forward, even though there are all these profiles from the past. It may not be so painful to move ahead.
Laura: Thank you.
Amy: So I’ll go next. I would say I’m not entirely a technological determinist, but I think Bruce makes a really good point that if we look at all past technologies, they usually come with a lot of trepidation in the beginning. And I think we’re in that phase, and for good reason, right? Because you want to be able to think clearly about what the parameters around this should be and what protections need to be in place for this to be implemented appropriately. So, I think we’re in that phase now with investigative genetic genealogy. But we were talking earlier about the collection of fingerprints for human identification. That used to be extremely controversial. Some people still find it controversial, but for the most part, it’s accepted, and it’s not considered, you know, that outrageous, along with tons of other types of technologies. You mentioned IVF, but there are plenty. So, I think this will become more widely accepted. But I do think it’s really, really critical that in these early days we put in the appropriate safeguards. I would really love to see a focus on downstream accountability for bad actors.
Because I think that, you know, there are two ways to kind of address concerns, right? You can take it from the front end and say, we’re going to limit access; we’re going to put in stringent informed consent requirements; we’re going to put in stringent regulations on who can access, how they can access, and what they can do with information. And you can also address it from the back end and say, even if we have some reasonable guardrails around access, we want to make sure that those who are accessing aren’t using information in ways that are harmful to individuals or inappropriate. And in our society, we’ve done a really poor job, I think, of creating the accountability structures at the back end. So, I’d love to see more of that. I don’t know exactly what that would look like. It’s a hard problem that nobody has been able to solve yet, but I think it’s really important for the field moving forward. So that’s where I think we’re headed.
Laura: Thank you. That’s wonderful.
David: So, I can sort of tell you some areas where I think things are going to move, and I can tell you some areas where I hope they move. I think sometimes those come together and sometimes they diverge. I think we’re going to see a regulatory framework develop around this field that is going to be multifaceted. So, I think there’s going to be, you know, a patchwork, really. You’re going to have state laws that interact with federal laws that interact with self-regulation. I think all of these things are going to be really important to the field. There are certain areas where state laws can do the best to protect privacy, that can make that balance. Federal law can perhaps provide some sort of a floor. And I think practitioners themselves also need to regulate. There are aspects of the regulation, and I talked about these a little bit in my talk, that are so specific to the work that it would be hard to implement them in state laws, and it would be very hard to implement them in a way that could be updated as the methodologies change and new tools become available. So, with those kinds of things, I think you need self-regulation. You need this robust framework in order for everyone to be accountable.
And I think it’s really great to see that we’re moving in that direction. In 2018, when this field kind of burst onto the scene, things were really in the Wild West, and they remained there for a couple of years. You know, in the last two years or so, there’s been a huge move not only from state legislators, not only from academics, not only from privacy advocates, but from practitioners themselves, who are the ones who sort of started this field, who were in there at the beginning but recognized that there were issues, even some issues that they were involved with, that they thought were detrimental to the field and were not making the appropriate balance. And so, I think everyone is starting to come together to converge on the best set of standards for this field. And it’s really great to see. I mean, there’s been a lot of robust discussion, is perhaps the polite way to put it, but I think that has to happen at the beginning of any enterprise. You’re going to have disagreements. As long as people are acting in good faith, and I think 99% of people are, I think you can come to some sort of consensus. So that’s where I think it’s going. I think when it comes to the databases themselves, we’re going to see changes there, because even though I’m an advocate of opt in, and I think that’s how the system should work, that is not how it works, because law enforcement is not beholden to terms of service. Right? So, unless they’re in one of the few states that specifically excludes that practice, they can upload to MyHeritage, which is forbidden, and, you know, there’s no penalty for them legally. So, even when people think that they’re opted out, you can’t know. Even if 49 states have robust laws, if there’s one state, or if the federal government doesn’t have a law… The FBI and other federal actors are not beholden to those state laws. People can’t really know if their information is being used against their wishes.
I would love to see, and this is really a hope, every database just say, if you’re going to be part of our database, you have to expect that it’s going to be used for IGG, and if you don’t want that, you shouldn’t be part of our database. I don’t think they want to go that direction because they don’t want to lose users, but I think that would be the safest way so that everyone knows exactly what they’re getting into.
Laura: That’s an extremely wonderful, clear way to describe it, because I think that’s what I’m seeing. I’m not a forensic scientist, but I see the media side, and where before all the solves were what everyone wanted to talk about, now they want to talk about, you know, what direction is this going? Let’s talk about how we’re going to regulate this or not. You know, what are the guardrails? What does opt in mean? That’s what everybody is interested in talking about. So yeah, and we’ve all touched on it, so I appreciate that. So, before we let you go and enjoy your evening, is there anything from each of you that we’ve missed that you really want to make sure we include in the video? Maybe something from the questions that came up during your panel. I’m sure there were many; I can only imagine. You did cover a lot of ground.
David: So I mean, I would say everyone who feels comfortable should take a DNA test, upload to FamilyTreeDNA, upload to GEDmatch, and make sure you opt in to GEDmatch. In one research project that my colleague and I did, we interacted with hundreds of people, asking them about their experiences and asking them to participate in DNA databases and opt in. So many of them, in fact, the majority, believed that they were already helping because they had taken an AncestryDNA test. So, this is just an example of the public’s misunderstanding. We had to tell them, no, we don’t have access to Ancestry matches; you need to download your data and upload it to GEDmatch. So, I would just really encourage people to think about the fact that you alone could help, you know, bring closure to a family that’s suffering, and help get a dangerous person off the streets, and help exonerate the wrongfully convicted, and I think this is going to move more and more in that direction.
Laura: Something we’ve been talking about a lot with the video series that we did this year. Yeah, for sure. Amy, anything to add?
Amy: I mean, I think just going back to the focus of our study and what we’ve been working on. You know, what’s become really clear is that this whole field depends on public trust and public opinion, because these are databases that people have to opt into. And if there’s a lack of public trust, or if there’s a perception of abuse or misuse, they’ll pull their kits out and the technique won’t be available. So, I think it’s really, really important to try to figure out how to build a trustworthy system, and also to do more engagement with the public around what it is that they want. Because one of my inclinations is that, in some respects, people who are in this field may be misunderstanding what the public actually wants. And obviously, you can’t talk about the public as a whole; people have differences of opinion, and we need to respect those differences. But I think in a lot of cases, as David was pointing out, a lot of people would not be opposed to participating; they just don’t really understand what they need to do in order to do that. They don’t have the activation to take the steps to do it. And those who would be opposed to participating equally need to understand how they can prevent their data from being included.
Laura: Yeah, absolutely. Thank you.
Bruce: So I gotta say something now. I think, you know, if you look at any discipline, and we can use DNA and forensics from the 80s onward as an example, they all grew without any regulations to start. You could call it the Wild West at the beginning. And as you formalize a field, you remove the poor performers through quality systems, regulation, accreditation, and checks and balances. This is a very powerful technology. I think it has a lot of value, and just because some things have not gone right here, don’t look at those as a reason to squelch it. Really think about the power of it and how we can improve upon it, so the things that went wrong early on don’t go wrong later on, right?
Laura: Don’t throw the baby out with the bathwater. Okay, final question before we let you go. I love to ask this of everybody, especially people who have been to ISHI before and have come for many years in a row. We’re coming up on our 35th anniversary, so yes, next year, 35; get ready for some fun in San Antonio. But before then, I’d just love to hear why you come to ISHI. Why is it important or attractive for you to present here? Just your general thoughts, or a fun story from the many years.
Bruce: I can tell you this, this is a fact, and I do think I should get a prize from Promega. There’s only one person who’s been to every ISHI.
Laura: Is that you?
Bruce: That’s me. Wow. So, I guess I really, really should get one. A really substantial award, not just, you know... I mean, I’ve been to every one. And I remember the first one, because it was early on, with DNA and the controversies and all these things that were involved. It was in Madison, Wisconsin, in December, and it was five degrees outside in the morning. There were only like 80 or 90 people in the entire meeting; now it’s over a thousand. Only two people went out jogging that morning. One was from Finland, so I get it. The other was an idiot. That was me.
But it was again the starting point, and these same kinds of questions and criticisms and issues were being framed, just from a different perspective given the technology at the time. And we’re seeing it just continue, and this has been a great forum for that. It brings people together. We had someone from the Legal Aid Society talking about her perspective. You’ve got someone from the victim’s perspective, you’ve got people from the crime labs’ perspectives. And we don’t see that in other forensic meetings of this kind. They’re purely tech oriented, and they don’t get into the really important part: why are we doing this, and what is it for? And it’s more than just solving crime. It’s preventing crime. It’s exonerating the innocent. It’s bringing dignity and respect to victims and their families, or the survivors of sexual assaults. It’s even helping combat gender-based violence. It’s a really powerful thing, and this place does it, at least for our country, more so than we’ve seen in other countries in the world.
Laura: I really appreciate that. Thank you for sharing. Do you have every t-shirt? I have to ask.
Bruce: Well, no, my wife says if I bring another t-shirt home, she’s going to kill me.
Laura: Yeah, okay.
Bruce: I’ve got one in the bag already and I’ll probably get two more. So anyway.
Laura: Well, we do have mugs now too. And other things so we can, you know, we’ll see what we can do.
Laura: Okay. How about you?
Amy: I don’t know how I go after that. That was great. Yeah, I mean, similarly, as an academic, it’s very rare that I get to go to conferences where there are people in the trenches doing the work, and I feel like I learn so much from hearing them talk from their various perspectives about what it’s like to be doing the work, what the real everyday challenges are that they’re facing, and how they’re addressing them. So, you know, this conference is really special from that perspective. I’m used to going to conferences where people just exchange ideas that aren’t based in what’s actually happening. So, it’s very, very... I learn a lot. It’s very educational for me to be able to participate.
Laura: I’m so glad you’re here. I can’t believe we haven’t interviewed you yet. And I’m very excited you’re a part of this panel. Thank you. Dave?
David: Yeah. I mean, I can sort of echo what Bruce and Amy have said. Last year was my first year attending, and prior to that, every conference I’d been to had been an academic conference. And I just can’t tell you how different those are in so many ways. When I came last year, it was just really eye opening and wonderful to see the community of forensic scientists, law enforcement, defense attorneys, all of these people coming together to share ideas, but also to question each other. Like Bruce said, I think that’s something that doesn’t happen enough. And it’s so important, especially as new methods and new technologies are being developed, that people are open to criticism, that people are open to rethinking, and that they’re not allowing their stubbornness to get in the way of doing what’s best for the field and what’s best for the public. It’s a humbling conference to come to, to be able to talk to so many people who are pioneers of so many aspects of forensic science. It’s just wonderful. I mean, where else could you do that? Where you could meet Bruce one day and Greg Hampikian, you know, a pioneer in the Innocence Project, the next. It’s just wonderful.
Laura: Well, I have to tell you, we are so honored that you’re willing to come and present such a fantastic panel and be with us. And we hope that it continues for years to come. Bruce, I hope we can keep you not only for the 35th, but for the 40th.
Bruce: Well, maybe someday. I’m going to probably opt out someday. But not just not yet.
Laura: We need you next year. We need you in San Antonio. Well, thank you all so much. I really appreciate it. It’s been wonderful.
Bruce: Thanks for having us.