
AI in Criminal Justice: A Panel Discussion.

❝The word privacy does not appear in the US Constitution.❞ ~ Joe O'Murchadha

❝Just as a defendant has a heartbeat, a prosecutor has to have a heartbeat as well❞ ~ William Ferrier


❝In Europe, a right to privacy is a fundamental human right.❞ ~ Joe O'Murchadha


In this Ethics4NextGen AI Inaugural Summit panel discussion on AI in Criminal Justice, participants explore how AI is applied in specific areas of criminal and privacy law. The panel discusses their experience working with data scientists and the data sets used to train AI models, and the big question that always seems to be overlooked: whether the data set actually being used for the AI model is appropriate. The panelists also explore what role privacy rights play in criminal law. What are the expectations of privacy for things like social media, as opposed to privacy rights in the criminal justice arena? The talk concludes with a discussion of the various ways AI is being used in criminal justice right now and some of the challenges posed by these applications.


TRANSCRIPT

00:01 My name is Sanjeev Bajwa. I'm a patent attorney and founder of Illuminae IP Solutions; we focus on securing patents and trademarks for clients, including large and small technology companies that implement AI. Today we'll be discussing AI in specific areas pertaining to criminal and privacy law. As a patent practitioner, I often work with data scientists to get their inventions patented, and many times we start with the data sets used to train the AI model. One of the questions that always seems to be overlooked, or maybe it's not the right place for what I do, but as data scientists you should maybe start thinking about it, is whether the data set actually being used for your models is appropriate: what types of data should be used, what types shouldn't, and why. Those are the kinds of questions we're going to be looking at. We'll also look at what role privacy rights play here: what individual privacy rights people have, and what the expectations of privacy are for things like social media as opposed to the criminal justice arena. Lastly, we'll transition to discussing the various ways AI is being used in criminal justice right now and some of the challenges with those uses. I'd like to start by introducing our panelists today.

1:45  First we have Joe O'Murchadha. Did I pronounce that correctly? Perfect, good. Joe is a corporate attorney specializing in compliance, privacy, and risk management. In his current role as Head of Corporate Compliance and Ethics and as Data Protection Officer, Joe has led the establishment of the compliance, ethics, and privacy group at Lidl US, which is the fourth-largest retailer in the world. In addition to being an attorney, Joe is a certified privacy professional, a compliance officer, and a fraud examiner, so I think we've got the right guy for privacy.

2:24  On our panel today we also have Preeti Bajwa. She's an attorney at the California Attorney General's office, where she practices complex civil litigation in the field of correctional law. Prior to joining the Attorney General's office, Preeti was a criminal defense attorney working at the public defender's offices in Santa Clara County and Sacramento County, as well as in private practice.

2:52  We also have Ian Ferguson. Ian is a business systems analyst for the District Attorney of San Luis Obispo County, in charge of assessment, development, and implementation of technology tools within the DA's office. And lastly, we have Will Ferrier. He works for the Yolo County District Attorney's office here in California. Will is the chief of the Yolo County innovation division, where he focuses on driving efficiency and continued transparency between the DA's office and the community. In addition to his role in the Yolo County DA's office, Will has advised and co-founded various startups. I'd like to jump right in, since we have a short time frame here. Joe, could you start us off with an introduction to privacy: how it works, what rights individuals have, and how that interacts with the data collected for many models?

3:58  Perfect, happy to do that. In some ways I've drawn the easy straw among the panelists to start with, but I think it helps to lay out a base position and then work the conversation from that point. It's important to talk about what the standard data set we're normally working with looks like, especially for those folks who are taking part in the hackathon tomorrow; I see this as part of a learning strategy. Your standard data set comes from commercial records that were voluntarily provided when somebody bought something, public records that are available, records gathered through e-commerce, your standard digital combination of various different sets together, a basic concept I'm sure most of the listeners are very familiar with. Best practice rests on a few key pillars. The first is notice: you understand what it is you're consenting to, or what's going to happen. Then choice: you're not forced to give the data; it's a choice you make. There's some aspect of security, some effort made to secure that data by whoever is collecting it. And there's access control, the idea that the data won't be given to just anybody. That's distinct from publishing something on social media, but in your normal scenario there's some kind of access control or access rights in place.

05:27 The next one, a bigger topic in recent years, is purpose limitation: the idea that I gave you the data for this reason, that's what I assume you're going to do with it, and that's what I consented to. It's an interesting one because it's quite different between Europe and America, but we can come back to that later. The final key pillar of data privacy is accountability for onward transfer: I have this data, but if I hand it on to somebody else, what might they do with it? I still carry accountability for the promise I made when I gave the notice in the first place. So it's really important to think about whether we have a right to privacy. Certainly in Europe it's a fundamental human right, quite clearly spelled out; a different history and culture led to that belief. But it's important to remember that the word privacy doesn't appear in the US Constitution; it is not there per se. Of course we all have a right to privacy, and that came from the Fourth Amendment's concept of unreasonable search and seizure that people will be more than familiar with, and then the Griswold case later in the '60s, with this idea that you have a reasonable expectation of privacy.

06:47 And I can see how that made sense at the time, but that's the crux of it: what does reasonable mean? For a lot of folks who are used to working with a data set, data analysts and data scientists, that reasonableness is often spelled out quite clearly when you look at what notice was given when the data was collected. If you voluntarily gave the data, and you read the notice, or at least the notice was available to you, then whatever was said in that notice is what you can reasonably expect to happen. A simplistic example: if you walk into a store, you probably have a reasonable expectation that there will be CCTV recording in that store, and you definitely have it if there's a sticker on the door with a camera sign saying "CCTV recording in here." That makes sense, that's pretty clear, but you obviously wouldn't have a reasonable expectation of cameras in the toilets or something that far out there.

07:50 Obviously the issue here is that it's not these basic data sets we're talking about in the scenario for today's conversations. Instead it's an amalgamation of things that are public record, things that are available, maybe information that was given over voluntarily, and I'll let some of my colleagues talk about that a little more. There are also things, and it was really interesting that on the previous panel Elizabeth mentioned the [Amazon] Ring community in particular, that may have been handed over voluntarily but not necessarily by you: by your neighbor or somebody up the road. And then obviously there are the third-party sets.

08:30 At this point it makes more sense to hand over to Preeti to get into a little more detail. Obviously, if we're looking at government action, it must be something that's considered to be a search, and then we're back to that main point again: what is reasonable? We all have our simplistic ideas of probable cause, somewhere halfway between suspicion and proof. I'll let Preeti take this a little deeper into the world of criminal justice and out of the basic, more corporate data sets that some of our watchers might be more familiar with.

09:13 Great, thank you, Joe. I think a good place to follow up from where you left off is what is considered reasonable in a criminal setting. As the courts have described it, the seminal case is Katz, followed up by Berger. In Katz the question posed was whether there's a reasonable expectation of privacy, and it's a matter determined on a case-by-case basis that is very fact-specific. In Katz the focus was on electronic surveillance, and the court said that to the extent a person has a reasonable expectation of privacy in the activity they're engaging in, and that expectation seems reasonable by society's norms, it would be protected. Now, taking the example Joe gave us: if you're walking into a store where CCTV is running and you've been put on notice, then obviously if you shoplift something it's going to be recorded on camera; your reasonable expectation of privacy is gone, and you're also in a public space. But what if you are a guest in somebody's home? The house is inherently sacred, and the courts have held that, so as a homeowner you have an expectation of privacy: the police cannot come into your home without a warrant.

10:51 However, if you are a guest in that home, you do not have a reasonable expectation of privacy in that home. So if you're engaging in criminal activity, you've got some drugs there or whatever, and the cops come in with a warrant, or the owner of the house consents for them to search his home, then anything they find that's your property in the guest bedroom you're staying in is exposed; you don't have an expectation of privacy there.

11:17 The other point I wanted to touch on, which I think is relevant to our discussion today, is this idea of third parties: by giving consent to a third party, you've given up your privacy there. Most recently, in a case, Smith, the court revisited this issue in terms of cell phone tracking. In that case, you have a cell phone and you've obviously consented to GPS tracking, and that GPS tracking can ping off many towers; we can really find out where anybody has gone, for any length of time, just by following their cell phone. But the court declined to extend the consent that was given when you turned your rights over and said to a cell phone carrier, yes, go ahead, you can use my GPS. It declined to extend it, and said your whereabouts can only be tracked by the cell phone for a limited time. So it is still very much a case-by-case basis, and we have to stay in this arena of reasonableness.

12:37 The last thing I want to touch on before I turn it over to Ian is what happens to persons who have been convicted and are either on probation or parole, meaning they've completed their sentences and they're now free but still on probation or parole, and therefore under the authority of some law enforcement agency, and also those who are still incarcerated. For persons on probation and parole, in California at least, pretty much 100% of the time one of the terms and conditions is that you give up your right to be free from search and seizure at any time; therefore, you do have a diminished expectation of privacy. For persons who are incarcerated in a prison or a jail, that too is very state-specific, because each state will have its own set of laws and regulations governing how it handles privacy. For example, in the California correctional system we have the Title 15 regulations governing how information or any documents pertaining to any inmate are distributed, and there is a section that says we will not give out any information on an inmate unless certain criteria are met, such as for litigation where there's a subpoena of some sort; they certainly will not just turn that information over to any member of the public.

14:04 So those are all going to be more state-specific. As we're moving along, and as we're trying to see how any AI that we come up with, including some of the solutions that Ian and William want to talk about, would work, it certainly should be in the back of our minds what the privacy interests are and how they're being handled in the various states where the technology may be utilized. And with that, I'll turn it over to Ian.

14:34 Actually, I just wanted to jump in; you made a very interesting point there. When data from a government agency is used, we have to take certain measures to protect the privacy of those persons. That could be, and I know Will wants to touch on this with some specifics, masking their identifiable characteristics, like their name, their social security number, any sort of demographic information, and translating it into some sort of code, so that the event or instance can still be used without letting demographic information get in there.

15:26 Will, I know that you specifically deal with this issue. Yeah, absolutely, and good afternoon, everybody; I hope you're all doing well. Privacy is a big issue, right? On the prosecutorial side, even though we're prosecuting someone, we still respect their right to privacy, and when we work with various research partners as we try, as an organization, to get better, that involves sharing information. So one of the things we'll typically do when we have a case is assign a person ID to someone. It's a number, and we're the only ones who hold the key between the person ID and who that person is. What we'll typically do is remove anything that could possibly identify that person and use the person ID instead.
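The person-ID scheme Will describes can be sketched in a few lines. This is a minimal illustration, assuming hypothetical record fields (`name`, `ssn`, `charge`, `sentence`); a real office would also handle quasi-identifiers and keep the key table secured.

```python
import secrets

# Hypothetical case records; the field names and values are invented.
records = [
    {"name": "Jane Doe", "ssn": "123-45-6789", "charge": "PC 459", "sentence": "24 months"},
    {"name": "John Roe", "ssn": "987-65-4321", "charge": "PC 484", "sentence": "6 months"},
]

IDENTIFIERS = {"name", "ssn"}  # direct identifiers to strip before sharing

def pseudonymize(records, identifiers=IDENTIFIERS):
    """Replace identifying fields with an opaque person_id.

    Returns (shareable, key_table); only the agency keeps key_table,
    which is the 'key' linking a person_id back to the real person."""
    key_table, shareable = {}, []
    for rec in records:
        person_id = secrets.token_hex(8)  # random, so the ID itself reveals nothing
        key_table[person_id] = {k: rec[k] for k in identifiers}
        cleaned = {k: v for k, v in rec.items() if k not in identifiers}
        cleaned["person_id"] = person_id
        shareable.append(cleaned)
    return shareable, key_table

shared, keys = pseudonymize(records)
```

Research partners would receive only `shared`; re-identifying anyone requires `keys`, which never leaves the office.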

16:19 That way, as a research group, you can see that this person ID was convicted of this crime or that crime, and this was the sentence, right? So it's super important to us. But I'd like to turn back to Ian to introduce some of the ways AI is actually being used right now. Ian, could you touch on that topic?

16:53 Yeah, I'd love to. I'll start out real briefly: AI is currently used by law enforcement agencies in a number of different ways, really three primary ways. There's location identification, which can be heat maps of where crimes have occurred in the past; suspect identification, which can be as advanced as facial recognition software, and we can talk about all the flaws involved in that as well; and then victim identification, often for vulnerable populations, as in elder abuse, where you can identify potentially vulnerable individuals and intervene to avoid a crime being committed. But there are also more specific uses on the DA side, as Will was alluding to as well.
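The location-identification use Ian mentions boils down to spatial aggregation of past incidents. A minimal sketch, using made-up coordinates, of the bucketing behind a simple crime heat map:

```python
from collections import Counter

# Invented (lat, lon) incident coordinates, for illustration only.
incidents = [
    (38.681, -121.770),
    (38.682, -121.771),
    (38.900, -121.500),
    (38.681, -121.769),
]

def heat_grid(points, decimals=2):
    """Bucket points by rounding coordinates to `decimals` places and
    count incidents per cell; cells with high counts are 'hot spots'."""
    return Counter((round(lat, decimals), round(lon, decimals)) for lat, lon in points)

counts = heat_grid(incidents)
```

A mapping layer would then shade each cell by its count; the model here is nothing more than counting within grid squares.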

17:49 A lot of it has to do with research and investigations. Just as Preeti was referencing particular cases like Katz or Griswold v. Connecticut, there are natural language processing algorithms that can crawl through court cases and create summaries on their own, which reduces the time actual attorneys have to spend on that research legwork. There are also simple applications of cluster models that can sift through lots of financial data to identify fraud, much in the same way your credit card company will flag a charge it thinks could be fraudulent; a lot of economic crime units can do the same thing when searching for potential crimes.
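The credit-card-style flagging Ian compares this to can be illustrated with a crude outlier test. Real units would use richer features and proper clustering or anomaly models; the amounts below are invented:

```python
from statistics import mean, stdev

# Invented transaction amounts; one is wildly out of line.
amounts = [120.0, 95.5, 110.0, 130.25, 98.0, 105.0, 4999.99, 115.0]

def flag_outliers(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the
    mean; a simple stand-in for the cluster models mentioned above."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if sigma and abs(v - mu) / sigma > threshold]

flagged = flag_outliers(amounts)
```

The flagged charges would go to a human reviewer, mirroring the "AI supports, human decides" theme the panel returns to later.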

18:42 Under investigative research, I know there's simple fraud detection or pattern recognition, but what are some of the more complex models that the DA's offices are using, and where are they getting that data? Is it all proprietary, just their own data collected from police incidents? Yeah, some of it is our own data that we've collected from law enforcement agencies. A lot of it also comes from large repositories like Westlaw that have access to many different cases that could be similar and have similar charges. That leads into some of the problems and complexities here, and I think our opening speaker touched on some of this: some of the problems are institutional and legal rather than technical. A lot of that has to do with access to this data, because there is a kind of paywall around what is technically public domain in a lot of these court cases; some courts are charging as much as 10 cents per page for court documents even though they're considered public domain.

20:17 And that is very prohibitive for any kind of third-party solution or startup to actually go in, make sense of this data, and help optimize these systems. There was another point brought up earlier in the discussion today about a lot of the natural language processing algorithms being primarily in English, which causes problems for minorities and other communities that may not have English as a first language. I'll also bring up really quickly another big issue: the charging language is not as universal as we would expect. I live in California, and we're currently working on a master charge code for the state, but right now it can vary from jurisdiction to jurisdiction, which means the code for a particular crime you're charged with in San Luis Obispo County, for example, may not be the same as the crime code in Yolo County. Marrying those two data sets together and making sure they represent the same information is very difficult for data scientists, because you first have to match things together that aren't necessarily the same.
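The charge-code mismatch Ian describes is essentially a crosswalk problem: mapping each county's local code onto one master code before merging data sets. A sketch with invented codes (these are not real statute numbers):

```python
# Invented crosswalk: (county, local charge code) -> master code.
CROSSWALK = {
    ("San Luis Obispo", "BURG-1"): "CA-459-RES",
    ("Yolo", "459R"): "CA-459-RES",
    ("San Luis Obispo", "THEFT-P"): "CA-484-PETTY",
    ("Yolo", "484A"): "CA-484-PETTY",
}

def to_master(county, local_code, crosswalk=CROSSWALK):
    """Translate a county-specific charge code to the master code;
    None marks a gap a human analyst still has to resolve."""
    return crosswalk.get((county, local_code))

def merge_datasets(rows, crosswalk=CROSSWALK):
    """Tag (county, code) rows with a common master code so data from
    different counties can be compared; unmapped rows are kept aside."""
    merged, unmapped = [], []
    for county, code in rows:
        master = to_master(county, code, crosswalk)
        (merged if master else unmapped).append((county, code, master))
    return merged, unmapped
```

The hard part in practice is building and maintaining the crosswalk itself, which is exactly what a statewide master charge code project amounts to.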

21:42 That's very interesting. Will, I know you have specific experience working along the lines of recidivism. Do you run into issues where data standardization tends to be a problem, and what are some of the other challenges you face there? And certainly, what are you actually looking to get out of using AI with respect to recidivism?

22:13 Yeah, thank you. We certainly have issues with data standards. I know that master charge code project has been going on for years, right? The team has been working towards this goal, and it's super complex. In our county, which again is here in California, there are a couple of things we're really sensitive to. Let me start by saying our prosecutors go through unconscious bias training, and I think that's a really good foundation, because as we're talking about AI and recidivism, one of the things, for those of you on the line who have ever read the book Weapons of Math Destruction, that we want to be super cognizant of is that we're not taking unconscious bias we may have as individuals, building it into an AI model, and then applying it to people who need as much help as they can get, right?

23:14 Just as a defendant has a heartbeat, the prosecutor has to have a heartbeat as well. These AI technologies and algorithms need to support the process, but ultimately the prosecutor needs to be making the decision at the end of the day, based on the facts of the case. Yeah, I just wanted to interject one little thing; you make a very important point there about using AI as a supplement to the decision the DA is making. That's very important, because one of the common themes we've already seen in this summit is the topic of accountability. If you have an AI system be the decision-maker, it makes it a lot easier to pass accountability off, whereas if the DA is still making the decision and using the AI's output to inform it, he still has to be accountable. Is that right?

24:20 Yeah, absolutely. And I wish I could say we were as advanced in AI and criminal justice as AI is in the battle against cancer, but we're just not; in fact, we're far behind, and we need as much help as we can get. We have a huge issue, as you mentioned, with data and with the various kinds of information we get; it's different every single time. You could have three defendants all accused of committing the same crime, and each one of them has a different story: one may have mental health issues, one may have a drug issue, another may be in sudden poverty, or it could be poverty across all three. So these AI models we're discussing have to be smart enough to look at the person's background, to whatever extent is possible, and then make suggestions to the prosecutor about what would drive the most positive outcome, because at the end of the day we want to get this person out of the criminal justice system; we don't want to see them back in it, right?

25:38 Right. I know we're running close to our time, but if we get to the last slide, Ian, you may want to touch on some of the challenges that you as well as Will have faced in implementing AI systems in the criminal justice arena.

25:59 Sure. First, "garbage in, garbage out"; I think we all know that one pretty well. The quality of the data we're using is questionable at times, and using historic data can also perpetuate historic biases, so we want to be very conscious of that moving forward in utilizing these tools. Will touched on identifying the root cause, which I think is really critical; that's an important part that I don't think we're putting enough emphasis on in general, and while it's difficult to identify the true root cause, I think that should be the goal of a lot of these systems. I also talked a little earlier about the problems with access to data and standardization across jurisdictions, because I think that's a primary block to a lot of these systems being viable. There is work being done on it, but it is historically a very difficult world to break into, and I don't see that necessarily going away without substantial changes to things like the law.

27:13 Right. I know we're pretty much out of time. It almost seems like the whole session was a bit of a primer on this subject of AI and criminal justice, so if anybody has any follow-up questions, feel free to contact me or any of the panelists via LinkedIn or through the chat and the groups on DataEthics4All. Thank you very much, thank you to all the panelists and to the organizers for putting this together, and I hope everybody got something great out of it, because I certainly did. Thank you. Thanks, everybody. Thank you, Sanjeev; thanks, panelists Ian, Joe, William, and Preeti; this was an awesome discussion. Thank you for tuning in, take care, bye-bye.

Panelists

William Ferrier

DataEthics4All Community Member, Innovation Manager, Yolo County District Attorney


Ian Ferguson

DataEthics4All Community Member, Business Systems Analyst, SLO County District Attorney


Joe O’Murchadha

DataEthics4All Community Member, Head of Corporate Compliance & Ethics, Data Protection Officer, Attorney, Lidl


Preeti Bajwa

DataEthics4All Community Member, Deputy Attorney General, Department of Justice

Moderator


Sanjeev Bajwa

DataEthics4All Community Member, Patent Attorney and Founder of Illuminae IP Solutions