Three Ethical Concerns Over AI-Powered Law Enforcement Transcription (And How to Overcome Them) 

By: Sarah Roberts

Artificial intelligence is already changing how agencies approach law enforcement transcription. Tools like automatic speech recognition (ASR) can convert audio recordings into text quickly and efficiently.  

Despite the clear advantages of using ASR to transcribe recordings from body-worn cameras, 911 calls, interrogations and more, there are also reasons to be cautious. When it comes to identifying potential criminal activity and suspects, flaws with AI transcription can lead to some devastating outcomes. Fortunately, with the right oversight and a careful approach, it’s possible to reap the benefits and overcome these three challenges related to AI-powered law enforcement transcription.  

1. Biases leading to inaccuracies 

Biases in algorithms continue to appear. For instance, AI-powered image generators still show blatant biases. The Washington Post recently published an article detailing some examples: the prompt “people at social services” produced a group of images of people with dark skin, while the prompt “a productive person” generated mostly images of white men.  

AI-powered transcription could cause similar problems. ASR performance depends on the data used to train the algorithm. As a result, a system may be less accurate when transcribing a recording of a person with a heavy accent or someone who uses certain slang or jargon. When individuals speak in multiple languages, results can suffer even more. Because of these complications, more errors may appear in a transcript of an interrogation with one person than in a transcript of another. Such discrepancies could negatively impact a person’s case and even lead to harmful legal outcomes.  
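One way to surface these accuracy disparities is to measure word error rate (WER) separately for different speaker groups on a held-out test set. The sketch below is illustrative only: the group labels and transcript pairs are hypothetical examples, and the WER function is a standard word-level edit-distance calculation, not any particular vendor's metric.

```python
# Hypothetical bias audit: compute word error rate (WER) per speaker group
# so that accuracy gaps between groups become visible. Sample data is invented.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# (speaker group, human reference transcript, ASR hypothesis) — invented data
samples = [
    ("group_a", "he was at the store on friday", "he was at the store on friday"),
    ("group_b", "he was at the store on friday", "he was at this door on friday"),
]

by_group: dict[str, list[float]] = {}
for group, ref, hyp in samples:
    by_group.setdefault(group, []).append(wer(ref, hyp))

for group, rates in sorted(by_group.items()):
    print(group, round(sum(rates) / len(rates), 3))
```

A persistent WER gap between groups on comparable audio is exactly the kind of discrepancy that warrants extra human review before a transcript is used in a case.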

In fact, a “basic transcription error” once sent an innocent man, Carlos Ortega, to prison for a year and cost him hundreds of thousands of dollars in legal fees. The risks of faulty law enforcement transcripts are therefore significant.  

How to combat biases and inaccuracies 

One of the first things to consider is the quality of the ASR. Better technology that includes diverse training data or industry-specific data will produce more accurate transcripts. Selecting a provider that understands law enforcement use cases and designs its solutions to meet those needs will likely be a better fit than a more general ASR.  

Additionally, when the transcript is critical, having a human transcriber review the content and check for accuracy is the best approach. This combined method saves time and money while adding an extra layer of protection against errors.  
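A common way to structure this combined method is to route ASR output through a confidence filter: segments the engine is unsure about go to a human transcriber first. The sketch below assumes per-segment confidence scores (which many ASR engines report in some form) and an illustrative 0.90 cutoff; the segment data and threshold are assumptions, not vendor specifics.

```python
# Minimal human-in-the-loop sketch: flag low-confidence ASR segments for
# human review. All segment data and the threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class Segment:
    start: float       # segment start time, seconds
    end: float         # segment end time, seconds
    text: str          # ASR hypothesis for this segment
    confidence: float  # engine-reported score in [0.0, 1.0] (assumed)

def needs_human_review(seg: Segment, threshold: float = 0.90) -> bool:
    """Route any segment below the confidence threshold to a transcriber."""
    return seg.confidence < threshold

segments = [
    Segment(0.0, 4.2, "where were you on the night of the tenth", 0.97),
    Segment(4.2, 7.8, "i was at my cousin's place", 0.62),  # uncertain audio
]

review_queue = [s for s in segments if needs_human_review(s)]
for s in review_queue:
    print(f"[{s.start:.1f}-{s.end:.1f}s] flag for review: {s.text!r}")
```

Tuning the threshold is a policy decision: a higher cutoff sends more segments to humans, trading speed for assurance, which is usually the right trade for transcripts headed into evidence.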


2. Security and privacy risks 

Using AI to transcribe sensitive information can leave that data vulnerable. In one recent data scare, a journalist feared he had exposed an interview subject to danger by running the interview through an automated transcription service. Fortunately, that incident had no consequences, but it served as a wake-up call for many who rely on the convenience of free AI-powered transcription tools.  

Similar ethical concerns apply to law enforcement agencies, which may hold interview information that cannot go public, whether because of risks to informants or because released investigation details could tip off a suspect.  

How to transcribe sensitive information 

The key to preventing these scenarios is to use a transcription provider, like Verbit, that understands the need to keep data secure. Turning to a company that regularly works in industries handling confidential information is a good place to start. It’s also wise to check the provider’s credentials, such as SOC 2 reports. Digging into this type of information gives a clearer picture of how careful the company is with the data it’s processing.  

Additionally, if the provider uses human transcribers to edit and ensure accuracy, privacy protections such as non-disclosure agreements should be in place.  


3. Questions about accountability  

One issue that keeps surfacing with AI tools, especially as they become more advanced, is who bears responsibility when something goes wrong. In the case of law enforcement transcription, an incorrect transcript can put an investigation, officers or ordinary citizens at risk. It’s not the technology that will end up taking the blame, though.  

In many cases, people trust AI-powered tools without taking the time to check their output, and some have already faced legal consequences because of that misplaced trust and lack of oversight. 

How to use AI while staying accountable 

It’s important to be realistic about AI’s usefulness and its limitations. If a transcript is for personal use or a record of an internal meeting, using AI to generate it might be perfectly fine. In the case of something like digital evidence, it could be more important to have a human check that transcript carefully for any inaccuracies.  

Even if the information is critical to a case, using AI as a first step might be a great way to save time, as long as there are still professional transcribers reviewing the output and making corrections. At the end of the day, it’s extremely important to use a provider and a process that meet the demands of the industry.  

Partnering with Verbit for law enforcement transcription 

Verbit employs specialized legal transcription teams that work with our proprietary Legal ASR to produce accurate transcripts. Partnering with Verbit means having access to ASR-only transcription tools that are designed especially for the legal industry, as well as trained and certified professional transcribers. To learn more about our options for law enforcement transcription or to create a custom option that meets your needs, reach out to Verbit.