Biased Algorithms in Law Enforcement Agencies. A Case Study of the LA Police Department


Case Study, 2023, 14 Pages, Grade: 1,0

Author: Julian Schoenemeyer

Computer Science - Internet, New Technologies

This paper tries to answer the question of how the LA Police Department uses AI (Artificial Intelligence) algorithms to profile criminality, and whether these algorithms can always be fair and impartial.

In today’s society, we are constantly surrounded by artificial intelligence, especially by algorithms that try to profile us in order to predict our future behaviour. These algorithms are deeply embedded in our daily lives: whether we are watching Netflix, using Amazon Music or shopping online, they present us with suggestions designed to draw us further into the platform, so that we spend more time there and therefore more money. But these algorithms are not operated by commercial companies alone; they are also deployed by public policy providers and law enforcement agencies such as police forces.

First, the author gives a detailed explanation of how these algorithms work, then lists areas in which police forces might use them, such as facial recognition software and suggestions for the sentencing of criminals. However, the paper also discusses criticism of the use of this technology by law enforcement. Many ascribe a certain bias to these algorithms, especially a racial one, because they have mainly been trained on datasets that are themselves biased. The next chapter presents the author's methodology, which he then applies in the final part of the paper to his case study of the LA Police Department.

Excerpt


Table of Contents

1. Surrounded by Algorithms

2. Methodology

3. Case Study

4. Conclusion

Objectives and Topics

This essay explores the ethical and social implications of using algorithmic decision-making systems within law enforcement agencies, specifically focusing on the Los Angeles Police Department. The primary research goal is to investigate whether predictive policing technologies inherently foster discrimination and inequality, particularly against ethnic minority populations, and to consider the balance between operational effectiveness and civil rights.

  • Mechanisms and risks of algorithmic bias in public institutions
  • Predictive policing techniques (e.g., Operation LASER, Palantir)
  • Empirical analysis of crime-related data and police activity
  • Socioeconomic and ethnic disparities in algorithmic profiling
  • Transparency, accountability, and the future of data-informed policing

Excerpt from the Book

1. Surrounded by Algorithms

In today’s society, we are constantly surrounded by artificial intelligence, especially by algorithms that try to profile us in order to predict our future behaviour. These algorithms are deeply embedded in our daily lives: whether we are watching Netflix, using Amazon Music or shopping online, they present us with suggestions designed to draw us further into the platform, so that we spend more time there and therefore more money. But these algorithms are not operated by commercial companies alone; they are also deployed by public policy providers and law enforcement agencies such as police forces. Can these algorithms always be fair and impartial? How are they used within the police force to profile criminality? These are the most important questions that I try to answer in this paper.

First, the functioning of an algorithm has to be explained. In one sentence, an algorithm is a set of commands that helps a computer interpret specific information and leads the system to a certain decision (Algorithmic Bias, 2023). The problem of algorithms being biased lies within the programming process itself. First of all, an algorithm is based on existing data available within society, and that data might already be biased. Take, for example, an algorithm that tries to match job offers with unemployed people. It first draws upon data about the CVs of the people currently working in that specific branch and then tries to match these findings with applicants who fit those criteria (Algorithmic Bias, 2023).
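The job-matching example can be sketched in a few lines of code. All data and field names below are hypothetical; the point is only to show the mechanism: an applicant is scored by similarity to the current workforce, so whoever resembles a skewed workforce is favoured even when qualifications are equal.

```python
# Minimal sketch (hypothetical data) of how a matching algorithm
# inherits bias from the existing workforce it is trained on.

# Profiles of people currently employed in the target branch.
# The historical workforce is skewed toward one background.
current_workforce = [
    {"degree": "CS", "background": "A"},
    {"degree": "CS", "background": "A"},
    {"degree": "CS", "background": "A"},
    {"degree": "IT", "background": "B"},
]

def match_score(applicant, workforce):
    """Score an applicant by counting attribute matches with the workforce."""
    score = 0
    for employee in workforce:
        score += sum(applicant[key] == employee[key] for key in applicant)
    return score

applicant_a = {"degree": "CS", "background": "A"}
applicant_b = {"degree": "CS", "background": "B"}

# Equally qualified on paper (same degree), but applicant_a is favoured
# simply because the historical workforce looks like them.
print(match_score(applicant_a, current_workforce))  # 6
print(match_score(applicant_b, current_workforce))  # 4
```

Nothing in the scoring function mentions "background", yet the output discriminates on it, because the bias sits in the training data rather than in the code.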

Chapter Summaries

1. Surrounded by Algorithms: This chapter introduces the prevalence of AI in modern society and highlights the fundamental risks of algorithmic bias when applied to law enforcement and criminality profiling.

2. Methodology: The author outlines a qualitative single case study approach, utilizing an interrupted time series design focused on the Los Angeles Police Department to examine AI implementation.

3. Case Study: This section empirically tests the effectiveness of predictive policing programs like Operation LASER and Palantir and discusses their role in reinforcing discriminatory feedback loops within specific ethnic neighbourhoods.

4. Conclusion: The final chapter summarizes findings on the trade-offs between crime reduction and social impact, advocating for greater transparency, accountability, and the integration of community-focused policing.

Keywords

Algorithms, Artificial Intelligence, Predictive Policing, Law Enforcement, Algorithmic Bias, Discrimination, LAPD, Operation LASER, Palantir, Social Inequality, Ethnic Minorities, Data Privacy, Surveillance, Transparency, Criminality Profiling

Frequently Asked Questions

What is the core focus of this research?

The research focuses on the deployment of algorithmic systems in law enforcement and the potential for these systems to propagate societal biases and discrimination against ethnic minorities.

What are the primary thematic areas?

Key areas include the mechanics of predictive policing, the use of historical arrest data for patrol routing, and the impact of these technologies on community trust and civil liberties.

What is the main finding regarding the research objective?

The essay finds that while predictive policing tools like Operation LASER can statistically reduce crime rates, they often create discriminatory feedback loops that disproportionately target minority communities.

Which methodology is employed in the study?

The author uses a qualitative single case study approach combined with an interrupted time series design to evaluate the historical impact of AI adoption in the LA Police Department.
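An interrupted time series design of this kind is commonly estimated with segmented regression. The sketch below uses simulated monthly crime counts (all numbers are invented, not the paper's data) to show the idea: fit a pre-existing trend plus a level-change term at the intervention date, and read the intervention's effect off that coefficient.

```python
import numpy as np

# Hypothetical sketch of an interrupted time series (segmented regression)
# analysis: monthly crime counts before and after an intervention such as
# the roll-out of a predictive policing program.

rng = np.random.default_rng(0)
t = np.arange(48)                  # 48 months of observations
post = (t >= 24).astype(float)     # intervention begins at month 24

# Simulated data: baseline trend plus a level drop of 15 after roll-out.
crime = 100 + 0.5 * t - 15 * post + rng.normal(0, 2, t.size)

# Design matrix: intercept, pre-existing trend, post-intervention level change.
X = np.column_stack([np.ones_like(t, dtype=float), t, post])
coef, *_ = np.linalg.lstsq(X, crime, rcond=None)

intercept, trend, level_change = coef
print(round(level_change, 1))  # close to -15, the simulated intervention effect
```

The design's appeal is that the pre-intervention trend serves as its own counterfactual, which is why a single case such as the LAPD can still support causal claims.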

What topics are covered in the main body?

The main body covers the technical functioning of algorithms, specific examples like predictive sentencing and patrol routing, and an analysis of how data bias influences output quality.

Which keywords define this work?

Key terms include Predictive Policing, Algorithmic Bias, LAPD, Law Enforcement, and Data-informed community-focused policing.

What is "Operation LASER"?

Operation LASER was an LAPD initiative designed to reduce gun violence by identifying chronic offenders and critical 'problem areas' using a point-based scoring system derived from arrest history.
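A point-based scoring system of this kind is easy to illustrate. The indicator names and point values below are an illustrative reconstruction, not the LAPD's actual scoring table; they only show the mechanic of summing points over a person's recorded history.

```python
# Illustrative reconstruction (hypothetical values, not the real LAPD
# table) of a chronic-offender score like the one Operation LASER used:
# each risk indicator in a person's record adds points, and people above
# a threshold are flagged for increased police attention.

POINTS = {
    "violent_crime_arrest": 5,
    "gun_related_arrest": 5,
    "gang_affiliation": 5,
    "parole_or_probation": 5,
    "police_stop": 1,          # every recorded field contact adds a point
}

def chronic_offender_score(record):
    """Sum the points for every indicator present in an arrest-history record."""
    return sum(POINTS[item] for item in record)

record = ["gun_related_arrest", "parole_or_probation",
          "police_stop", "police_stop", "police_stop"]
print(chronic_offender_score(record))  # 13
```

Note the structural issue such a scheme invites: if routine police stops themselves add points, then people in heavily patrolled areas accumulate higher scores regardless of their behaviour.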

How does the author propose to improve police-community relations?

The author suggests adopting a 'Data-informed community-focused policing' (DICPF) approach, which emphasizes transparency and closer collaboration with local residents to restore trust.

Why is the role of Palantir significant?

Palantir serves as a critical technological backbone for many law enforcement agencies by aggregating and connecting diverse datasets, which creates the foundation upon which other predictive programs operate.

What is the impact of biased data on algorithms?

The author explains that biased datasets lead to biased outputs; since algorithms often rely on historical crime reports that may reflect systemic societal prejudices, the resulting computer-generated decisions often perpetuate the same inequalities.
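This self-perpetuating loop can be made concrete with a toy simulation. All numbers are hypothetical: two districts have identical actual crime, but the historical record is skewed; patrols follow the record, and for enforcement-driven offences, arrests happen wherever patrols look, so the initial bias never corrects itself.

```python
# Minimal simulation (hypothetical numbers) of a discriminatory feedback
# loop: patrols are allocated where past arrests were recorded, and new
# arrests track patrol presence rather than underlying crime, so an
# initial recording bias reproduces itself year after year.

true_crime = {"A": 50, "B": 50}    # identical underlying crime levels
arrests = {"A": 30, "B": 20}       # biased historical arrest record

TOTAL_PATROLS = 100
ARRESTS_PER_PATROL = 0.5

for year in range(10):
    total = sum(arrests.values())
    # Patrols are distributed in proportion to last year's recorded arrests.
    patrols = {d: TOTAL_PATROLS * arrests[d] / total for d in arrests}
    # Arrests follow patrol presence, not the (equal) underlying crime.
    arrests = {d: patrols[d] * ARRESTS_PER_PATROL for d in patrols}

print(arrests)  # {'A': 30.0, 'B': 20.0} -- the 60/40 split persists indefinitely
```

Even though both districts have the same actual crime, district A keeps 60% of the recorded arrests forever, because the data the algorithm consumes is itself a product of where the algorithm sent the police.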

Excerpt out of 14 pages

Details

Title
Biased Algorithms in Law Enforcement Agencies. A Case Study of the LA Police Department
College
Charles University in Prague (Faculty of Social Sciences)
Course
Security and Technology
Grade
1,0
Author
Julian Schoenemeyer (Author)
Publication Year
2023
Pages
14
Catalog Number
V1334829
ISBN (PDF)
9783346848789
ISBN (Book)
9783346848796
Language
English
Tags
AI Technology Security Algorithm Law Enforcement Los Angeles Bias
Product Safety
GRIN Publishing GmbH
Quote paper
Julian Schoenemeyer (Author), 2023, Biased Algorithms in Law Enforcement Agencies. A Case Study of the LA Police Department, Munich, GRIN Verlag, https://www.grin.com/document/1334829