Predictive policing: A criminal justice tool that threatens human rights

By Sahajveer Baweja

Predictive policing is an application of artificial intelligence technology. Such tools are currently used by investigative wings and law-enforcement agencies in many nations to prevent crime before it occurs. They work by applying analytical techniques to large quantities of available crime-records data. The aim is to identify the likely targets of criminal threats, mitigate possible risks, and forecast both crimes and the locations prone to them. To achieve this, the tools process large sets of police data, including historical crime records, in order to flag prospective hotbeds of crime.

Once the hotbeds, or the likely miscreants (individuals or groups) who may commit a crime, are identified and flagged, law-enforcement agencies are deployed to prevent the offence from occurring. The whole purpose of these tools, therefore, is to focus on preventing crime in the first place rather than repairing the damage after a crime has been committed.
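The hotspot-flagging logic described above can be sketched in a few lines. This is a deliberately minimal illustration, not any vendor's actual system: the grid-cell labels, incident records, and the function name `flag_hotspots` are all hypothetical, and real tools use far more elaborate statistical models.

```python
from collections import Counter

def flag_hotspots(incidents, top_n=2):
    """Count historical incidents per grid cell and flag the cells
    with the most recorded crime as prospective 'hotbeds'."""
    counts = Counter(cell for cell, _offence in incidents)
    return [cell for cell, _count in counts.most_common(top_n)]

# Hypothetical historical records: (grid cell, offence type)
history = [
    ("A1", "theft"), ("A1", "assault"), ("A1", "theft"),
    ("B2", "theft"), ("C3", "vandalism"), ("A1", "theft"),
    ("B2", "assault"),
]

print(flag_hotspots(history))  # cells ranked by recorded incident count
```

Note that the ranking depends entirely on what was *recorded*, which is precisely why the quality of the underlying data matters so much.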

The issue of human bias

Although this tool is presented as an objective way to visualise possible future outcomes, a fundamentally flawed assumption is built into these applications. They run on algorithms trained on data fed by humans, data that carries human biases, and this inevitably produces algorithmic bias. It is easy to see how historical human biases have been, or may be, transmitted to these criminal-justice tools: biased data sets produce salient patterns, and the black-box nature of the algorithms hides how those patterns are used.
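The amplification mechanism is worth making concrete. The toy simulation below, under assumed parameters of my own choosing (the area names, patrol counts, and detection rate are all hypothetical), shows one commonly discussed feedback loop: areas with more recorded crime get more patrols, more patrols detect and record more offences there, and the skew in the data persists even when true offending is identical everywhere.

```python
def patrol_allocation(recorded, total_patrols=10):
    """Allocate patrols in proportion to historically recorded crime."""
    total = sum(recorded.values())
    return {area: round(total_patrols * n / total)
            for area, n in recorded.items()}

def record_next_year(recorded, detections_per_patrol=2):
    """More patrols in an area mean more offences are OBSERVED and
    recorded there -- even if true offending is equal across areas."""
    patrols = patrol_allocation(recorded)
    return {area: n + patrols[area] * detections_per_patrol
            for area, n in recorded.items()}

# Two hypothetical areas with equal true offending, but a historical
# record skewed by past over-policing of area_x
recorded = {"area_x": 80, "area_y": 20}
for _ in range(3):
    recorded = record_next_year(recorded)
print(recorded)  # the initial 4:1 skew persists; the absolute gap widens
```

The simulation is a sketch, not a model of any real deployment, but it captures why "neutral-looking" historical data can lock in a discriminatory pattern.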

In the U.S., the incarceration rate of Black people is five times that of white people, and in 12 U.S. states more than half the prison population is Black. Further, seven U.S. states maintain a Black/white disparity larger than 9 to 1. When data that looks neutral on its face but is heavily infected with racism is fed into these tools, there is a real risk that the applications will reinforce racial inequality in states where such trends predominate. They may likewise widen other historical inequalities, along whatever intersectional lines exist in a given region.

Undeniably, there is a significant risk that these applications will amplify prevailing biases and deepen intersectional divides based on race, sexuality, colour, age, and so on. If such amplification occurs, protected minorities will face unfair discrimination, violating their right to the equal protection of the law. The aftermath of such policing is a worsening of the plight of people from disadvantaged socioeconomic backgrounds, as these algorithms assign a greater criminal risk to their behaviour without any sound logical basis.

Violation of right to privacy

Predictive policing also has the potential to violate the right to privacy. Such violations occur where information is mined deeply to extract training data that is then used to construct criminal profiles, leading to unannounced interference in the social, economic, and other spheres of an individual's life. Social network analysis, too, has become a powerful tool in the hands of police departments. One such example is New Orleans, where the police collaborated with Palantir to collect and analyse data through its data-mining tools in the hope of reducing crime.

The New Orleans police traced people with suspected gang ties by outlining their criminal histories, analysing their social media, and predicting the likelihood that an individual would commit violence or become a victim of it. Such data collection is itself an invasion of privacy, but the greater invasion was that no resident of New Orleans knew about the agreement between Palantir and the police: it never passed through a public procurement process, so the entire operation of this predictive-policing tool was secret. Using data available on social media, without an individual's affirmative consent, to build data sets for predictive policing is a grave threat to the rights to life and privacy. When the state interferes in people's lives without even acknowledging their existence, by never telling them what data about them is being used, such an act is a sheer violation of privacy and, conceptually, the birth of an Orwellian era.

Similarly, a predictive-policing tool is now being used in the jails of Uttar Pradesh, India. An app called JARVIS, an AI-powered video analytics solution, is trained to predict and flag suspicious behaviour within prisons by analysing a person's body language. Activities such as frisking, unauthorised access, crowd gatherings, and violence can be detected. The application runs on 700 cameras positioned across 70 prisons. Although the aim is to dismantle the criminal networks active inside jails, the question is whether such a measure is proportional to the object sought. Continuous monitoring of prisoners, including female inmates and under-trials, may become a source of embarrassment for them. Moreover, their right to privacy is torn away simply because society has consigned them to its social dustbins and treats their rights as having lost validity. Reports have suggested that this type of monitoring not only violates privacy but breeds paranoia and self-conscious reflection among prisoners, making them defensive and, if anything, more violent.

In addition, these tools can become instruments of abuse and surveillance in the hands of whichever political party holds power in a given state. Through them, a government can entrench an authoritarian regime and create a surveillance state where every action is recorded, analysed, and then used to predict criminality, thereby infringing the right to privacy. Civil-rights activists are concerned that predictive policing will produce more unlawful and unchecked police interference. Such interference would enjoy apparent permissibility because predictive policing and Big Data are presented as a utopian model of objectivity, free from bias. If these tools are deployed in the current environment, the right to privacy is at the receiving end, dependent on the colourable effect of their application.

Need to rethink the viability

These applications have, at present, turned out to be a powerhouse of unconscionable human-rights abuses, and the institution needs fundamental change. If predictive-policing tools continue to operate uninterrupted and unquestioned, they will make a blunt mockery of human rights and cause irreparable damage to the foundations of an organic human society. As it stands today, predictive policing is a tool that threatens due process and the human rights of already discriminated-against classes of people. Using such tools may be the need of the hour in a scientific and technological environment; however, their use should be placed under the complete scrutiny of an independent agency that checks their use and abuse in any given state. Moreover, their use should be limited to spheres where there is no interference with an individual's personal data. A need therefore arises to rethink whether artificial intelligence should be incorporated into the criminal-justice system at all, or whether the transformative nature of criminal jurisprudence obliges it to follow changing patterns without questioning their inherent drawbacks.

The author is a law graduate.
