Steven Spielberg was not wrong with Minority Report, a film released in 2002 in which a police unit known as Precrime arrested criminals before they committed a crime. While the film set the action in 2054, it has not been necessary to wait that long to see this technology in operation.
This has been revealed by the newspaper The Guardian, which reported this week that the United Kingdom is working on a tool capable of predicting homicides and, more specifically, of determining which people are most likely to become murderers.
This will be possible thanks to an algorithm that powers a project commissioned under former Prime Minister Rishi Sunak and initially known as the Homicide Prediction Project.
In it, the data of thousands of people (between 100,000 and 500,000), drawn from several official sources including the Probation Service, will be analysed in detail in order to protect others through behavioural predictions.
However, the idea put forward by the Ministry of Justice (MoJ), the Home Office, Greater Manchester Police (GMP) and London's Metropolitan Police is very different from anything seen so far, and it would fit perfectly in any science-fiction film or series.
How an algorithm can predict murders
The project to identify criminal behaviour through algorithms was created with the purpose of "exploring the potential of the data sets" managed by the British Ministry of Justice itself to assess the risk of homicide among certain individuals.
Once this data has been gathered, the ministry's data science team "develops models" of artificial intelligence that search that information, which brings together data on suspects, victims, witnesses, missing persons and vulnerable people, for the strongest indicators of the risk of committing homicide.
In this way, the project, now known as Data Exchange to Improve Risk Assessment, will collect very sensitive data and will focus on so-called health markers, which include the mental health problems of the people analysed, any addictions, and whether they have self-harmed.
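The ministry has not published how its models work, but risk-assessment tools of this kind are typically classifiers that turn tabular records (convictions, probation history, health markers) into a probability score. The sketch below is purely illustrative: the feature names, weights and threshold are invented for this example and have no connection to the real MoJ system.

```python
import math

def risk_score(features, weights, bias=-3.0):
    """Hypothetical logistic-style risk score in (0, 1).

    Weighted sum of tabular features passed through a sigmoid,
    the general shape of many actuarial risk-assessment models.
    """
    z = bias + sum(weights[k] * features.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

# Invented weights for illustration only.
weights = {
    "prior_violent_convictions": 0.9,
    "age_under_25": 0.4,
    "known_to_probation": 0.6,
}

person = {"prior_violent_convictions": 2, "age_under_25": 1, "known_to_probation": 1}
print(round(risk_score(person, weights), 3))  # → 0.45
```

A model like this is only as neutral as its inputs: if features such as prior convictions or probation contact correlate with ethnicity or poverty, the scores inherit that bias, which is precisely the objection critics raise below.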
The authorities have insisted that, in building this proposal, they have only used data from people with at least one criminal conviction and that, for the moment, it is only a research exercise and not a project that will materialise soon.
In any case, they fully trust its success and believe that this application of technology "will provide evidence to improve the risk assessment of serious crimes and, ultimately, will contribute to protecting the public through better analysis", according to a spokesperson for the Ministry of Justice.
The project endangers people's privacy
The organisation responsible for bringing the British authorities' intentions to light is Statewatch, a group that promotes critical research, much of it on policing, and supports investigative journalism; it has questioned the reliability and ethics of this application of AI.
Activists, privacy advocates and members of Statewatch agree that this project is "chilling and dystopian" and that it endangers people's privacy and integrity. They have also insisted that these practices would only encourage biased predictions against ethnic minorities and poor people.
"Time and again, research shows that algorithmic systems for predicting crime have inherent flaws. This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination that underpins the criminal legal system," said Statewatch researcher Sofia Lyall.
The organisation's representative has insisted that the Ministry of Justice must "immediately" halt the development of this murder-prediction tool and invest instead in "genuinely supportive" social welfare services.
"Cutting social assistance while investing in quick fixes will only further undermine people's safety and well-being," the expert concluded.
Learn how we work at NoticiasVE.
Tags: United Kingdom, Artificial Intelligence