In the summer of 1994, police found three bodies in a house in a quiet Miami neighborhood. At the crime scene there was also a video camera containing a complete recording of the murders. Three weeks later, that footage led investigators to identify one of the alleged culprits: Pablo Ibar, a Spaniard who had previously been arrested for a weapons-related offense. With no evidence other than that blurry recording, Pablo was tried and sentenced to death. His story now comes to Movistar Series, told across the 25 years in which he has been trying to prove his innocence.
The case of Pablo Ibar was as high-profile in Florida as the Marquises of Urquijo murders were in Spain. Yet his story did not reach Spain until a few years ago, when Nacho Carretero, the journalist behind Fariña, took an interest in the case and began an investigation that would take him seven years.
Thanks to Nacho’s work, today we have the four-episode series ‘On Death Row’, complemented by five podcasts. The series is a creation of Bambú Producciones and Movistar+ and focuses not on proving whether Pablo is innocent, but on the insufficiency of the evidence used to convict him.
Following this lead, investigators reached the Spaniard, arrested him, and sent him to death row without further ado. At this point in the story we ask ourselves: can incriminating technologies be trusted? Twenty-five years ago it was a blurry video recording that incriminated Pablo; today it is no longer cameras but artificial intelligence that plays the role of a cop to catch criminals.
In recent years we have read news stories about a virtual assistant witnessing a murder, but how real can this be? The film Minority Report warned us back in 2002: leaving everything in the hands of an algorithm is not advisable, especially when a person’s innocence is at stake. But why is this happening? According to consulting firms with expertise in this area, such as Oliver Wyman, using artificial intelligence to prevent banking crimes is very effective. However, it is hard to guarantee the same for violent crimes.
Other studies, such as the paper signed by several Boston University professors, “Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings”, demonstrate that machine learning systems carry sexist and social biases.
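The paper’s headline finding can be reproduced in miniature: word embeddings answer analogies like “man is to programmer as woman is to x” by vector arithmetic, and biased training data pushes x toward stereotyped words. The sketch below uses invented toy vectors purely for illustration (the actual paper analyzes word2vec embeddings trained on Google News):

```python
import numpy as np

# Hand-made 4-dimensional "embeddings", invented for this example.
# The first coordinate plays the role of a gender direction; the
# occupation vectors are deliberately skewed along it, mimicking the
# bias the paper measured in real embeddings.
emb = {
    "man":        np.array([ 1.0, 0.0, 0.2, 0.1]),
    "woman":      np.array([-1.0, 0.0, 0.2, 0.1]),
    "programmer": np.array([ 0.8, 1.0, 0.0, 0.3]),  # skewed toward "man"
    "homemaker":  np.array([-0.9, 0.9, 0.1, 0.2]),  # skewed toward "woman"
    "nurse":      np.array([-0.8, 0.8, 0.2, 0.1]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c, vocab):
    """Answer 'a is to b as c is to ?' by finding the word closest
    to the vector b - a + c, excluding the query words themselves."""
    target = emb[b] - emb[a] + emb[c]
    candidates = [w for w in vocab if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

# "man is to programmer as woman is to ...?"
print(analogy("man", "programmer", "woman", emb))  # -> homemaker
```

Because the toy occupation vectors were built with a gender skew, the arithmetic lands on the stereotyped answer, which is exactly the kind of output the paper’s debiasing method is designed to remove.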
Likewise, the mathematician Cathy O’Neil states that algorithms are “opinions enclosed in mathematics”, so depending on who builds them they will give one result or another. “We tend to think that algorithms are neutral, but they are not. Biases are structural and systemic; they have little to do with an individual decision,” says Professor Virginia Eubanks, author of the book Automating Inequality, which examines how algorithms profile, control, and punish the less affluent social classes.
At this point, with the series On Death Row as our starting point, we ask ourselves: are technologies truly helpful in catching criminals? Or does it take more than a blurry recording or an algorithm to accuse a person?