In 1956, the great Philip K. Dick published “The Minority Report,” a story that imagined a future in which crimes were predicted and stopped before they happened. Today, in 2024, that dystopian vision is starting to look uncomfortably like reality in South Korea. Dejaview, an artificial intelligence system, promises to revolutionize crime prevention. But are we ready for its implications?
The Precogs¹ are among us
The concept of predictive crime prevention has fascinated writers, philosophers, and scientists for decades. That is why what is taking shape in the laboratories of South Korea's Electronics and Telecommunications Research Institute (ETRI) is so interesting.
But what exactly is Dejaview? It is an advanced AI system that analyzes CCTV camera feeds in real time. It does more than observe: it interprets, predicts, and warns. Using sophisticated machine learning algorithms, Dejaview weighs a myriad of factors (from time of day and geographic location to criminal history and environmental conditions) to calculate the probability of a crime occurring.
How Crime Prediction Works
The system operates on two main fronts:
- Time- and space-based crime prediction: Dejaview analyzes the characteristics of specific locations and compares them with historical crime data. If an isolated area shares characteristics with places where crimes have occurred in the past, the system classifies it as high risk.
- Individual recidivism prediction: the system tracks the movements of individuals considered at high risk of reoffending, analyzing their behavioral patterns to predict potential future crimes.
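To make the first of these two fronts concrete, here is a minimal sketch of what a time-and-space risk score could look like. To be clear: this is purely illustrative. ETRI has not published Dejaview's internals, and every name, weight, and threshold below is an assumption of mine, not the real system.

```python
from dataclasses import dataclass

@dataclass
class LocationProfile:
    """Hypothetical features for a monitored location (not ETRI's schema)."""
    hour: int             # time of day, 0-23
    is_isolated: bool     # sparse foot traffic, like the 'isolated areas' above
    past_incidents: int   # crimes recorded nearby in the historical data

def location_risk(loc: LocationProfile) -> float:
    """Toy risk score in [0, 1]: assumed weights, for illustration only."""
    score = 0.0
    if loc.hour >= 22 or loc.hour < 5:   # late-night hours weigh more
        score += 0.3
    if loc.is_isolated:                  # isolation resembles past crime scenes
        score += 0.3
    # saturating contribution from the historical incident count
    score += min(loc.past_incidents, 10) / 10 * 0.4
    return min(score, 1.0)

# A dark, isolated alley with a few past incidents scores as high risk.
alley = LocationProfile(hour=23, is_isolated=True, past_incidents=4)
print(round(location_risk(alley), 2))  # 0.76
```

A real system would presumably learn such weights from data (those 32,000 clips) rather than hand-code them, which is exactly why the questions about dataset bias below matter: whatever patterns the training footage contains, the score inherits.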
Where did Dejaview “study” to learn crime prevention?
The system was trained on a large dataset of more than 32,000 CCTV clips capturing a variety of incidents over a three-year period. This approach raises questions about the representativeness of the dataset and the potential biases embedded in the system.
Promising results, pressing questions
In field tests conducted in Seoul's Seocho District, Dejaview demonstrated 82.8% accuracy in predictive crime mapping. An impressive figure that nonetheless raises a crucial question: what about the remaining 17.2%?
What are the ethical implications of a system that labels people as potential criminals before they commit a crime?
South Korea is not alone in this race towards predictive crime prevention. In Argentina, for example, a new AI unit has been established that aims to prevent, detect and investigate crimes using specialized algorithms. Their approach goes beyond CCTV analysis, including data from social media, websites and even the dark web.
Ethical and social implications
The implementation of systems like Dejaview raises many ethical questions. I see essentially three. I ask myself, and above all I ask you:
- Privacy: To what extent are we willing to sacrifice our privacy for security? A question I have also asked you on another occasion.
- Presumption of innocence: How can this fundamental principle be reconciled with a system that “pre-judges” people’s intentions?
- Technological determinism: Are we in danger of creating a society in which people's future actions are judged by their past or their environment?
Let me know what you think on Futuro Prossimo's social channels.
The Future of Crime Prevention
For now, Dejaview's application appears to be limited to public safety infrastructure such as airports, power plants, and factories. Commercial use for specialized security agencies is expected by the end of 2025. But it's easy to imagine how this technology could expand in the future.
Dejaview and similar technologies represent a powerful tool in crime prevention. However, like any powerful tool, its value will depend on how we choose to use it. It will be essential to establish strong ethical guidelines and oversight mechanisms to ensure that these technologies are used for the common good without compromising the fundamental principles of justice and freedom on which our societies are based.
Philip K. Dick imagined a future in which crime prediction led to a dystopian system: it is up to us to ensure that reality turns out more enlightened. Because the risk of sliding from crime prevention into the repression of dissent is not science fiction.
- Precogs are individuals with precognitive powers who can foresee future crimes. In the film “Minority Report,” they are three people the police use to prevent murders before they happen. ↩︎