Spain overhauls domestic violence system after criticism


Francisco Javier Curto, a commander for the military police in Seville who oversees gender violence incidents in the province, in his office in the Spanish city on March 30, 2024. The Spanish government announced a major overhaul to a program in which police rely on an algorithm to identify potential repeat victims of domestic violence, after officials faced questions about the system’s effectiveness. — ©2025 The New York Times Company

LONDON: The Spanish government this week announced a major overhaul to a program in which police rely on an algorithm to identify potential repeat victims of domestic violence, after officials faced questions about the system’s effectiveness.

The program, VioGén, requires police officers to ask a victim a series of questions. Answers are entered into a software program that produces a score – from no risk to extreme risk – intended to flag the women who are most vulnerable to repeat abuse. The score helps determine what police protection and other services a woman can receive.

A New York Times investigation last year found that the police were highly reliant on the technology, almost always accepting the decisions made by the VioGén software. Some women whom the algorithm labeled at no risk or low risk for more harm later experienced further abuse, including dozens who were murdered, the Times found.

Spanish officials said the changes announced this week were part of a long-planned update to the system, which was introduced in 2007. They said the software had helped police departments with limited resources protect vulnerable women and reduce the number of repeat attacks.

In the updated system, VioGén 2, the software will no longer be able to label women as facing no risk. Police must also enter more information about a victim, which officials said would lead to more accurate predictions.

Other changes are intended to improve collaboration among government agencies involved in cases of violence against women, including making it easier to share information. In some cases, victims will receive personalised protection plans.

“Machismo is knocking at our doors and doing so with a violence unlike anything we have seen in a long time,” Ana Redondo, the minister of equality, said at a news conference on Wednesday. “It’s not the time to take a step back. It’s time to take a leap forward.”

Spain’s use of an algorithm to guide the treatment of gender violence is a far-reaching example of how governments are turning to algorithms to make important societal decisions, a trend that is expected to grow with the use of artificial intelligence. The system has been studied as a potential model for governments elsewhere that are trying to combat violence against women.

VioGén was created with the belief that an algorithm based on a mathematical model can serve as an unbiased tool to help police find and protect women who may otherwise be missed. The yes-or-no questions include: Was a weapon used? Were there economic problems? Has the aggressor shown controlling behaviours?
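The Times investigation does not disclose VioGén’s actual questions, weights or cutoffs, so the following is only a minimal sketch of how a questionnaire-based risk score of this general shape might be computed. The question names, weights and thresholds below are invented for illustration.

```python
# Hypothetical sketch of a questionnaire-based risk classifier.
# VioGén's real questions, weights and thresholds are not public;
# every value below is invented for illustration only.

RISK_LEVELS = ["no risk", "low", "medium", "high", "extreme"]

# Invented weights for a few yes/no indicators of the kind the
# article mentions (weapon use, economic problems, controlling behaviour).
QUESTION_WEIGHTS = {
    "weapon_used": 3,
    "economic_problems": 1,
    "controlling_behaviour": 2,
}

# Invented cutoffs: a total score above THRESHOLDS[i] reaches tier i + 1.
THRESHOLDS = [0, 1, 3, 5]


def assess_risk(answers: dict[str, bool]) -> str:
    """Sum the weights of 'yes' answers and map the total to a risk tier."""
    score = sum(
        weight
        for question, weight in QUESTION_WEIGHTS.items()
        if answers.get(question, False)
    )
    level = 0
    for i, threshold in enumerate(THRESHOLDS):
        if score > threshold:
            level = i + 1
    return RISK_LEVELS[level]


if __name__ == "__main__":
    case = {"weapon_used": True, "controlling_behaviour": True}
    print(assess_risk(case))  # -> "high" under these invented weights
```

One design point the update addresses directly: in a scheme like this, a score of zero falls through to the lowest tier, which is exactly the “no risk” label that VioGén 2 eliminates.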

Victims classified as higher risk received more protection, including regular police patrols near their homes, access to a shelter and police monitoring of their abuser’s movements. Those with lower scores got less aid.

As of November, Spain had more than 100,000 active cases of women who had been evaluated by VioGén, with about 85% of the victims classified as facing little risk of being hurt by their abuser again. Police officers in Spain are trained to overrule VioGén’s recommendations if evidence warrants doing so, but the Times found that the risk scores were accepted about 95% of the time.

Victoria Rosell, a judge in Spain and a former government delegate focused on gender violence issues, said a period of “self-criticism” was needed for the government to improve VioGén. She said the system could be more accurate if it pulled information from additional government databases, including health care and education systems.

Natalia Morlas, president of Somos Más, a victims’ rights group, said she welcomed the changes, which she hoped would lead to better risk assessments by the police.

“Calibrating the victim’s risk well is so important that it can save lives,” Morlas said. She added that it was critical to maintain close human oversight of the system because a victim “has to be treated by people, not by machines”.

©2025 The New York Times Company
