In January, Lina went to the police. Her ex-partner had been threatening her at home in the Spanish seaside town of Benalmádena. That day, he'd allegedly raised his hand as if to hit her.

"There had been violent episodes - she was scared," Lina's cousin Daniel recalls.

When she got to the police station, she was interviewed and her case registered with VioGén, a digital tool which assesses the likelihood of a woman being attacked again by the same man.

VioGén - an algorithm-based system - asks 35 questions about the abuse and its intensity, the aggressor's access to weapons, his mental health and whether the woman has left, or is considering leaving, the relationship. It then records the threat to her as "negligible", "low", "medium", "high" or "extreme". The category is used to make decisions about the allocation of police resources to protect the woman.

Lina was deemed to be at "medium" risk.

She asked for a restraining order at a specialist gender violence court in Malaga, so that her ex-partner couldn't contact her or share her living space. The request was denied.

"Lina wanted to change the locks at her home, so she could live peacefully with her children," says her cousin.

Three weeks later, she was dead. Her ex-partner had allegedly used his key to enter her flat, and soon the house was on fire. While her children, mother and ex-partner all escaped, Lina didn't. Her 11-year-old son was widely reported as telling police it was his father who killed his mother.

Lina's body was retrieved from the charred interior of her home. Her ex-partner, the father of her three youngest children, was arrested.

Now, her death is raising questions about VioGén and its ability to keep women safe in Spain.

VioGén didn't accurately predict the threat to Lina. For a woman designated at "medium" risk, the protocol is a follow-up by a nominated police officer within 30 days. But Lina was dead before that happened.
If she had been "high" risk, the police follow-up would have happened within a week. Could that have made a difference to Lina?

Tools to evaluate the threat of repeat domestic violence are used in North America and across Europe. In the UK, some police forces use DARA (Domestic Abuse Risk Assessment) - essentially a checklist. And DASH (Domestic Abuse, Stalking, Harassment and Honour-based Violence Assessment) may be employed by police or others, like social workers, to assess the risk of another attack.

But only in Spain is an algorithm woven so tightly into police practice. VioGén was developed by Spanish police and academics. It's used everywhere apart from the Basque Country and Catalonia, which have separate systems, although police co-operation is nationwide.

The head of the National Police's family and women's unit in Malaga, Ch Insp Isabel Espejo, describes VioGén as "super-important". "It helps us follow each victim's case very precisely," she says.

Her officers deal with an average of 10 reports of gender violence a day. And every month, VioGén classifies nine or 10 women as being at "extreme" risk of repeat victimisation. The resource implications in those cases are huge: 24-hour police protection for a woman until the circumstances change and the risk decreases. Women assessed as "high" risk may also get an officer escort.

A 2014 study found that officers accepted VioGén's evaluation of the likelihood of repeated abuse 95% of the time. Critics suggest police are abdicating decision-making about women's safety to an algorithm.

Ch Insp Espejo says that the algorithm's calculation of risk is usually adequate. But she recognises - even though Lina's case wasn't under her command - that something went wrong with Lina's assessment.

"I'm not going to say VioGén doesn't fail - it does. But this wasn't the trigger that led to this woman's murder. The only guilty party is the person who killed Lina. Total security just doesn't exist," she says.
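VioGén's actual questionnaire weights and scoring thresholds are not public, so the following is purely an illustrative sketch, not the real system: it shows the general shape of a checklist-based triage tool like those described above, in which answers are scored, the total is bucketed into a risk tier, and the tier determines the follow-up protocol. The question names, weights and score boundaries here are all hypothetical; only the tier labels and the 30-day/7-day/round-the-clock protocols come from the reporting.

```python
from __future__ import annotations

# Illustrative sketch only: VioGén's real questions, weights and
# thresholds are not public. Tier labels and follow-up protocols
# are taken from the article; all numbers are invented.
RISK_TIERS = [  # (minimum score, tier label, follow-up interval in days)
    (0, "negligible", None),   # category reportedly being phased out
    (5, "low", 60),            # hypothetical interval
    (10, "medium", 30),        # article: follow-up within 30 days
    (20, "high", 7),           # article: follow-up within a week
    (30, "extreme", 0),        # article: 24-hour police protection
]

def assess(answers: dict[str, int]) -> tuple[str, int | None]:
    """Score yes/no (1/0) answers with hypothetical weights and
    return (risk tier, follow-up interval in days)."""
    weights = {  # hypothetical weights, not VioGén's
        "prior_violence": 8,
        "access_to_weapons": 10,
        "threats_escalating": 6,
        "victim_leaving_relationship": 6,
        "aggressor_mental_health_issues": 4,
    }
    score = sum(weights[q] * answers.get(q, 0) for q in weights)
    # Take the highest tier whose minimum score is reached.
    tier, follow_up = RISK_TIERS[0][1], RISK_TIERS[0][2]
    for minimum, label, days in RISK_TIERS:
        if score >= minimum:
            tier, follow_up = label, days
    return tier, follow_up

case = {"prior_violence": 1, "threats_escalating": 1}
print(assess(case))  # a 14-point case lands in the "medium" tier: ('medium', 30)
```

The design point the critics raise is visible even in this toy version: once the score is bucketed, everything downstream (follow-up interval, escort, protection) flows from the tier, so an assessment one bucket too low quietly de-prioritises the case.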
But at "medium" risk, Lina was never a police priority. And did Lina's VioGén assessment have an impact on the court's decision to deny her a restraining order against her ex-partner?

Court authorities didn't give us permission to meet the judge who denied Lina the order - a woman attacked on social media after Lina's death. Instead, another of Malaga's gender violence judges, Maria del Carmen Gutiérrez, tells us in general terms that such an order needs two things: evidence of a crime and the threat of serious danger to the victim.

"VioGén is one element I use to assess that danger, but it's far from the only one," she says.

Sometimes, the judge says, she makes restraining orders in cases where VioGén has assessed a woman as at "negligible" or "low" risk. On other occasions she may conclude there's no danger to a woman deemed at "medium" or "high" risk of repeat victimisation.

Dr Juan Jose Medina, a criminologist at the University of Seville, says Spain has a "postcode lottery" for women applying for restraining orders - some jurisdictions are much more likely to grant them than others. But we don't know systematically how VioGén influences the courts, or the police, because the studies haven't been done.

"How are police officers and other stakeholders using this tool, and how is it informing their decision-making? We don't have good answers," he says.

Spain's interior ministry has rarely allowed academics access to VioGén's data, and there has been no independent audit of the algorithm. Gemma Galdon, the founder of Eticas - an organisation working on the social and ethical impact of technology - says that if you don't audit these systems, you won't know whether they're actually delivering police protection to the right women.

Examples of algorithmic bias elsewhere are well-documented.
In the US, a 2016 analysis of a recidivism tool found black defendants were more likely than their white peers to be incorrectly judged to be at higher risk of reoffending. At the same time, white defendants were more likely than black defendants to be wrongly flagged as low risk.

In 2018, Spain's interior ministry declined an Eticas proposal to conduct a confidential, pro-bono internal audit. So instead, Gemma Galdon and her colleagues decided to reverse-engineer VioGén and carry out an external audit. They used interviews with women survivors of domestic abuse and publicly available information - including data from the judiciary about women who, like Lina, had been killed.

They found that between 2003 and 2021, 71 women murdered by their partners or ex-partners had previously reported domestic abuse to the police. Those recorded on the VioGén system had been given risk levels of "negligible" or "medium".

"What we'd like to know is, were those error rates that cannot be mitigated in any way? Or could we have done something to improve how these systems assign risk and protect those women better?" asks Gemma Galdon.

The head of gender violence research at Spain's interior ministry, Juan José López-Ossorio, is dismissive of the Eticas investigation because it wasn't done with VioGén data. "If you don't have access to the data, how can you interpret it?" he says. And he is wary of an external audit, fearing it could compromise both the security of women whose cases are recorded and VioGén's procedures.

"What we know is that once a woman reports a man and she's under police protection, the probability of further violence is substantially lowered - we've no doubts about that," says López-Ossorio.

VioGén has evolved since it was introduced in Spain. The questionnaire has been refined, and the "negligible" category of risk will soon be abolished. And even critics agree it makes sense to have a standardised system for responding to gender violence.
In Benalmádena, Lina's home has become a shrine. Flowers, candles and pictures of saints have been left on the step, and a small poster stuck to the wall declares: Benalmádena says no to gender violence.

The community has fundraised for Lina's children. Her cousin Daniel says everyone is still reeling from the news of her death.

"The family is destroyed - especially Lina's mother," he says. "She's 82 years old. I don't think there's anything sadder than to have your daughter killed by an aggressor in a way that could have been avoided. The children are still in shock - they'll need a lot of psychological help."
Original Headline:
"Police algorithm said Lina was at 'medium' risk. Then she was killed"
TruthLens AI Suggested Headline:
"Critique of VioGén Algorithm Follows Domestic Violence Tragedy in Spain"
TruthLens AI Summary
In January, Lina sought help from the police in Benalmádena, Spain, after her ex-partner threatened her, raising fears for her safety. Despite previous violent episodes, her case was evaluated using VioGén, an algorithmic tool designed to assess the risk of repeat domestic violence. Lina was classified as being at 'medium' risk, which meant she was not an immediate priority for police protection: the protocol for that category is a follow-up within 30 days. She also requested a restraining order to keep her ex-partner away, but the court denied her request. Tragically, just three weeks later, Lina was dead; her ex-partner allegedly used his key to enter her home, which was then set on fire. Her children, mother and ex-partner escaped, but Lina did not, raising serious concerns about both the VioGén assessment and the judicial process that failed to protect her. Her 11-year-old son reportedly told police that his father killed his mother, compounding an already devastating situation for Lina's family.
The aftermath of Lina's death has ignited a broader debate about the VioGén algorithm and its role in police decision-making on domestic violence cases in Spain. Critics argue that reliance on such algorithms can lead to a dangerous abdication of responsibility by law enforcement when evaluating threats to women's safety. Police officials maintain that VioGén is a valuable tool, but they acknowledge that it is not infallible and that Lina's case exposes flaws in its application. Her 'medium' risk assessment did not trigger the faster follow-up or protective measures that might have saved her life. Experts are calling for an independent audit of the system to test its effectiveness and to address any biases that affect the safety of victims. Lina's community has since mourned her loss, creating memorials and advocating for better protection against domestic violence, underlining the urgent need for reform in both policing practices and judicial responses to such cases.
TruthLens AI Analysis
The article presents a tragic narrative surrounding the death of Lina, a woman who sought police protection from her abusive ex-partner. It raises critical questions about the efficacy of the VioGén algorithm, which is designed to assess the risk of repeat violence against women who report abuse. The incident reveals potential shortcomings in how such systems evaluate threats and allocate resources, highlighting a significant gap in the protection of vulnerable individuals.
Implications of the Algorithm Failure
The use of the VioGén system, which categorized Lina as a "medium" risk, raises concerns about the reliability of algorithm-based assessments in domestic violence cases. The failure to predict the actual risk she faced has broader implications for policy and law enforcement practices. It suggests a potential over-reliance on technology in assessing human threats, which can lead to devastating outcomes, as seen in Lina’s case.
Public Sentiment and Awareness
This report is likely intended to increase public awareness about the limitations of algorithmic assessments in domestic violence situations. By sharing Lina's story, the article aims to evoke empathy and provoke outrage regarding systemic failures in protecting women from violence. It potentially encourages discourse on the necessity for improved training and resources for law enforcement when dealing with domestic abuse cases.
Underlying Issues and Hidden Agendas
While the article focuses on a specific incident, it may also serve to draw attention to broader systemic issues within law enforcement and social services related to domestic violence. This could suggest a need for policy reform, better training for police officers, and a reevaluation of how risk assessments are conducted. However, it does not appear to hide significant information but instead sheds light on a critical societal issue.
Manipulative Elements
The article employs emotional language and vivid descriptions to draw readers into the tragedy of Lina's story, potentially amplifying its impact. While this approach can be seen as manipulative, it may also be necessary to convey the gravity of domestic violence issues effectively. The focus on Lina's family and the circumstances of her death serves to humanize the statistics surrounding domestic violence.
Comparative Context
When compared to other news articles on domestic violence, this piece closely aligns with narratives that emphasize the need for systemic change and greater protection for victims. It may connect with ongoing discussions about women's rights and safety in various regions, particularly in contexts where similar algorithmic tools are in use.
Socioeconomic and Political Effects
This article could catalyze discussions about funding and resources for domestic violence prevention programs, influencing political agendas and community support initiatives. It may lead to increased pressure on lawmakers to ensure better protection mechanisms for victims of domestic violence.
Target Audience
The narrative likely resonates with a wide range of communities, especially those advocating for women's rights, gender equality, and victims of domestic violence. It serves to raise collective consciousness and may stimulate activism around these issues.
Market Impact
While the immediate economic implications of this article may be limited, the broader discussions it sparks could affect organizations involved in social services and law enforcement. Companies that provide technology solutions for public safety might also face scrutiny regarding their products' effectiveness.
Global Relevance
This situation reflects a larger global conversation about domestic violence and the responsibility of law enforcement agencies to protect vulnerable populations. It aligns with current discussions on gender-based violence in various parts of the world, emphasizing the need for systemic improvements in how such cases are handled.
Artificial Intelligence Considerations
The article itself does not appear to have been written by AI, but its subject matter - algorithmic risk assessment - raises questions about the role technology plays in shaping decision-making within law enforcement. The influence of automated scoring is evident in VioGén's methodology, which lacks the nuance of human judgment.
The overall reliability of the article seems strong, as it highlights an urgent and complex social issue backed by a specific case. However, its emotional tone and focus on one individual’s tragedy could lead to potential biases in understanding the broader context of domestic violence.