An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.

https://www.nytimes.com/interactive/2024/07/18/technology/spain-domestic-violence-viogen-algorithm.html

4 Comments

  1. Blueberry_Conscious_ on

    It sounds like something out of Black Mirror:

    Before Ms. Hemid left the station that night, the police had to determine if she was in danger of being attacked again and needed support. A police officer clicked through 35 yes-or-no questions — Was a weapon used? Were there economic problems? Has the aggressor shown controlling behaviors? — to feed into an algorithm called VioGén that would help generate an answer.

    VioGén produced a score: LOW RISK (Lobna Hemid, Madrid, 2022).

    The police accepted the software’s judgment and Ms. Hemid went home with no further protection. Mr. el Banaisati, who was imprisoned that night, was released the next day. Seven weeks later, he [fatally stabbed](https://www.elconfidencial.com/espana/madrid/2022-03-03/mujer-muerta-pozuelo-madrid-cita-juicio-por-divorcio_3385169/) Ms. Hemid several times in the chest and abdomen before killing himself. She was 32 years old.

    Working in tech and coming from a country where women are killed weekly by abusive domestic partners or family members, I sometimes see how tech can be part of solutions to many social issues. In this one, it failed dismally.
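    The quoted passage describes the general pattern: a fixed questionnaire feeds a model that emits a coarse risk label. A minimal sketch of that pattern, with a purely additive score — the weights, thresholds, and question subset here are all invented for illustration; the actual VioGén model is not public in this detail:

    ```python
    # Hypothetical sketch of a questionnaire-based risk classifier, loosely
    # patterned on the workflow the article describes (yes/no answers in,
    # coarse risk label out). All weights and thresholds are made up.

    def risk_score(answers, weights):
        """Sum the weights of every question answered 'yes'."""
        return sum(w for a, w in zip(answers, weights) if a)

    def risk_level(score, thresholds=(2, 5, 8)):
        """Map a numeric score onto a coarse label."""
        low, medium, high = thresholds
        if score >= high:
            return "EXTREME"
        if score >= medium:
            return "HIGH"
        if score >= low:
            return "MEDIUM"
        return "LOW"

    # Invented weights for 4 of the 35 questions, e.g.
    # [weapon used, controlling behavior, prior violence, economic problems]
    weights = [3, 1, 2, 1]

    # A case with only "controlling behavior" flagged scores 1 and lands
    # below every threshold -- one way an additive model can understate
    # the danger in an individual case.
    answers = [False, True, False, False]
    print(risk_level(risk_score(answers, weights)))  # LOW
    ```

    The sketch also shows why such systems are fragile: the output label depends entirely on hand-set weights and cutoffs, so a case scoring just under a threshold gets the same treatment as one with no risk factors at all.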

  2. An algorithm needs good data, but it has never been easy to obtain data on gun deaths in the US.
    The NRA has done a great job of obstructing any such scientific work.

  3. MYDOGSMOKES5MEODMT on

    This area of usage seems like the very very very very last bastion you would use something like this on.

    Like not until you have even the teeniest wrinkles smoothed out do you put something like this into full production — and even then you err on the side of caution. What the fuck.

  4. Blueberry_Conscious_ on

    This also infuriates me: “After Spain passed a law in 2004 to address violence against women, the government assembled experts in statistics, psychology and other fields to find an answer. Their goal was to create a statistical model to [identify women most at risk of abuse](https://www.nytimes.com/2011/02/24/world/europe/24iht-spain.html) and to outline a standardized response to protect them.”

    How about an algorithm to predict abusers (yeah, I know, a bit too Clockwork Orange) and implement mental health and anger management programs to stop men from abusing women?