Training data that is meant to make predictive policing less biased is still racist
Arrest data skews predictive tools because police are known to arrest more people in Black and other minority neighborhoods; algorithms trained on that data then direct more policing to those areas, producing still more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
·technologyreview.com·
Police Built an AI To Predict Violent Crime. It Was Seriously Flawed
A Home Office-funded project that used artificial intelligence to predict gun and knife crime was found to be wildly inaccurate. “Basing our arguments on inaccuracy is problematic because the tech deficiencies are solvable through time. Even if the algorithm was set to be 100 percent accurate, there would still be bias in this system.”
·wired.co.uk·