Data, Tech & Black Communities

#2020 #"Crime and Justice"
Councils scrapping use of algorithms in benefit and welfare decisions
Call for more transparency on how such tools are used in public services as 20 councils stop using computer algorithms. The Home Office recently stopped using an algorithm to help decide visa applications after allegations that it contained “entrenched racism”.
·theguardian.com·
Black Lives Matter: How You Can Help in Scotland
As the Black Lives Matter protests rage on in the US, many people in Scotland are wondering what they can do to show solidarity with the movement. Between 2000 and 2013, there were more race-related murders per capita in Scotland than in the rest of the UK – 1.8 murders per million, compared to 1.3, according to the Coalition for Racial Equality and Rights (CRER).
·sundaypost.com·
Meaningful Transparency and (in)visible Algorithms
Can transparency bring accountability to public-sector algorithmic decision-making (ADM) systems? High-profile retractions have taken place against a backdrop of shifting public sentiment towards greater scepticism and mistrust of ‘black box’ technologies, evidenced in growing awareness of the risks that potentially invasive profiling poses to citizens.
·adalovelaceinstitute.org·
Police Built an AI To Predict Violent Crime. It Was Seriously Flawed
A Home Office-funded project that used artificial intelligence to predict gun and knife crime was found to be wildly inaccurate. “Basing our arguments on inaccuracy is problematic because the tech deficiencies are solvable through time. Even if the algorithm was set to be 100 percent accurate, there would still be bias in this system.”
·wired.co.uk·