Data, Tech & Black Communities

#Algorithms
End-User Audits: A System Empowering Communities to Lead Large-Scale Investigations of Harmful Algorithmic Behavior
Today, technical experts hold the tools to conduct system-scale algorithm audits, so they largely decide what algorithmic harms are surfaced. Our #cscw2022 paper asks: how could *everyday users* explore where a system disagrees with their perspectives?
tgyateng69 · hci.stanford.edu
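The core idea invites a small illustration. The sketch below is hypothetical and is not the paper's actual system: it simply surfaces the comments where a model's scores diverge most from one user's own ratings, the seed of an end-user audit. All comments, scores, and values are made up.

```python
# Toy sketch of the end-user auditing idea (not the paper's system):
# compare a model's scores against one user's own ratings and surface
# the items where they disagree most. All data here is hypothetical.

# Model's toxicity scores for a handful of comments, on a 0-1 scale.
model_scores = {
    "you people always do this": 0.35,
    "great point, thanks for sharing": 0.05,
    "go back to where you came from": 0.40,
    "this take is so dumb": 0.80,
}

# The same comments as rated by an individual user, on the same scale.
user_ratings = {
    "you people always do this": 0.90,
    "great point, thanks for sharing": 0.05,
    "go back to where you came from": 0.95,
    "this take is so dumb": 0.60,
}

# Rank comments by disagreement; large positive gaps are harms the
# model under-flags relative to this user's perspective.
gaps = sorted(
    ((user_ratings[c] - score, c) for c, score in model_scores.items()),
    reverse=True,
)
for gap, comment in gaps:
    print(f"{gap:+.2f}  {comment}")
```

Aggregating such disagreement reports across the members of a community is what would turn a personal spot check into the kind of large-scale, community-led investigation the paper describes.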
A Drug Addiction Risk Algorithm and Its Grim Toll on Chronic Pain Sufferers
A sweeping risk-scoring algorithm has become central to how the US handles the opioid crisis. It may only be making the crisis worse. With the creeping privatisation of our National Health Service, could the UK see similar opaque algorithms creating barriers here?
wired.com
Deb Raji on Twitter
These are the four most popular misconceptions people have about race & gender bias in algorithms. I'm wary of wading into this conversation again, but it's important to acknowledge the research that refutes each point, despite it feeling counter-intuitive. Let me clarify. 👇🏾 https://t.co/WdzmnGLaFm — Deb Raji (@rajiinio), March 27, 2021
tgyateng69 · twitter.com
"Any data scientist working to automate issues of a social nature, in effect, is engaged in making moral and ethical decisions – they are not simply dealing with purely technical work but with a practice that actively impacts individual people." - Abeba Birhane
"Any data scientist working to automate issues of a social nature, in effect, is engaged in making moral and ethical decisions – they are not simply dealing with purely technical work but with a practice that actively impacts individual people." - Abeba Birhane
"Any data scientist working to automate issues of a social nature, in effect, is engaged in making moral and ethical decisions – they are not simply dealing with purely technical work but with a practice that actively impacts individual people." - Abeba Birhane
up.raindrop.io
The coming war on the hidden algorithms that trap people in poverty
A growing group of lawyers are uncovering, navigating, and fighting the automated systems that deny the poor housing, jobs, and basic services. Increasingly, the fight over a client's eligibility involves some kind of algorithm. "In some cases, it probably should just be shut down because there's no way to make it equitable."
technologyreview.com
Training data that is meant to make predictive policing less biased is still racist
Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods; the algorithm then directs more policing to those areas, which produces more arrests, a feedback loop sketched in the code below. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
technologyreview.com
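As a rough illustration of that feedback loop, here is a minimal simulation; the proportional-allocation rule and every number in it are assumptions of this sketch, not taken from the article or the underlying research.

```python
import random

random.seed(0)

# Sketch assumptions: both areas have the SAME true crime rate; the
# only difference is the historical arrest record the tool learns from.
TRUE_CRIME_RATE = 0.1
PATROLS_PER_DAY = 100
arrests = {"A": 60, "B": 40}  # area A starts over-represented in the data

for _ in range(365):
    history = dict(arrests)  # the tool only sees past records
    total = sum(history.values())
    for area in arrests:
        # Predictive tool: allocate patrols in proportion to past arrests.
        patrols = round(PATROLS_PER_DAY * history[area] / total)
        # New arrests scale with patrol presence, not with any real
        # difference in crime (there is none in this sketch).
        arrests[area] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols)
        )

share = arrests["A"] / sum(arrests.values())
print(f"Area A began with 60% of recorded arrests and, a year on, "
      f"still accounts for about {share:.0%}.")
```

Because patrols follow past arrests and arrests follow patrols, the initial 60/40 disparity in the records reproduces itself indefinitely even though the two areas' true crime rates are identical; collecting more data of this kind never corrects the misallocation.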
Police Built an AI To Predict Violent Crime. It Was Seriously Flawed
A Home Office-funded project that used artificial intelligence to predict gun and knife crime was found to be wildly inaccurate. “Basing our arguments on inaccuracy is problematic because the tech deficiencies are solvable through time. Even if the algorithm was set to be 100 percent accurate, there would still be bias in this system.”
wired.co.uk
The Dark Side of Digitisation and the Dangers of Algorithmic Decision-Making - Abeba Birhane
As we hand over decision-making on social issues to automated systems developed by profit-driven corporations, we are not only allowing our social concerns to be dictated by the profit incentive but also handing over moral and ethical questions to the corporate world, argues Abeba Birhane.
theelephant.info
Algorithmic Colonisation of Africa - Abeba Birhane
Colonialism in the age of Artificial Intelligence takes the form of “state-of-the-art algorithms” and “AI driven solutions” unsuited to African problems, and hinders the development of local products, leaving the continent dependent on Western software and infrastructure.
theelephant.info