Data, Tech & Black Communities

406 bookmarks
What can we learn from the qualifications fiasco? – The ODI
Algorithms increasingly influence our digital lives – from the search results we see to the shows that Netflix recommends to us – but they are also encroaching on our real lives, and being used to make decisions that affect our futures.
tgyateng69·theodi.org·
Artificial Intelligence in Hiring: Assessing Impacts on Equality
The use of artificial intelligence (AI) presents risks to equality, potentially embedding bias and discrimination. Auditing tools are often promised as a solution. However, our new research, which examines tools for auditing AI used in recruitment, finds that these tools are often inadequate in ensuring compliance with UK Equality Law, good governance and best practice. We argue in this report that a more comprehensive approach than technical auditing is needed to safeguard equality in the use of AI for hiring, which shapes access to work. Here, we present first steps that could be taken to achieve this. We also publish a prototype AI Equality Impact Assessment, which we plan to develop and pilot.
·up.raindrop.io·
The coming war on the hidden algorithms that trap people in poverty
A growing group of lawyers is uncovering, navigating, and fighting the automated systems that deny the poor housing, jobs, and basic services. Increasingly, the fight over a client’s eligibility involves some kind of algorithm. “In some cases, it probably should just be shut down because there’s no way to make it equitable.”
·technologyreview.com·
Training data that is meant to make predictive policing less biased is still racist
Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods, which leads algorithms to direct more policing to those areas, which leads to more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
·technologyreview.com·
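A minimal, purely illustrative sketch (in Python) of the feedback loop described in the entry above. Nothing here comes from the article itself: the area names, arrest counts and the hot-spot allocation rule are assumptions made for illustration. It shows how a tool that sends patrols wherever recorded arrests are highest never corrects the original disparity in the data, even when the underlying crime rates of the two areas are identical.

```python
# Hypothetical sketch of the feedback loop: the area with the larger (biased)
# arrest history is flagged as the "hot spot" and receives most patrols;
# recorded arrests scale with patrol presence, not just with crime, so the
# designation reproduces itself year after year.

TRUE_CRIME_RATE = {"A": 0.05, "B": 0.05}   # identical underlying crime in both areas
arrests = {"A": 120, "B": 80}              # assumed biased arrest history

for year in range(1, 6):
    # The tool's "prediction": the area with more recorded arrests is the hot spot.
    hot_spot = max(arrests, key=arrests.get)
    # Hot-spot targeting: most of the 100 patrols go to the flagged area.
    patrols = {area: (80 if area == hot_spot else 20) for area in arrests}
    # Recorded arrests depend on where patrols are sent, not only on crime.
    arrests = {area: round(patrols[area] * TRUE_CRIME_RATE[area] * 100)
               for area in arrests}
    share = arrests["A"] / sum(arrests.values())
    print(f"year {year}: hot spot = {hot_spot}, "
          f"area A accounts for {share:.0%} of recorded arrests")
```

Under these assumed numbers, area A starts with 60% of recorded arrests and is locked in at 80% from the first year onwards, while area B is permanently underpoliced in the model despite having the same true crime rate.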
How Data Can Map and Make Racial Inequality More Visible (If Done Responsibly) | by The GovLab | Data Stewards Network | Medium
Racism is a systemic issue that pervades every aspect of life in the United States and around the world. In recent months, its corrosive…
·medium.com·
Uber and worker exploitation
“This is what California just voted to protect: multibillion-dollar tech companies legally paying people $2.50 per hour. No benefits, no union, and no power to make their own decisions about the work they do like actual independent contractors. You legalized violating labor law.” – @BethLynch2020, February 19, 2021 (https://t.co/7iD1Z1RMbY)
adeadewunmi·twitter.com·
The Scottish Racism Project
The Scottish Racism Project does two things: (1) takes deep dives into the various ways racism has manifested itself up north, explores courses of action to remedy this, and looks at how BAME communities can empower themselves in the face of adversity; (2) offers and finds solidarity with BAME individuals who have shared real-life personal stories of racism and want the truth of their experiences to be known far and wide, often because the wider Scottish press were uninterested when approached.
·scottish-racism.blogspot.com·
Councils scrapping use of algorithms in benefit and welfare decisions
Call for more transparency on how such tools are used in public services as 20 councils stop using computer algorithms. The Home Office recently stopped using an algorithm to help decide visa applications after allegations that it contained “entrenched racism”.
·theguardian.com·
Black Doctors Work Overtime to Combat Clubhouse Covid Myths
Article about the (unpaid) role that Black doctors are playing on Clubhouse to combat misinformation about the Covid vaccine. The fact that the platform isn't even trying to address the negative externality of misinformation is a perfect example of a negative impact of data and tech. Black people are already at higher risk of being infected by Covid and of dying from it, so the impact of misinformation is disproportionate.
adeadewunmi·bloomberg.com·
Black Lives Matter: How You Can Help in Scotland
As the Black Lives Matter protests rage on in the US, many people in Scotland are wondering what they can do to show solidarity with the movement. Between 2000 and 2013, there were more race-related murders per capita in Scotland than in the rest of the UK – 1.8 murders per million, compared to 1.3, according to the Coalition for Racial Equality and Rights (CRER).
·sundaypost.com·
Meaningful Transparency and (in)visible Algorithms
Can transparency bring accountability to public-sector algorithmic decision-making (ADM) systems? High-profile retractions have taken place amid a shift in public sentiment towards greater scepticism and mistrust of ‘black box’ technologies, evidenced in increasing awareness of the possible risks to citizens of potentially invasive profiling.
·adalovelaceinstitute.org·
Police Built an AI To Predict Violent Crime. It Was Seriously Flawed
A Home Office-funded project that used artificial intelligence to predict gun and knife crime was found to be wildly inaccurate. “Basing our arguments on inaccuracy is problematic because the tech deficiencies are solvable through time. Even if the algorithm was set to be 100 percent accurate, there would still be bias in this system.”
·wired.co.uk·
Algorithmic Colonisation of Africa - Abeba Birhane
Colonialism in the age of Artificial Intelligence takes the form of “state-of-the-art algorithms” and “AI driven solutions” unsuited to African problems, and hinders the development of local products, leaving the continent dependent on Western software and infrastructure.
·theelephant.info·
The Dark Side of Digitisation and the Dangers of Algorithmic Decision-Making - Abeba Birhane
As we hand over decision-making regarding social issues to automated systems developed by profit-driven corporates, not only are we allowing our social concerns to be dictated by the profit incentive, but we are also handing over moral and ethical questions to the corporate world, argues Abeba Birhane.
·theelephant.info·
#BlackLivesMatter Brand Responses
As protests and unrest took over the U.S. and other parts of the world, brands rushed to speak out and align themselves with anti-racism. We’ve gathered 100+ examples of brands responding to the Black Lives Matter movement. We’ve highlighted their responses, their actions and some reactions. Use this link to share this deck: https://bit.ly/2N9AV4O
·docs.google.com·
Whats really changed? – Cubicgarden.com…
It’s coming up to 6 months since George Floyd was murdered by the Minneapolis police. One of the things I am planning is a look at all those pledges by companies to make a change, to see if they actually did what they pledged. Part of my work is to extract the data from this amazing presentation and put it into a form where others can add to it – likely an Airtable, multiple Google Sheets or GitHub somehow?
·cubicgarden.com·
History is key to understanding vaccine hesitancy in people of colour - £paywall
Distrust has its roots in ‘scientific’ experiments that aimed to prove racial superiority. As the rollout of the coronavirus vaccine continues across the UK, with more than 14m already having had the jab, deep concern over its safety remains in communities of colour. Polls show that people of colour are around 20 per cent less likely to have a vaccination than the population as a whole.
·on.ft.com·
September Report - Black student participation.pdf
This report has been developed to provide context and understanding of why Black Geographers has emerged as a necessary intervention, and what disparities the organisation aims to address going forward. There is a lack of research aimed at Black students within geography, so we're creating our own and giving Black students a platform to speak up and be heard.
·up.raindrop.io·