What do I mean by a cock-up? Well, @CDEIUK more tactfully call them “faulty or biased systems”. They include:
📌 Black women being twice as likely to have their passport photos rejected by the Home Office.
📌 The A-level saga
📌 The London Gangs Matrix
— Rachel Coldicutt (@rachelcoldicutt) November 26, 2020
Study finds gender and skin-type bias in commercial artificial-intelligence systems
A new paper from the MIT Media Lab’s Joy Buolamwini shows that three commercial facial-analysis programs demonstrate gender and skin-type biases, and suggests a new, more accurate method for evaluating the performance of such machine-learning systems.
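For illustration only, and not taken from the study itself: evaluation of this kind typically means reporting error rates separately for each intersection of gender and skin type, rather than a single aggregate accuracy. A minimal sketch, with hypothetical record fields, might look like this:

```python
from collections import defaultdict

def error_rates_by_subgroup(records):
    """Compute per-subgroup error rates for a gender classifier.

    `records` is an iterable of dicts with hypothetical keys:
    'gender' (true label), 'skin_type' (e.g. a Fitzpatrick band),
    and 'predicted_gender' (the system's output).
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for r in records:
        key = (r["gender"], r["skin_type"])
        totals[key] += 1
        if r["predicted_gender"] != r["gender"]:
            errors[key] += 1
    # A single aggregate accuracy can hide large gaps between subgroups,
    # which is what disaggregated reporting is meant to surface.
    return {key: errors[key] / totals[key] for key in totals}

# Toy usage:
sample = [
    {"gender": "female", "skin_type": "V-VI", "predicted_gender": "male"},
    {"gender": "female", "skin_type": "I-II", "predicted_gender": "female"},
    {"gender": "male", "skin_type": "I-II", "predicted_gender": "male"},
]
print(error_rates_by_subgroup(sample))
```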
Black Tech Employees Rebel Against ‘Diversity Theater’
Companies pledged money and support for people of color. But some say they still face a hostile work environment for speaking out or simply doing their jobs.
Artificial Intelligence in Hiring: Assessing Impacts on Equality
The use of artificial intelligence (AI) presents risks to equality, potentially embedding bias and discrimination. Auditing tools are often promised as a solution. However, our new research, which examines tools for auditing AI used in recruitment, finds that these tools are often inadequate for ensuring compliance with UK Equality Law, good governance and best practice.
We argue in this report that a more comprehensive approach than technical auditing is needed to safeguard equality in the use of AI for hiring, which shapes access to work. Here, we present first steps which could be taken to achieve this. We also publish a prototype AI Equality Impact Assessment which we plan to develop and pilot.
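As a rough illustration of what a purely technical audit might check, and why the report argues this is not sufficient on its own, here is a minimal sketch of one common disparity metric: the ratio of selection rates between groups. The function and the toy data are illustrative assumptions, not drawn from the report or from UK Equality Law.

```python
def selection_rate_ratio(outcomes_by_group):
    """Compare a hiring tool's selection rates across groups.

    `outcomes_by_group` maps a group label to a list of 0/1 outcomes
    (1 = candidate advanced by the tool). Returns the ratio of the
    lowest group selection rate to the highest. A low ratio flags a
    disparity worth investigating, but by itself says nothing about
    legal compliance, job-relatedness, or the wider organisational
    context an equality impact assessment would need to cover.
    """
    rates = {
        group: sum(outcomes) / len(outcomes)
        for group, outcomes in outcomes_by_group.items()
        if outcomes  # skip empty groups to avoid division by zero
    }
    return min(rates.values()) / max(rates.values())

# Toy usage:
print(selection_rate_ratio({
    "group_a": [1, 1, 0, 1, 0],  # 60% selected
    "group_b": [1, 0, 0, 0, 0],  # 20% selected
}))  # -> 0.333...
```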