Rachel Coldicutt on Twitter
What do I mean by a cock-up? Well, @CDEIUK more tactfully call them “faulty or biased systems”. They include:
📌 Black women being twice as likely to have their passport photos rejected by the Home Office.
📌 The A-level saga
📌 The London Gangs Matrix
— Rachel Coldicutt (@rachelcoldicutt), November 26, 2020
twitter.com
Study finds gender and skin-type bias in commercial artificial-intelligence systems
A new paper from the MIT Media Lab’s Joy Buolamwini shows that three commercial facial-analysis programs demonstrate gender and skin-type biases, and suggests a new, more accurate method for evaluating the performance of such machine-learning systems.
news.mit.edu
Artificial Intelligence in Hiring: Assessing Impacts on Equality
The use of artificial intelligence (AI) presents risks to equality, potentially embedding bias and discrimination. Auditing tools are often promised as a solution. However, our new research, which examines tools for auditing AI used in recruitment, finds that these tools are often inadequate for ensuring compliance with UK equality law, good governance, and best practice. We argue in this report that a more comprehensive approach than technical auditing is needed to safeguard equality in the use of AI for hiring, which shapes access to work. Here, we present first steps that could be taken to achieve this. We also publish a prototype AI Equality Impact Assessment, which we plan to develop and pilot.
up.raindrop.io