Air Canada must pay damages after chatbot lies to grieving passenger about discount
What's in a Name? Auditing Large Language Models for Race and Gender Bias
An audit design for investigating biases in state-of-the-art large language models, including GPT-4. The study found that the models' advice systematically disadvantages names commonly associated with racial minorities and women, with names associated with Black women receiving the least advantageous outcomes. The findings underscore the importance of conducting audits at the point of LLM deployment and implementation to mitigate their potential for harm against marginalized communities.
Judge allows Workday AI bias lawsuit to proceed as collective action
The High Stakes of Tracking Menstruation
"This report represents a timely intervention into the role and governance of menstrual tracking apps. Taking a data justice approach, it critiques society’s poor understanding of what is at stake in tracking menstruation, maps the existing and future harms of tracking menstrual cycles by private companies for both individuals and society, and suggests ways to mitigate these risks within the wider context of menstrual and reproductive health struggles."
Inside Amsterdam’s high-stakes experiment to create fair welfare AI
Resources for Primary Teachers: Children's Rights & AI: Teaching Pack — Scottish AI Alliance
"Children’s Parliament, the Scottish AI Alliance and The Alan Turing Institute are excited to share a free resource pack designed for teachers of primary-aged children (P5–P7). Developed with children and educators across Scotland, the pack introduces Artificial Intelligence (AI) in a way that’s practical, ethical and age-appropriate, with children’s human rights at its core.
Featuring six ready-to-use lesson plans, real-life scenarios, and animated videos created by children, the pack supports teachers to explore AI in fun, thoughtful and meaningful ways."
LSE: children's online rights resource
Digital Futures for Children brief: Enforcement action improves privacy for children in education, but more is needed
An analysis of changes to policies and practice in Google's Workspace for Education
Edtech market in England
A report on the EdTech market in England, published in 2022
Stop and Search
The dashboard is designed to make stop and search data, and the insights drawn from it, more accessible to researchers and the general public.
About EdTech Equity
Last updated in June 2020, this resource supported schools and companies in considering racial equity in their EdTech projects
SDPC Resource Registry
A membership organisation of US-based educators, from schools to councils, that develops tools and resources for ensuring student data privacy
AI inequalities at work
The report draws on studies from across the world to review how AI impacts five categories of workers: age, women, disability, ethnicity, and minority language speakers.
Kenya's High Court has ruled that it has jurisdiction over a case against Meta
Meta is accused of promoting content that led to ethnic violence and killings in Ethiopia from 2020 to 2022
We're all now in a battle for drinking water with H2O-guzzling data centres – and we're losing – Foxglove
23andMe files for bankruptcy protection
23andMe, the US-based direct-to-consumer genetic-testing company, has filed for bankruptcy.
UK Home Office silent on alleged Apple backdoor order
The UK's Home Office refuses to either confirm or deny reports that it recently ordered Apple to create a backdoor allowing the government to access any user's cloud data.
Digital policing toolkit - Weaving Liberation
This community-centred and informed toolkit, written by Zara Manoehoetoe, offers an overview of what digital policing is and what it aims to do. It also showcases examples of how we can resist, and the importance of building collective, international, cross-movement solidarity.
Over half of all facial recognition deployments last year took place in areas with higher numbers of Black residents
The Voice newspaper reported findings from City Hall Greens researchers that, last year, over half of facial recognition deployments took place in areas with larger Black populations.
Call to shut down Bristol schools’ use of app to ‘monitor’ pupils and families
Think Family Education app gives safeguarding leads easy access to pupils’ and relatives’ contacts with police and child protection
A learning curve?
The Ada Lovelace Institute's report examining AI use, guidance and efficacy in the UK education system
Protected Characteristics in Practice – The ODI
How to make a chatbot that isn’t racist or sexist | MIT Technology Review
Monitoring Equality in Digital Public Services – The ODI (report PDF)
Black Lives Matter: Why forming 'diversity' committees with Black staff, who have no expertise in D&I or Anti-Racism, is tokenising. | LinkedIn
The dividing line: how we represent race in data – The ODI
Co-op is using facial recognition tech to scan and track shoppers | WIRED UK
Facial recognition use by South Wales Police ruled unlawful - BBC News
Why we need to act on racial bias in AI - We And AI
Black women in the UK four times more likely to die in pregnancy or childbirth