[1910.06144] What does it mean to solve the problem of discrimination in hiring? Social, technical and legal perspectives from the UK on automated hiring systems
The ability to get and keep a job is a key aspect of participating in society and sustaining livelihoods. Yet the ways decisions are made about who is eligible for jobs, and why, are rapidly changing...
Why we need to act on racial bias in AI - We And AI
Inaction makes you part of the problem.
§
In the UK, as protesters fill Hyde Park today, many are aware that while systemic racism may not result in as many horrific deaths of black people at the hands of police as in the US, racism is still pervasive within our society and politics.
Co-op is using facial recognition tech to scan and track shoppers | WIRED UK
Branches of Co-op in the south of England have been using real-time facial recognition cameras to scan shoppers entering stores.
§
See US-based studies on facial recognition technology (FRT), which show the technology can be unreliable for black people, especially black women.
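As a rough illustration of how those studies quantify "unreliable", here is a minimal sketch of disaggregated evaluation: error rates computed per demographic group rather than as a single aggregate figure. The groups, records, and numbers below are entirely hypothetical, not drawn from any real benchmark.

```python
# Minimal sketch of disaggregated evaluation: report error rates per
# demographic group instead of one aggregate number. All data is hypothetical.
from collections import defaultdict

# Each record: (demographic_group, ground_truth_match, system_said_match)
results = [
    ("black_women", True, False),   # false non-match
    ("black_women", True, True),
    ("black_men",   False, True),   # false match
    ("black_men",   True, True),
    ("white_men",   True, True),
    ("white_men",   False, False),
]

tallies = defaultdict(lambda: {"trials": 0, "errors": 0})
for group, truth, predicted in results:
    tallies[group]["trials"] += 1
    if truth != predicted:
        tallies[group]["errors"] += 1

for group, stats in tallies.items():
    rate = stats["errors"] / stats["trials"]
    print(f"{group}: error rate {rate:.0%} over {stats['trials']} trials")
```

An aggregate accuracy figure over all six records here would hide the fact that every error falls on one group or another; the per-group breakdown is what makes the disparity visible.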
The dividing line: how we represent race in data – The ODI
The point of this essay is to encourage a critical approach to the relationship between race and data. It points to three questions that anyone working with data should ask if they are going to be collecting and using data about race.
§
If we are not careful, data can divide and sort us in exactly the sort of essentialising ways that the colonial idea of race supported. But if we ask the right questions, and know our history, we can use data to advocate for racial justice.
Black Lives Matter: Why forming 'diversity' committees with Black staff, who have no expertise in D&I or Anti-Racism, is tokenising. | LinkedIn
Black people are experts in their lived experience of racism. That does not automatically mean Black people are experts in Diversity and Inclusion (D&I) or Anti-Racism.
§
Having a corridor chat with a random member of staff who happens to be Black to pick their brains about D&I and Anti-Racism policy is not strategic. It's tokenism. And tokenism is a byproduct of racism.
Many of the public and private services we use are now digital. The move to digital is likely to increase as technology becomes more embedded in our lives. But what does this mean for how essential public services understand who is using, or indeed not using, them and why?
§
Data about the protected characteristics of people using these services isn’t currently collected and statistics aren’t published in a consistent or collective way. This means it is harder to find out who is excluded from using these services and why.
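One way to make such statistics consistent and collective is to report every service's usage against one shared, harmonised category list, reporting zeros for empty groups so that who is *not* using a service stays visible. The sketch below is illustrative only; the categories and records are placeholders, not an official standard.

```python
# Illustrative sketch: usage counts published against a single harmonised
# category list, so figures are comparable across services. The categories
# and records are placeholders, not an official classification.
from collections import Counter

HARMONISED_ETHNIC_GROUPS = [
    "asian", "black", "mixed", "white", "other", "prefer_not_to_say",
]

def usage_report(user_records):
    """Count users per harmonised group, including zero counts so that
    gaps in who is using the service remain visible."""
    counts = Counter(record["ethnic_group"] for record in user_records)
    return {group: counts.get(group, 0) for group in HARMONISED_ETHNIC_GROUPS}

users = [
    {"ethnic_group": "white"},
    {"ethnic_group": "black"},
    {"ethnic_group": "prefer_not_to_say"},
]
print(usage_report(users))
```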
How to make a chatbot that isn’t racist or sexist | MIT Technology Review
Tools like GPT-3 are stunningly good, but they feed on the cesspits of the internet. How can we make them safe for the public to actually use?
§
Sometimes, reckoning with the effects of biased training data means realizing that the app shouldn't be built: that without human supervision, there is no way to stop it from saying harmful things to its users, and that it is unacceptable to let it do so.
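The shape of the safeguard the article argues for can be sketched very simply: nothing the model generates reaches a user until it passes a check, and anything flagged falls back to a canned reply. In this minimal sketch the keyword blocklist stands in for a real toxicity classifier (which keyword lists cannot replace), and `generate_reply` is a placeholder for an actual language-model call.

```python
# Naive sketch of gating model output before it reaches a user.
BLOCKLIST = {"slur1", "slur2"}  # placeholder terms, not a real list
FALLBACK = "Sorry, I can't respond to that. A human will follow up."

def generate_reply(prompt: str) -> str:
    # Placeholder: imagine a call to a large language model here.
    return f"model output for: {prompt}"

def is_safe(text: str) -> bool:
    # Stand-in check; a production system would use a trained toxicity
    # classifier and/or human review, since keyword lists miss most
    # harmful output.
    words = set(text.lower().split())
    return words.isdisjoint(BLOCKLIST)

def safe_reply(prompt: str) -> str:
    candidate = generate_reply(prompt)
    return candidate if is_safe(candidate) else FALLBACK

print(safe_reply("hello"))
```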