Data, Tech & Black Communities

406 bookmarks
[1910.06144] What does it mean to solve the problem of discrimination in hiring? Social, technical and legal perspectives from the UK on automated hiring systems
The ability to get and keep a job is a key aspect of participating in society and sustaining livelihoods. Yet the way decisions are made on who is eligible for jobs, and why, is rapidly changing...
·arxiv.org·
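
The paper examines automated hiring from social, technical and legal angles. Purely as an illustrative sketch of one kind of technical audit such systems attract, the snippet below computes a selection-rate ratio between groups, in the spirit of the US "four-fifths rule" (the paper itself focuses on the UK); the data, group names and threshold are assumptions, not anything from the paper.

```python
# Minimal sketch: adverse-impact check on hiring decisions, disaggregated by
# group. The records below are invented; real audits need far more care
# (intersectional groups, confidence intervals, legal context).

from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, was_hired) pairs -> {group: rate}."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, was_hired in decisions:
        total[group] += 1
        hired[group] += int(was_hired)
    return {g: hired[g] / total[g] for g in total}

def adverse_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest.
    US EEOC guidance flags ratios below 0.8 (the 'four-fifths rule')."""
    return min(rates.values()) / max(rates.values())

decisions = (
      [("group_a", True)] * 40 + [("group_a", False)] * 60
    + [("group_b", True)] * 25 + [("group_b", False)] * 75
)

rates = selection_rates(decisions)
print(rates)                                        # {'group_a': 0.4, 'group_b': 0.25}
print(f"impact ratio: {adverse_impact_ratio(rates):.2f}")  # 0.62 -- below 0.8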
How to make a chatbot that isn’t racist or sexist | MIT Technology Review
Tools like GPT-3 are stunningly good, but they feed on the cesspits of the internet. How can we make them safe for the public to actually use? § Sometimes, reckoning with the effects of biased training data means realizing that the app shouldn't be built: without human supervision there is no way to stop it from saying problematic things to its users, and it is unacceptable to let it do so.
·technologyreview.com·
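
The annotation's point about human supervision implies an architecture where uncertain outputs are held back for review. As a sketch of that idea only, and not of any system discussed in the article, the snippet below gates a model's reply behind a safety score and routes borderline cases to a human queue; `generate_reply` and `toxicity_score` are hypothetical stand-ins.

```python
# Sketch of a human-in-the-loop moderation gate (assumed architecture).
# A reply is only shown automatically when a safety classifier is confident;
# uncertain cases wait for a human decision.

from queue import Queue

review_queue: Queue = Queue()  # humans drain this and approve/reject

def toxicity_score(text: str) -> float:
    """Hypothetical classifier returning 0.0 (benign) .. 1.0 (toxic).
    In practice this would be a trained model, itself imperfect."""
    blocked_terms = {"slur1", "slur2"}  # placeholder; real systems go far beyond word lists
    return 1.0 if any(t in text.lower() for t in blocked_terms) else 0.1

def respond(user_message: str, generate_reply):
    reply = generate_reply(user_message)
    score = toxicity_score(reply)
    if score < 0.2:          # confidently safe: show immediately
        return reply
    if score > 0.8:          # confidently unsafe: never show
        return None
    review_queue.put(reply)  # uncertain: a human decides before anything is shown
    return None
```

The classifier itself inherits the biases the article describes, which is why the conservative middle band defers to a person rather than to the model.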
OPEN-ODI-2020-01_Monitoring-Equality-in-Digital-Public-Services-report-1.pdf
Many of the public and private services we use are now digital. The move to digital is likely to increase as technology becomes more embedded in our lives. But what does this mean for how essential public services understand who is using, or indeed not using, them and why? § Data about the protected characteristics of people using these services isn't currently collected, and statistics aren't published in a consistent or collective way. This means it is harder to find out who is excluded from using these services and why.
·up.raindrop.io·
Co-op is using facial recognition tech to scan and track shoppers | WIRED UK
Branches of Co-op in the south of England have been using real-time facial recognition cameras to scan shoppers entering stores. § See US-based studies on FRT, which show the technology can be unreliable for Black people, especially Black women.
·wired.co.uk·
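
The US studies referenced report error rates broken out by demographic group rather than a single aggregate figure. As a hedged sketch of why that matters, the snippet below computes false match rates per group from labeled verification trials; the numbers are invented, chosen only to show how an aggregate can hide a large per-group disparity.

```python
# Sketch: disaggregated evaluation of a face-verification system.
# Each trial is (group, same_person, system_said_match). An aggregate error
# rate can look fine while one group's rate is ten times higher.

from collections import defaultdict

def false_match_rates(trials):
    """False match rate per group: fraction of different-person pairs
    that the system wrongly declared a match."""
    errors = defaultdict(int)
    impostor_pairs = defaultdict(int)
    for group, same_person, said_match in trials:
        if not same_person:                 # only different-person pairs count
            impostor_pairs[group] += 1
            errors[group] += int(said_match)
    return {g: errors[g] / impostor_pairs[g] for g in impostor_pairs}

# Invented numbers, shaped like the kind of disparity the studies describe:
trials = (
      [("group_a", False, True)] * 1  + [("group_a", False, False)] * 999
    + [("group_b", False, True)] * 10 + [("group_b", False, False)] * 990
)
print(false_match_rates(trials))  # {'group_a': 0.001, 'group_b': 0.01}
```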
The dividing line: how we represent race in data – The ODI
The point of this essay is to encourage a critical approach to the relationship between race and data. It points to three questions that anyone working with data should ask if they are going to be collecting and using data about race. § If we are not careful, data can divide and sort us in exactly the sort of essentialising ways that the colonial idea of race supported. But if researchers ask the right questions, and know their history, we can use data to advocate for racial justice.
·theodi.org·
Black Lives Matter: Why forming 'diversity' committees with Black staff, who have no expertise in D&I or Anti-Racism, is tokenising. | LinkedIn
Black people are experts in their lived experience of racism. That does not automatically mean Black people are experts in Diversity and Inclusion (D&I) or Anti-Racism. § Having a corridor chat with a random member of staff who happens to be Black to pick their brains about D&I and Anti-Racism policy is not strategic. It's tokenism. And tokenism is a byproduct of racism.
·linkedin.com·
Protected Characteristics in Practice – The ODI
Many of the public and private services we use are now digital. The move to digital is likely to increase as technology becomes more embedded in our lives. But what does this mean for how essential public services understand who is using, or indeed not using, them and why? § Data about the protected characteristics of people using these services isn't currently collected, and statistics aren't published in a consistent or collective way. This means it is harder to find out who is excluded from using these services and why.
·theodi.org·
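
The ODI's argument is that services should publish consistent statistics about who uses them without exposing individuals. Purely as an illustrative sketch of one common disclosure-control step, the snippet below aggregates usage counts by a protected characteristic and suppresses small cells before publication; the field names, categories and threshold are assumptions, not the report's recommendations.

```python
# Sketch: publishing service-usage statistics by a protected characteristic,
# with small cells suppressed so individuals can't be re-identified.
# Field names, categories and the threshold of 10 are illustrative assumptions.

from collections import Counter

SUPPRESSION_THRESHOLD = 10  # cells smaller than this are withheld

def usage_by_characteristic(records, characteristic):
    """records: list of dicts, e.g. {'ethnicity': 'black_british', ...}.
    Returns publishable counts, with small cells replaced by None."""
    counts = Counter(r.get(characteristic, "not_stated") for r in records)
    return {
        category: (n if n >= SUPPRESSION_THRESHOLD else None)  # None = suppressed
        for category, n in counts.items()
    }

records = (
      [{"ethnicity": "black_british"}] * 42
    + [{"ethnicity": "white_british"}] * 180
    + [{"ethnicity": "mixed"}] * 4   # small cell: will be suppressed
)

print(usage_by_characteristic(records, "ethnicity"))
# {'black_british': 42, 'white_british': 180, 'mixed': None}
```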