Data, Tech & Black Communities

406 bookmarks
Technology in education
Technology is reshaping our increasingly interconnected world. The use of technology in education has been growing in recent years and saw an immense expansion during the Covid-19 pandemic, as school closures prompted a mass turn to online platforms. Technology in education - often described as EdTech - takes many forms.
tgyateng69·right-to-education.org·
Seven intersectional feminist principles for equitable and actionable COVID-19 data
This essay offers seven intersectional feminist principles for equitable and actionable COVID-19 data, drawing from the authors' prior work on data feminism. Our book, Data Feminism (D'Ignazio and Klein, 2020), offers seven principles which suggest possible points of entry for challenging and changing power imbalances in data science. In this essay, we offer seven sets of examples, one inspired by each of our principles, both for identifying existing power imbalances with respect to the impact of the novel coronavirus and the response to it, and for beginning the work of change.
tgyateng69·journals.sagepub.com·
New Research Report: Civic Participation in the Datafied Society
Although data systems are being rapidly rolled out in the public sector and state-citizen relations are becoming increasingly automated, there is little public knowledge about these developments and few possibilities to intervene in, and participate in, decisions about the use of data systems. How, then, do we advance people’s voice in the deployment of data and AI? How do we democratise datafied governance?
tgyateng69·datajusticelab.org·
Surfacing Systemic (In)justices: A community view
Surfacing Systemic (In)justices: A Community View shares findings from an extensive Europe-wide consultation undertaken by Systemic Justice that seeks to learn from the perspectives and experiences of affected community groups and organisations, in order to inform potential litigation and other strategies for change.
tgyateng69·systemicjustice.ngo·
Black Twitter & Mastodon
Sahdya Darr on Twitter. As many folks leave Twitter for Mastodon, I want to draw attention to thoughts being shared and discussions taking place about Elon Musk’s purchase of Twitter beyond white, left leaning accounts 👇🏼 — Sahdya Darr (@sahdyadarr) November 9, 2022
tgyateng69·twitter.com·
Decolonising EdTech: A resource list for tackling coloniality and digital neocolonialism in EdTech - EdTech Hub
At EdTech Hub, we’ve been reflecting on how coloniality is embedded in the work we do: from the colonial roots of the international development sector, to colonial practices embedded in research methods, to “core-to-periphery” design and deployment of EdTech interventions. We’ve just begun this journey, but in trying to embody one of our EdTech Hub values of ‘fearless and humble…
tgyateng69·edtechhub.org·
Lauren Bridges on Twitter
I’m thrilled to share this new creative & educational video, “Toxic Clouds and Dirty Data” on the environmental impacts of data centers made in collaboration with the talented @adamcooperteran w/ support from @KleinmanEnergy and @internetsociety https://t.co/vLuesFXHwS — Lauren Bridges (@Laurenebridges) October 12, 2022
tgyateng69·twitter.com·
Toxic Clouds and Dirty Data
Energy policy research from the University of Pennsylvania. A project from Kleinman Center grantee Lauren Bridges, made in collaboration with artist and media-maker Adam Cooper-Terán, explores why we need to rethink the energy-, water-, and waste-intensive model of cloud computing.
tgyateng69·kleinmanenergy.upenn.edu·
The future of AI and education: Some cautionary notes
In light of fast-growing popular, political and professional discourses around AI in education, this article outlines five broad areas of contention that merit closer attention in future discussion and decision-making. These include: (1) taking care to focus on issues relating to 'actually existing' AI rather than the overselling of speculative AI technologies; (2) clearly foregrounding the limitations of AI in terms of modelling social contexts, and simulating human intelligence, reckoning, autonomy and emotions; (3) foregrounding the social harms associated with AI use; (4) acknowledging the value-driven nature of claims around AI; and (5) paying closer attention to the environmental and ecological sustainability of continued AI development and implementation. Thus, in contrast to popular notions of AI as a neutral tool, the argument is made for engaging with the ongoing use of AI in education as a political action that has varying impacts on different groups of people in various educational contexts.
tgyateng69·onlinelibrary.wiley.com·
End-User Audits: A System Empowering Communities to Lead Large-Scale Investigations of Harmful Algorithmic Behavior
Today, technical experts hold the tools to conduct system-scale algorithm audits, so they largely decide what algorithmic harms are surfaced. Our #cscw2022 paper asks: how could *everyday users* explore where a system disagrees with their perspectives?
tgyateng69·hci.stanford.edu·
Chanda Prescod-Weinstein on Twitter
A BIG PROBLEM with the fediverse is that there’s no mechanism for call and response — quote tweeting — and this is purposeful. This is going to be an issue with trying to port Black Twitter, which is prob one reason @shengokai says BT is non-fungible. https://t.co/dNx3NFHssc — Chanda Prescod-Weinstein (@IBJIYONGI) November 7, 2022
tgyateng69·twitter.com·