Found 430 bookmarks
How AI Cheapens Design (At a Great Ecological Cost)
It doesn’t come as much of a surprise to learn that, environmentally speaking, AI is an extremely wasteful and destructive technology. It’s rare that you can nail down the ecological cost of an image. Still, researchers have assessed that generating a single AI image requires the same amount of energy needed to fully charge your iPhone […]
Stochastic Parrots Reading / Viewing List
🦜Stochastic Parrots Day Reading List🦜 On March 17, 2023, Stochastic Parrots Day, organized by T Gebru, M Mitchell, and E Bender and hosted by The Distributed AI Research Institute (DAIR), was held online to commemorate the 2nd anniversary of the paper’s publication. Below are the readings which po...
What Tech Calls Thinking
A New York Times Book Review Editors' Choice. "In Daub’s hands the founding concepts of Silicon Valley don’t make money; they fall apart." --The New York T...
Waag | Dutch population sets priorities for AI research agenda
Survey of Dutch opinions on AI: 58% of the Dutch population considers the theme of "fake news, fake photos and polarisation" crucial when it comes to the development of artificial intelligence (AI) and research into it.
The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence | First Monday
The stated goal of many organizations in the field of artificial intelligence (AI) is to develop artificial general intelligence (AGI), an imagined system with more intelligence than anything we have ever seen. Without seriously questioning whether such a system can and should be built, researchers are working to create “safe AGI” that is “beneficial for all of humanity.” We argue that, unlike systems with specific applications which can be evaluated following standard engineering principles, undefined systems like “AGI” cannot be appropriately tested for safety. Why, then, is building AGI often framed as an unquestioned goal in the field of AI? In this paper, we argue that the normative framework that motivates much of this goal is rooted in the Anglo-American eugenics tradition of the twentieth century. As a result, many of the very same discriminatory attitudes that animated eugenicists in the past (e.g., racism, xenophobia, classism, ableism, and sexism) remain widespread within the movement to build AGI, resulting in systems that harm marginalized groups and centralize power, while using the language of “safety” and “benefiting humanity” to evade accountability. We conclude by urging researchers to work on defined tasks for which we can develop safety protocols, rather than attempting to build a presumably all-knowing system such as AGI.
The TESCREAL Bundle | DAIR
The Distributed AI Research Institute is a space for independent, community-rooted AI research, free from Big Tech’s pervasive influence.
Ecosystem - Future Art Ecosystems 4: Art x Public AI
Future Art Ecosystems 4: Art x Public AI provides analyses, concepts and strategies for responding to the transformations that AI systems are bringing to culture and society.
Artifice and Intelligence
Emily Tucker, Executive Director of the Center on Privacy & Technology at Georgetown Law, on tech, vocabulary and power.
Future Art Ecosystems 4: Art x Public AI
Future Art Ecosystems 4: Art x Public AI provides analyses, concepts and strategies for responding to the transformations that AI systems are bringing to culture and society.
A Roadmap to Democratic AI - 2024 — The Collective Intelligence Project
We are launching a "Roadmap to Democratic AI" outlining paths towards greater collective stewardship and better distribution of AI's benefits. Our roadmap outlines concrete steps that can be taken in 2024 to build a more democratic AI ecosystem that is adaptive, accountable, processes dece...
Measuring Diversity
Search results that reflect historic inequities can amplify stereotypes and perpetuate under-representation. Carefully measuring diversity in data sets can help.