Experiential AI
Experiential AI is proposed as a new research agenda in which artists and scientists come together to dispel the mystery of algorithms and make their mechanisms vividly apparent. It addresses the challenge of finding novel ways of opening up the field of artificial intelligence to greater transparency and collaboration between human and machine. The hypothesis is that art can mediate between computer code and human comprehension to overcome the limitations of explanations in and for AI systems. Artists can make the boundaries of systems visible and offer novel ways to make the reasoning of AI transparent and decipherable. Beyond this, artistic practice can explore new configurations of humans and algorithms, mapping the terrain of inter-agencies between people and machines. This helps us viscerally understand the complex causal chains in environments with AI components, including questions about what data to collect or whom to collect it about, how the algorithms are chosen, commissioned and configured, or how humans are conditioned by their participation in algorithmic processes.
Quasi | Home
Quasi Market | The First Ever AI Marketplace
Fellowship Open Climate
Call for fellows 2023, open until December 11, 2022. Open Climate seeks seven (7) mid-career professionals for the 2023 Open Climate Fellowship Program. We welcome applicants from backgrounds across …
Call for Papers — Desirable AI
The aim of this conference is to interrogate how an intercultural approach to ethics can inform the processes of conceiving, designing, and regulating artificial intelligence (AI).
Dall e 2
alternatives to dall-e 2
HOLO 3: Mirror Stage
Nora N. Khan assembles a cast of luminaries to consider the far-reaching implications of AI and computational culture. –$40
Tools to Improve Training Data - Talking Language AI Ep#2
Vincent Warmerdam builds a lot of NLP tools (https://github.com/koaning). Many of these tools target the scikit-learn ecosystem, and there is a theme of labeling across many of them. A recent focus of his stack of tools is improving training data. In this video, Vincent and Jay discuss a few of these tools and show how they work together.

Tools discussed in the video:
- human-learn: a toolkit to build human-based scikit-learn components. "Natural Intelligence is still a pretty good idea." https://koaning.github.io/human-learn/index.html | https://github.com/koaning/human-learn/
- doubtlab: a toolkit to help find doubtful labels in data. "Doubt your data, find bad labels." https://koaning.github.io/doubtlab/ | https://github.com/koaning/doubtlab
- embetter: a library that makes it very easy to use embeddings in scikit-learn. "Just a bunch of useful embeddings." https://github.com/koaning/embetter
- bulk: a simple bulk labelling tool that uses embeddings to leverage bulk labeling. https://github.com/koaning/bulk

The talk includes live demos of each tool and shows how some simple tricks can go a long way.

Links:
- Join the Cohere Discord: https://discord.gg/co-mmunity
- Discussion thread for this episode (feel free to ask questions): https://discord.com/channels/954421988141711382/1042163984817721527
- Vincent on Twitter: https://twitter.com/fishnets88
- Calmcode ("Code. Simply. Clearly. Calmly."), video tutorials for modern ideas and open source tools: https://calmcode.io/

About the speaker: Vincent has worked as an engineer, consultant, researcher, team lead, and educator. He currently works as a Machine Learning Engineer at Explosion, the company behind spaCy and Prodi.gy. In addition to his work at Explosion, he maintains many scikit-learn-related plugins as well as a popular learning resource at calmcode.io. He is also a frequent speaker at conferences, where he defends common sense over the hype in ML.

Contents:
0:00 Introduction
3:06 Tools for Data Quality
9:18 human-learn: Natural Intelligence is still a pretty good idea
12:28 human-learn: demo
27:11 doubtlab: Doubt your data, find bad labels
42:35 embetter: just a bunch of useful embeddings
46:16 embetter: demo
58:10 bulk: A Simple Bulk Labelling Tool
1:00:20 bulk demo: exploring text data
1:10:47 bulk demo: exploring images
1:16:20 Why use the scikit-learn API? What are the benefits and limitations?
1:17:22 Programmer productivity tips
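Since the episode centers on spotting bad labels before training, a minimal sketch of the "doubt your data" idea may help. It uses plain scikit-learn rather than doubtlab's own API; the dataset, model choice, and 0.55 confidence threshold are illustrative assumptions, not details taken from the talk.

```python
# Sketch of the "doubt your data" idea in plain scikit-learn (not doubtlab's API).
# Assumptions: a 20-newsgroups-style text dataset, a TF-IDF + logistic regression
# model, and a 0.55 confidence threshold; all are illustrative choices.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

data = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
X, y = data.data, data.target

pipe = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))

# Out-of-fold predicted probabilities, so the model never scores the
# examples it was trained on.
proba = cross_val_predict(pipe, X, y, cv=5, method="predict_proba")
pred = proba.argmax(axis=1)

# Two simple "reasons" to doubt a label: the model disagrees with it,
# or the model is unsure about the example overall.
wrong_pred = pred != y
low_confidence = proba.max(axis=1) < 0.55

# Rank examples by how many reasons fire; these are the labels worth re-checking.
doubt_score = wrong_pred.astype(int) + low_confidence.astype(int)
for i in np.argsort(-doubt_score)[:10]:
    print(f"doubt={doubt_score[i]} label={data.target_names[y[i]]} "
          f"confidence={proba[i].max():.2f}")
```

doubtlab packages checks like these as composable "reasons"; the rows they flag are the ones worth re-labelling by hand, for example with bulk's embedding-based interface.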
Steamship Fellowship for Language AI at Writing Atlas - Airtable
Steamship is building a platform to put Language AI in the hands of all developers. Plympton is partnering with Steamship on a fellowship program to explore the intersection of tech & literature. You’ll join a two-month-long fellowship in which we’ll coach you through building new features for Writing Atlas. We’ve planned each feature for a smooth development experience, and the results will be something you can show off in production.
Open Call – bb15
bb15 art space is seeking proposals for its 2023-2024 program around the open-ended theme of friction. Friction is against sameness, the continuum, the immediate. An emerging friction defeats expectations, questions the agreement and is itchy. Frictioning implies resistance, going against a certain motion. bb15 is looking for projects that bring tension, distortion and counter-current into the flow of things.
Case Studies - Digital Freedom Fund
DFF supports strategic litigation to advance digital rights in Europe by providing financial support for strategic court cases and catalysing collaboration between those working to advance digital rights. Strategic litigation – litigation with broad impact which can bring about legislative or policy change – has proven to be…
Submit your MozFest Session before December 16th
Activists in the internet health movement will gather at MozFest to move the needle in tech, art, social responsibility, ethics, and AI. Will you join us in March 2023?
OP–Z Stable Diffusion — MODEM
A digital extension for the Teenage Engineering OP–Z synthesizer translating music into real-time AI generated imagery.
The Black Technical Object
While machine learning (computer programming designed for taxonomic patterning) offers useful insights into racism and racist behavior, a gap remains in its connection to the racial history of science and the Black lived experience.
Introspectabilia - Illo.tv Experience
This interactive experience asks you 10 questions and helps you find out which of the unnamed digital emotions has the biggest impact on you.