AI_bookmarks

1490 bookmarks
The effect of ChatGPT on students’ learning performance, learning perception, and higher-order thinking: insights from a meta-analysis - Humanities and Social Sciences Communications
·nature.com·
Chegg to lay off 22% of workforce as AI tools shake up edtech industry
Chegg said on Monday it would lay off about 22% of its workforce, or 248 employees, to cut costs and streamline its operations as students increasingly turn to AI-powered tools such as ChatGPT over traditional edtech platforms.
·reuters.com·
China issues guidelines to promote AI education in primary and secondary schools
China’s Ministry of Education (MOE) has recently issued two guidelines to promote artificial intelligence (AI) education in primary and secondary schools by building a tiered, progressive and spiraling general AI education system. The guidelines prohibit primary school students from independently using open-ended content generation tools and ban teachers from using generative AI as a substitute for their core teaching responsibilities.
·globaltimes.cn·
Implicit Bias in Large Language Models: Experimental Proof and Implications for Education
We provide experimental evidence of implicit racial bias in a large language model (specifically ChatGPT) in the context of an authentic educational task and discuss implications for the use of these tools in educational contexts. Specifically, we presented ChatGPT with identical student writing passages alongside various descriptions of student demographics, including race, socioeconomic status, and school type.
·scale.stanford.edu·
Generative AI like ChatGPT is at risk of creating new gender gap at work
Artificial intelligence tools such as ChatGPT can be a boon for productivity, but when it comes to adopting the technology, a significant gender gap exists.
When researchers looked at the demographics, they discovered that women were 16 percentage points less likely than men to use ChatGPT for job tasks, even when comparing workers within the same occupation and with similar job responsibilities.
·cnbc.com·
Agatha Christie, Who Died in 1976, Will See You in Class
An avatar of the long-dead British novelist is “teaching” an online writing course. But do we want to learn from a digital prosthetic built by artificial intelligence?
·nytimes.com·
OpenAI Adds Shopping to ChatGPT in a Challenge to Google
OpenAI is launching a shopping experience inside of ChatGPT, complete with product picks and buy buttons. WIRED spoke with Adam Fry, the company’s search product lead, to ask how it all works.
·wired.com·
Upload and edit your images directly in the Gemini app
We’re rolling out the ability to easily modify both your AI creations and images you upload from your phone or computer.
·blog.google·
Engineered for Attachment: The Hidden Psychology of AI Companions | Punya Mishra's Web
Dishonest Anthropomorphism is about the kinds of design choices made by these companies to leverage our ingrained tendency to attribute human-like qualities to non-human entities. Emulated empathy describes how AI systems intentionally seek to simulate genuine emotional understanding, misleading users about the true nature of the interaction.
·punyamishra.com·
Teaching AI Ethics 2025: Bias
This post opens a nine-article series revisiting the 2023 "Teaching AI Ethics" resources, beginning with bias in generative AI.
·leonfurze.com·
AI Is Not Your Friend
How the “opinionated” chatbots destroyed AI’s potential, and how we can fix it
But the technology has evolved rapidly over the past year or so. Today’s systems can incorporate real-time search and use increasingly sophisticated methods for “grounding”—connecting AI outputs to specific, verifiable knowledge and sourced analysis. They can footnote and cite, pulling in sources and perspectives not just as an afterthought but as part of their exploratory process.
·theatlantic.com·
Stop Calling AI a Tool: It’s Not a Tool
Generative AI doesn’t extend the artist, it replaces the conditions under which art is possible
I’ve received feedback that my 12–18-minute-long deep dives into the intersection of generative AI and art (specifically music) may be too long. So here’s my argument in under 5 minutes.
Medium has recently featured several prominent articles claiming generative AI is just another tool. Two stand out: The Tools Will Change. Your Craft Doesn’t Have To by Agustin Sanchez (currently in the coveted top-right Staff Picks column) and Stop Pretending You Write Alone: AI, Authorship, and the End of the Solo Genius Myth by 404 (featured in a recent newsletter).
I recommend reading these articles (direct links in my citation section below), but let me reprint my response to Sanchez’s piece, which succinctly summarizes my argument for why generative AI is not a tool:
I have to push back very strongly on this claim that generative AI is just another tool. It is absolutely not. This is a category error that leads to a dangerous misreading of what is actually happening.
We need to look at this through a different lens. The key distinction is between tools that extend the body and systems that are designed to replace it.
A camera is a tool. It extends the eye. A paintbrush extends the hand and arm. A guitar extends the internal temporal rhythms of human experience into sound. Drawing on Susanne K. Langer, music objectifies time; it turns inner, lived temporality into something that can be shared.
What these tools have in common is that they are inert until we act through them. They do nothing on their own. They do not have logic or agency. They sit quietly until we pick them up and use them to express something grounded in experience.
AI is not that. AI does not extend the body. It is built to render it obsolete.
AI markets itself as a tool, but it functions like an agent. Generative AI produces material within predefined parameters using massive datasets and its outputs are optimized to capture attention. It is not passive like a camera or a paintbrush. It acts on us.
There is a logic built into it. That logic traces directly to the origins of cybernetics. Early cybernetic systems were not designed to enhance human capacity, they were designed to bypass it. During World War II, anti-aircraft systems used predictive feedback loops to automate tracking and firing. The human was treated as a lagging component in a system that prioritized precision and speed. Eventually, the human was cut out of the process entirely.
This same logic drives today’s AI. These systems are not waiting for intention. They are designed to anticipate and override it. They do not follow our input. They predict it. They shape it. Generative AI does not assist human expression. It replaces the conditions under which expression is even necessary. It does not extend the body. It encodes and replaces it.
You’re not shaping who you are in a new context. You’re accelerating your own obsolescence.
To Wrap This Up
Let’s be very clear about the ground of debate here: it isn’t about whether collaboration with generative AI is ‘real’ or whether technology belongs in art. That’s a diversion and builds a strawman argument.
The question that matters, which I’ve explored extensively in my articles on Medium, is what kind of meaning-making we’re talking about.
Art is not an output problem.
I wrote this on LinkedIn yesterday:
In 25 years as a professional session musician, playing on hundreds of records, I’ve never once heard someone in the studio say, “I wish I could do this faster.” Not once.
That phrase just doesn’t exist in the vocabulary of people who are actually making music.
And yet, especially on LinkedIn, my feed is full of people with music in their titles pushing products designed to speed up and scale the production of music.
But speed and scale are not musical values. They are values of content. Or more precisely: content for the sake of content, and that’s not the same thing as music. It never was.
Art is not a volume game, nor about speed or efficiency. Art is a wager — a grand one undergirded by infinite risk. Art is a symbolic act grounded in time, memory, relation. Tools can express that, but they don’t do the expressing for us.
Generative AI isn’t helping us express more. It’s changing what expression means at the ontological level. It replaces uncertainty — the infinite risk inherent in art — with prediction and a narrowing of future possible paths. It swaps tension and ambiguity for certainty, which can only be manufactured. It strips art of the time it needs to reveal.
Since I personally tarry in the medium of music, I’ll say this plainly:
Music is freedom. Content is compliance.
I call AI-generated art content rather than art because it is produced through logics that are antithetical to the creation of art.
This is why I see the future of music offline and the future of content on Spotify. The same goes for all media produced through these logics, across every associated medium.
·medium.com·