AI

946 bookmarks
AI has a climate problem — but so does all of tech — Decoder with Nilay Patel
Every time we talk about AI, we get one big piece of feedback that I really want to dive into: how the lightning-fast explosion of AI tools affects the climate. AI takes a lot of energy, and there’s a huge unanswered question as to whether using all that juice for AI is actually worth it, both practically and morally. It’s messy and complicated and there are a bunch of apparent contradictions along the way — so it’s perfect for Decoder. Verge senior science reporter Justine Calma joins me to see if we can untangle this knot.

Links:
This startup wants to capture carbon and help data centers cool down | The Verge
Google’s carbon footprint balloons in its Gemini AI era | The Verge
Taking a closer look at AI’s supposed energy apocalypse | Ars Technica
AI is exhausting the power grid. Tech firms are seeking a miracle | WaPo
AI Is already wreaking havoc on global power systems | Bloomberg
What do Google’s AI answers cost the environment? | Scientific American
AI is an energy hog | MIT Tech Review
Microsoft’s AI…
·overcast.fm·
AI is learning how to lie — Marketplace Tech
Large language models go through a lot of vetting before they’re released to the public. That includes safety tests, bias checks, ethical reviews and more. But what if, hypothetically, a model could dodge a safety question by lying to developers, hiding its real response to a safety test and instead giving the exact response its human handlers are looking for? A recent study shows that advanced LLMs are developing the capacity for deception, and that could bring that hypothetical situation closer to reality. Marketplace’s Lily Jamali speaks with Thilo Hagendorff, a researcher at the University of Stuttgart and the author of the study, about his findings.
·overcast.fm·
New ED Guidelines for Designing Trustworthy AI Tools in Education -- Campus Technology
The United States Department of Education recently released a new guide that seeks to inform ed tech developers as they create AI products and services for use in education. We spoke with Kevin Johnstun, education program specialist in ED's Office of Educational Technology, about the ins and outs of the report and what it means for education institutions.
·campustechnology.com·
Rethinking Assessment in the Age of AI
This video demonstrates how AI tools are handling complex academic tasks, from answering quiz questions to writing essays and creating presentations. I’ll share data on student AI adoption, show live examples of AI capabilities, and discuss the challenges in detecting AI-generated work. The presentation concludes with strategies for adapting our teaching and assessment methods to this rapidly evolving landscape. It’s a candid look at how AI is reshaping education and what it means for the future of our programs.

00:00 - Introduction
00:16 - Student AI Adoption
01:03 - AI Aces Multiple Choice
01:15 - AI Tackles Short Answers
01:40 - AI Generates Essays
02:20 - AI Creates Presentations
03:37 - The Limits of AI Detection
04:38 - AI’s Exponential Growth
05:06 - Strategies for Adapting to AI

Music: Unclean Machine by Burn Water http://burnwater.bandcamp.com
·youtube.com·
What Does Automating Feedback Mean for Learning?
This post is the third in the Beyond ChatGPT series about generative AI’s impact on learning. In the previous posts, I discussed how generative AI has moved beyond text generation and is starting to affect critical skills like reading and note-taking. In this post, I’ll cover how the technology is marketed to students and educators as a way to automate feedback. The goal of this series is to explore AI beyond ChatGPT and consider how this emerging technology is transforming not simply writing, but many of the skills we associate with learning. Educators must shift our discourse away from ChatGPT’s disruption of assessments and begin to grapple with what generative AI means for teaching and learning.
·marcwatkins.substack.com·