EdTech
Education leaders predict 2025 will require stronger teacher and state-level leadership in edtech implementation, improved data collection and usage, modernisation of college systems, and greater focus on AI ethics and literacy. Key areas include professional development, cybersecurity, data privacy, and addressing the end of emergency relief funding.
The global environmental impact of AI in education emerges as a critical consideration that extends beyond immediate classroom concerns. The shift from local to planetary implications raises questions about sustainability and resource consumption in educational technology deployment.
"We need to move past surface-level concerns like plagiarism," says Sean Michael Morris, vice president of academics at Course Hero, "and address the more pressing ethical dilemmas AI presents — its environmental costs, its cultural impact and how it reshapes our understanding of intelligence."
A proposed Australian law to ban social media for under-16s has sparked debate, with NZ public health researcher Samantha Marsh supporting similar legislation in New Zealand, while Australian academics oppose the ban, arguing for safety standards instead.
The tension between protecting youth mental health and maintaining digital inclusion creates complex policy challenges. The debate centres on enforcement practicality, platform accountability, and whether age verification technology can effectively regulate access while preserving beneficial social connections for vulnerable youth.
"The idea that platforms will self-regulate is ridiculous. It's in their interest not to. They benefit themselves from surveilling everyone ... they scrape the personal data and then they sell it on to others, and then it's used to hit you with advertisements," says Professor Judith Bessant from RMIT University.
"The consequence is that educational futures are being defined by actors whose speculative attention is primarily attuned to concerns of capital gain from edtech assets and private rent extraction rather than the future of education as a public good."
The integration of real-time analysis and customisation capabilities sets AI tools apart from conventional dyslexia interventions. The technology can adapt instantly to individual learning patterns, offering tailored support like auditory options with adjustable pacing and visual structuring of information - features that weren't possible with traditional accommodations.
The article discusses transforming educational assignments in response to AI, advocating for a shift from traditional assessment methods to tasks that emphasise critical thinking, collaboration, and real-world application. It provides practical examples of reimagined assignments.
The article's approach to AI as a partner rather than a threat represents a pragmatic evolution in educational thinking. The concept of testing assignments with AI tools to gauge their effectiveness, coupled with concrete examples of transforming traditional tasks into dynamic learning experiences, offers practical solutions for educators navigating this technological transition.
"When Google became mainstream, we said, 'If your question is 'Google-able,' it's not a good question.' The same principle applies to generative AI: If the assignment is 'AI-able,' it's not a good assignment."
The edtech industry faces challenges as institutions reassess tech spending amid post-COVID budget constraints, investor pullback, and growing public scepticism of Silicon Valley. The article examines how edtech companies must adapt to survive in an environment where technology alone is no longer seen as inherently valuable.
As part of the Quebec government's work on the use of artificial intelligence (AI) in schools, Education Minister Bernard Drainville launched a guide for teaching staff on Wednesday.
The document, entitled “The pedagogical and ethical use of generative artificial intelligence (GAI)”, sets out the criteria for the pedagogical, ethical and legal use of AI.
Drainville believes that the subject is unavoidable, since “artificial intelligence is now an integral part of our reality, including in our schools.”
We’re inviting leaders like you to help shape the next wave of research in digital learning. This survey aims to gather insights on what research would be most valuable for your online and hybrid programs. Your feedback will directly guide researchers in producing actionable data that supports your efforts and proves the effectiveness of your practices for students.
AI software connects to school systems to automatically send personalised, multi-language texts to parents within minutes of student absences, tracking responses and flagging issues for staff follow-up. The system creates attendance profiles and intervention plans to address rising chronic absenteeism, which has increased from 15% to 26% since 2018.
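The article describes this workflow only at a high level, so the sketch below is a hypothetical illustration of the loop it outlines (pull today's absences, text guardians in their preferred language, flag chronic cases for staff follow-up), not the vendor's implementation. All names and thresholds here (Student, send_sms, the 10% chronic-absence cutoff) are assumptions for illustration.

```python
# Hypothetical sketch of an absence-notification loop; a real system would pull
# attendance from the school's SIS and send texts through an SMS gateway.
from dataclasses import dataclass
from datetime import date

@dataclass
class Student:
    name: str
    guardian_phone: str
    language: str        # guardian's preferred language code, e.g. "es"
    absences: int = 0    # running total used for the attendance profile

# Hypothetical message templates keyed by language code.
TEMPLATES = {
    "en": "{name} was marked absent today ({day}). Reply 1 to excuse, 2 to talk to staff.",
    "es": "{name} fue marcado ausente hoy ({day}). Responda 1 para justificar, 2 para hablar con el personal.",
}

def send_sms(phone: str, body: str) -> None:
    # Placeholder for an SMS gateway call in a production system.
    print(f"SMS to {phone}: {body}")

def notify_absences(absent_today: list[Student], school_days_so_far: int) -> list[Student]:
    """Text guardians about today's absences and flag chronic cases for staff."""
    flagged = []
    for student in absent_today:
        student.absences += 1
        template = TEMPLATES.get(student.language, TEMPLATES["en"])
        send_sms(student.guardian_phone, template.format(name=student.name, day=date.today()))
        # A common definition of chronic absenteeism: missing 10%+ of school days.
        if student.absences / school_days_so_far >= 0.10:
            flagged.append(student)
    return flagged

if __name__ == "__main__":
    roster = [Student("Ana R.", "+1555010001", "es", absences=8),
              Student("Ben K.", "+1555010002", "en", absences=2)]
    for s in notify_absences(roster, school_days_so_far=80):
        print(f"Flag for staff follow-up: {s.name} ({s.absences} absences)")
```

In this sketch the "intervention plan" is reduced to a simple flag for staff; the products described in the article presumably layer response tracking and case management on top of the same basic loop.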
Online education company Chegg has seen its stock plummet 99% since 2021, losing $14.5 billion in value and 500,000 paid subscribers. The company, known for textbook rentals and homework help, is struggling to pay its debts as students abandon its $20 monthly subscription service in favour of ChatGPT.
Educational institutions face increasing cybersecurity threats, with 80% of K-12 schools experiencing ransomware attacks in 2022. Schools collect extensive personal data but often lack robust security measures. The MOVEit ransomware attack in 2023 affected 800+ educational organisations, compromising 1.7 million individuals' data.