The rise of AI (artificial intelligence) and generative AI, which creates new and unique content rather than simply analyzing and acting on existing data, traces back to the founding of OpenAI in 2015. An innovative new tool, ChatGPT, has since emerged and taken the world by storm. Within five days of its release, ChatGPT attracted over a million users globally, and it has rapidly gained popularity across fields, especially in education. This new technology is again stirring up the AI versus education debate; some see it as a threat to the current…
Alpha School, which purports to teach children academics using AI for just two hours a day, has won the support of the Trump administration but leaves some education experts and parents unimpressed.
The Impact of AI on Work in Higher Education, an effort conducted in partnership with AIR, NACUBO, and CUPA-HR, summarizes work-related institutional AI strategies, policies, and guidelines; the risks, opportunities, and challenges associated with using AI for work in higher education; and specific examples of how staff and faculty use—and want to use—AI for work.
The Impact of AI on Work in Higher Education report provides a valuable, institution-level perspective on how AI is reshaping faculty and staff work beyond the classroom. I find it especially useful that the report balances opportunities such as increased efficiency, improved decision-making, and reduced administrative burden with clear acknowledgment of risks like workforce disruption, policy gaps, and ethical concerns. The inclusion of concrete examples of how employees are already using (and want to use) AI makes the findings feel grounded and actionable rather than speculative. What stands out most is the emphasis on the need for clear strategies, guidelines, and shared governance as AI becomes embedded in everyday institutional operations. Overall, the report reinforces that AI’s impact on higher education work will depend less on the technology itself and more on how thoughtfully institutions support, train, and protect the people doing the work.
This article presents a thoughtful and practical vision for how AI can support instructors in higher education without undermining the human core of teaching. I appreciate how it frames AI not as a replacement for pedagogical expertise, but as an efficiency tool that frees instructors to focus on higher-value work like mentoring, engagement, and inclusive learning design. The emphasis on evidence-based tools such as Consensus and research-mapping platforms stands out, as it reinforces the importance of grounding instructional decisions in scholarship rather than convenience. I also find the repeated attention to ethical considerations, especially data privacy, bias, and contextual judgment, both necessary and responsible in today’s AI-saturated landscape.
Artificial intelligence is transforming workplaces and emerging as an essential tool for employees across industries. The dilemma: Universities must ensure graduates are prepared to use AI in their daily lives without diluting the interpersonal, problem-solving, and decision-making skills that businesses rely on.
The Typology of Generative AI Tools for Education provides educators with a list of generative AI tools currently used in educational contexts, arranged into nine categories. This typology follows the previously published Typology of Free Web-based Learning Technologies (Bower & Torrington, 2020) and the Typology of Web 2.0 Learning Technologies (Bower, 2015), representing the evolution of educational technology into the generative AI era. To create this typology, 211 educators from nine countries, spanning early childhood through higher education, completed a survey in late 2025 about their generative AI tool usage. Tools reported by two or more educators were included in the typology. A total of 50 unique AI educational tools were included and arranged into nine overarching categories: General-Purpose Large Language Models, Image Creation Tools, Audio and Music Generation Tools, Video Generation Tools, Presentation Generation Tools, Research and Study Tools, AI Tutoring and Chatbots, Custom Education Tools, and Other Miscellaneous Tools. Brief descriptions and links are provided for each tool to support educators in making informed decisions about which tools might suit their teaching and learning contexts. The generative AI landscape is rapidly evolving, and some tools offer functionality across multiple categories. This typology represents a snapshot of educator-reported usage patterns in 2025-2026 and offers educators a touchstone for the appropriate selection of generative AI technologies in their teaching.
The promise of artificial intelligence in higher education isn't to replace human work but to create space for the human interaction students value most.
Unofficial AI use on campus reveals more about institutional gaps than misbehavior.
The institutions that thrive in the AI era will be the ones that recognize shadow AI for what it is: a signal. Shadow AI won't disappear with stricter rules. It will disappear when the sanctioned path is better than the workaround.
After turning off ChatGPT’s ‘data consent’ option, Marcel Bucher lost the work behind grant applications, teaching materials and publication drafts. Here’s what happened next.
Pulled out of class, held back after school and forced to prove they're not AI cheats, students say NSW high schools are pitting them against faulty AI detectors.
This whitepaper is a follow-up to the Australasian Council of Open and Digital Education (ACODE) survey conducted in 2024 on the governance of artificial intelligence (AI) and data in Australasian higher education (Selvaratnam, Ames & Leichtweis, 2024). Last year's results showed that the sector was in the earlier stages of maturity, while the latest survey, conducted in the second half of 2025, shows growth in operationalising AI. The survey assesses the extent to which institutions have advanced their AI strategies, promoted social and emotional well-being and psychological safety, and strengthened ethical and data governance. To this end, the JISC AI Maturity Model for Education is used to gauge the sector's growth in the governance of AI and data, both in policy and practice. The outcomes show that the sector has made progress in the last 12 months; the main challenges were resourcing constraints and a lack of systemic governance.