Discover how using AI to teach critical thinking in higher education turns AI errors into powerful classroom strategies that build information literacy and academic judgment.
Scooped by EDTECH@UTRGV onto Educational Technology News · April 3, 11:59 AM
"When AI gets things wrong, it creates powerful teachable moments. Giving students an AI-produced answer that contains mistakes pushes them to slow down, test claims, and fix problems."
Scooped by EDTECH@UTRGV · April 24, 9:41 AM
In higher education, the most pressing challenge is not AI itself, but the underlying pedagogy gap masked by traditional instructional models.
"AI has acted as a catalyst, exposing this pedagogy gap by demonstrating that while machines can replicate the transfer of information with startling efficiency, they cannot replace the structured design required for true education."
Scooped by EDTECH@UTRGV · April 24, 9:38 AM
"With critical thinking skills on the line I built a real-time AI collaborator, Thia — with vision and voice capabilities to keep early ideas raw, the loop tight, and the thinking mine."
"The only way to retain critical thinking skills is to use them, by keeping them sharp — that means embracing cognitive friction and not simply rushing to the finish line.”
Scooped by EDTECH@UTRGV · April 24, 9:29 AM
"Micro-credentials may create added value for students and employers, but students frequently perceive them as traditional credentials repackaged. The questions students ask do more than identify information needs; they also illuminate how effectively our institutions are defining and communicating what micro-credentials are and what they do."
"What exactly is a micro-credential? When students ask this question, it shows that institutions aren’t being clear about what’s on offer."
Scooped by EDTECH@UTRGV · April 23, 8:38 AM
"We are living through a fundamental shift in what work is for. As AI takes on more routine cognitive tasks, the uniquely human capacity to imagine, connect, and create meaning becomes the primary source of organizational value. Yet most companies are still measuring performance metrics prioritized for a different era: inventory turnover, cost per lead, and utilization rates."
"These metrics were designed to optimize extraction. They are poorly equipped to cultivate imagination."
Scooped by EDTECH@UTRGV · April 23, 8:36 AM
From data privacy and staff readiness to classroom fit and long-term cost, here are the questions schools should ask before investing in AI.
"[B]efore investing in AI, school leaders should ask different questions. What problem are we actually trying to solve? Who will use it? What data will it store? Do we have the capabilities to use it effectively? How much control will we have? What happens if we get this wrong?"
Scooped by EDTECH@UTRGV · April 23, 8:31 AM
"In this edition of Author Talks, McKinsey Global Publishing’s Barr Seitz speaks with McKinsey Senior Partners Kate Smaje and Robert Levin, and Eric Lamarre, McKinsey senior partner emeritus and special adviser, about the second edition of Rewired (Rewired: How Leading Companies Win with Technology and AI, Wiley, April 2026). They discuss what has changed over the past few years, what it means to build organizational speed, and why the most important transformations are ultimately about people."
"A rewired organization is one that has graduated to truly impactful, distributed innovation across the organization."
Scooped by EDTECH@UTRGV · April 22, 1:41 PM
A new resource from CoSN provides guidelines for creating responsible technology use policies and supporting digital citizenship in schools and districts.
"CoSN promotes Responsible Use Agreements over traditional AUPs: The report recommends complementing compliance-based policies with RUAs that are clearer and more student- and family-friendly."
Scooped by EDTECH@UTRGV · April 22, 1:38 PM
"Higher education is having a familiar conversation in an unfamiliar moment. We are debating whether students “should” use AI, whether it is “ethical,” whether it is “cheating,” whether we can “ban” it, whether we can “detect” it, whether it will “go away.” This is what happens when an institution confuses discomfort with principle."
"In higher education, AI should be a human-centered tool inside a curriculum that expects more from students, not less."
Scooped by EDTECH@UTRGV · April 22, 1:34 PM
Learn three practical ways to humanize generative AI in the classroom and use AI to support student-centered learning, critical thinking, and differentiated instruction.
"AI is already changing the nature of academic work. It can write, summarize, and analyze with remarkable speed. But its most important contribution may be what it gives back to instructors, time to: design richer learning experiences, think more creatively about their teaching, and finding the students."
Scooped by EDTECH@UTRGV · April 22, 1:27 PM
"Scroll LinkedIn for ten minutes, and you’ll meet him. The Refuser. He writes his own emails. He does his own thinking. He wants you to know this. The post is usually a variation on the same beat: I asked ChatGPT to do X and look how bad it was. Imagine outsourcing your mind to this. The comments agree vigorously. Everyone feels better. Nothing has been learned."
"What AI does, for the person willing to use it well, is strip the proxy away and leave the actual variable exposed: what do you know, what do you notice, what do you care about enough to push back on?"
Scooped by EDTECH@UTRGV · April 22, 1:16 PM
Unsure how to spot AI-generated content? These tips can help.
"One of the biggest red flags is what I call the 'Wikipedia Voice,' or text that's grammatically perfect but completely soulless, relying on vague, over-the-top language that parrots the prompt back at me."
Scooped by EDTECH@UTRGV · April 21, 9:27 AM
Conversations with Kevin Hogan: Extron's Jason Bond explains how districts can start small with esports AV infrastructure, leverage dual-purpose spaces, and use AV over IP to build a scalable foundation for student engagement.
"You're going to use that esports space for something like graphics design or cybersecurity training during the day, and then in the evening or after school hours, it becomes the esports playing facility."
Scooped by EDTECH@UTRGV · April 24, 9:51 AM
When Artificial Intelligence (AI) Becomes the First Source of "Confidence and Trust" in Learning (Tools and Trends, March 16, 2026)
During a recent discussion, a colleague asked a retired educator: “What are the two feelings you most want your students to have toward you to be effective?” The educator answered immediately: “Confidence and trust. Not engagement. Not motivation. Not achievement. Confidence and trust.”
Scooped by EDTECH@UTRGV · April 24, 9:39 AM
School leaders must move beyond experimentation and build AI systems that prioritize governance, purpose, and data integrity.
"Becoming “AI-ready” isn’t about chasing the newest shiny platform; it requires districts to build intentional systems that guide how AI is evaluated, implemented, and governed."
Scooped by EDTECH@UTRGV · April 24, 9:31 AM
"Most trainers say they believe in learner experience. Far fewer actually design for it. They ask for introductions. Maybe they throw out an opening question. Maybe they invite people to “share from their background.” Then they move straight into the deck they were always going to use anyway. That is not learner-centered training."
"The first rule of AI in training: start with the learner, not the prompt"
Scooped by EDTECH@UTRGV · April 23, 8:39 AM
"Nearly two out of three American adults have used an AI-powered search tool in the past six months. But here’s the stat that should keep every product builder up at night: only 15% say they trust the results “a lot.”
That gap between adoption and trust is the defining challenge for the next era of AI search. Consumers are showing up, but they are questioning the results. As product builders, we have to ask ourselves an uncomfortable question: Are we building experiences that earn and deserve consumer trust?"
"More than half of consumers say AI search feels like a walled garden. Here’s what product builders need to do about it"
Scooped by EDTECH@UTRGV · April 23, 8:37 AM
This article shows how AI assessment guardrails can help educators use AI-generated assessments responsibly while protecting quality and trust.
"AI can speed up assessment creation, but without guardrails it can also introduce errors, bias, and weak alignment."
Scooped by EDTECH@UTRGV · April 23, 8:33 AM
After ChatGPT’s launch, the percentage of routine coding questions on an online forum fell sharply, while novel questions rose.
"[M]any of the routine coding questions that developers once posted on popular online forum Stack Overflow appear to have moved to AI tools, while the more novel problems still require human expertise."
Scooped by EDTECH@UTRGV · April 23, 8:27 AM
The Prosocial AI Index offers business leaders a practical, auditable way to assess whether their AI systems are genuinely good, writes Wharton’s Cornelia Walther.
"Most organizations are evaluating AI with the wrong instruments — efficiency metrics and ROI dashboards that capture what is easy to count but miss what truly matters for people, purpose, and planet."
Scooped by EDTECH@UTRGV · April 22, 1:40 PM
Passed in 1974, FERPA was never meant to govern cloud-based platforms, artificial intelligence, or the invisible flow of student data across third-party vendors. Our students deserve better.
"FERPA is outdated for modern ed tech use: Written in 1974 for paper records, FERPA does not address cloud platforms, AI, or widespread data sharing across digital tools."
Scooped by EDTECH@UTRGV · April 22, 1:36 PM
Educators must seize on the opportunity to coach students on effective and acceptable uses of AI that enhance literacy learning.
"There are clearly questions raised about where to draw lines around when and how students should use AI tools; however, the reality is that the tools are here and students are using them."
Scooped by EDTECH@UTRGV · April 22, 1:29 PM
The shift from AI possibility to measurable progress in higher education is underway. AI is not a future consideration.
"With few exceptions, institutions are no longer debating whether AI will transform the sector; instead, they are focused on how quickly they can translate its many possibilities into meaningful progress."
Scooped by EDTECH@UTRGV · April 22, 1:22 PM
Educators should not be competing with chatbots and large language models. Instead, a continuum can help them guide students from passive learning from AI to synthesising information alongside it.
"To preserve this intellectual agency, we need to stop treating AI as a monolithic tool and instead guide students through purposeful pedagogical engagement with it."
Scooped by EDTECH@UTRGV · April 21, 9:29 AM
"Even the most well-intentioned edtech can fall short if it does not meet students where they are. After several years studying the usability of edtech for teachers, the research team at ISTE+ASCD turned its attention to students — examining how the technical and pedagogical design of digital tools shapes their learning experiences."
"The findings identify five areas that matter most to students and offer guidance for educators and product designers seeking tools that are intuitive, meaningful and engaging."
Scooped by EDTECH@UTRGV · April 21, 9:26 AM
As large language models take over more and more cognitive tasks, researchers are warning this mental outsourcing comes with a cost.
"There is now a growing body of research suggesting that this "cognitive offloading" to AI can have a corrosive effect on our mental abilities. The consequences could be alarming and may even contribute to cognitive decline."
This article has some useful tips: https://www.facultyfocus.com/articles/teaching-with-technology-articles/when-ai-gets-it-wrong-a-pedagogical-approach/