Although nudging in small doses makes a difference, nudging is no panacea for all of the complex problems found in higher education. There are few studies that evaluate the overall effectiveness of nudging in changing behaviors and sustaining impact.6 Some studies even note the adverse effects of nudging.7 Like anything else in life, knowing when to use nudging — and when enough is enough — can be a challenge.
The answer is not simple. Perhaps the deepest concern lies in the definition of the problem and in who decides the direction of nudges. Nudging can easily become shoving or smacking. Obviously, the intentions behind most higher education practices are pure, but with new technologies, we need to know more about the intentions and remain vigilant so that the resulting practices don't become abusive. The unintended consequences of automation, depersonalization, and behavioral exploitation are real. We must think critically about what is most important: the means or the end.
With the transformative nature of new capabilities, we should explore both the opportunities and the threats associated with nudging in higher education. This is especially true at a time when academic credentials beyond the high school diploma are needed to acquire entry-level jobs, when colleges and universities are experiencing retention challenges, and when funding for higher education is decreasing. Nudging, used wisely, offers a promising opportunity to redirect students’ decisions and to contribute to the success of those students facing the steepest barriers.
This article is drawn from the recent research by the EDUCAUSE Center for Analysis and Research (ECAR) and Gartner researchers on the state of analytics in higher education. This research explores the analytics trends as well as future predictions for the deployment of analytics technologies. Publications include The Analytics Landscape in Higher Education, 2015; Institutional Analytics in Higher Education; and Learning Analytics in Higher Education. More information about the analytics maturity index and deployment index can be found in the EDUCAUSE Core Data Service (participating) and the EDUCAUSE Benchmarking Service.
Even though the work required to yield significant results in institutional analytics is hard and the journey long, doing nothing is no longer an option for most higher education institutions.
The college classroom has been impenetrable for those seeking to understand how students learn. This was more a function of the traditional methods of teaching and learning than a result of any intentional barriers. Student behaviors were fairly opaque—some note taking, possibly classroom discussion.
No tutoring algorithm should be based purely on interaction data. The nuts and bolts of students’ learning experiences – from the lesson they are given through to the choice of knowledge representation – should be based on proven pedagogical principles. It is vital that these principles are baked into tutoring algorithms from the start, and that engineers work closely alongside pedagogical experts throughout the creation process.
Tutoring algorithms have a natural counterpart in real-time progress reports. It is the reports that are fed to parents and educators, who are ideally positioned to uncover the story behind each student’s data. They must be empowered to do exactly that.
An algorithmic approach is not sufficient to serve our students. Joshua has met with success because his teachers are active agents in his learning journey. His progress data may act as a guide, but it is Joshua’s teachers who can interpret his data within his unique context and take the relevant actions. For instance, the reports may highlight sporadic usage patterns (and give precise meaning to terms like ‘sporadic’) but the additional support that is needed for Joshua is a judgement best left to his school.
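The point about giving precise meaning to terms like 'sporadic' can be made concrete. Below is a minimal sketch under one plausible definition, gap variability between study sessions measured by the coefficient of variation; the function names and threshold are illustrative, not drawn from any real reporting product:

```python
from datetime import date
from statistics import mean, stdev

def usage_gaps(session_dates):
    """Days between consecutive study sessions."""
    ordered = sorted(session_dates)
    return [(b - a).days for a, b in zip(ordered, ordered[1:])]

def is_sporadic(session_dates, cv_threshold=1.0):
    """Flag usage as 'sporadic' when gaps between sessions vary
    widely relative to their average (coefficient of variation
    above the threshold)."""
    gaps = usage_gaps(session_dates)
    if len(gaps) < 2 or mean(gaps) == 0:
        return False
    return stdev(gaps) / mean(gaps) > cv_threshold

# Regular weekly sessions: steady gaps, not sporadic.
regular = [date(2017, 1, d) for d in (2, 9, 16, 23, 30)]
# A burst of activity followed by a long silence: sporadic.
bursty = [date(2017, 1, 2), date(2017, 1, 3), date(2017, 1, 4),
          date(2017, 2, 20), date(2017, 2, 21)]
```

A rule like this only surfaces a pattern; as the article argues, deciding what support the pattern warrants remains a human judgement.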
Algorithms and data need not be the mechanical vices of data scientists – with the right intentions, they can uplift educators and amplify their efforts to meet the needs of every student. It is the combined potential of algorithms and human insight that will win the day.
Gartner Hype Cycle: What is it? What can you do with it? Does it provide reliable information? All this in response to edublogger Don South Chesterman's request for a critical analysis of the latest Hype Cycle for Education.
Dashboards in Education: the message center for digital learning? What are they? Why are teachers, school administrators, and directors asking for them? How can dashboards support differentiation and personalization? And how does the Hype Cycle apply here?
Also discussed: recent technology introductions, such as the Apple TV (which we have since tried out), the question of what an iPad Pro has to do with a Tesla, a dirt-cheap laptop from Lenovo, and a similarly cheap Lumia smartphone from Microsoft.
"Learning Analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."
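The four activities in this definition (measurement, collection, analysis, and reporting) can be sketched as a toy pipeline. Everything below is a hypothetical illustration, not any particular analytics product; the event names and the activity threshold are invented for the example:

```python
from collections import defaultdict

# Toy event log of (learner_id, action) pairs, standing in for
# the "measurement and collection" of data about learners.
events = [
    ("s1", "video_play"), ("s1", "quiz_submit"),
    ("s2", "video_play"),
    ("s1", "forum_post"), ("s2", "quiz_submit"),
]

def analyze(event_log):
    """Analysis step: count recorded activity per learner."""
    counts = defaultdict(int)
    for learner, _action in event_log:
        counts[learner] += 1
    return dict(counts)

def report(counts, low_activity=2):
    """Reporting step: flag learners with low activity so an
    instructor can follow up -- the 'optimizing learning' part."""
    return {learner: ("low activity" if n <= low_activity else "on track")
            for learner, n in counts.items()}
```

Real systems differ mainly in scale and in the sophistication of the analysis step, but the measure-collect-analyze-report shape is the same.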
Whether they like it or not, members of the educational technology community and faculty members should engage in debates about learning analytics algorithms, transparency and data access. That was the message delivered by George Siemens, executive director of the Learning Innovation and Networked Knowledge Research Lab (LINK) at the University of Texas at Arlington, during a recent Future Trends Forum discussion with futurist Bryan Alexander.
"You can hate something and still recognize that it is an important factor to think about," Siemens stressed. "You can think analytics are crap, but you can't deny they are influential. If we want a future that embodies values that are important to us, then we have to be active participants in the sociotechnical and economic spaces driving that change."
CT covered Hobsons' January 2016 announcement of its acquisition of the PAR (Predictive Analytics Reporting) Framework. Here, we talk with Ellen Wagner, Chief Research Officer for the PAR Framework and VP for Research at Hobsons, to get a brief update on the current work of PAR after its first eight months with Hobsons.
Digital assessments have long been an effective means for freeing up instructors' time, particularly in blended learning settings, as well as for providing immediate formative feedback.17 Building on this work is the move to authentic assessment, to approaches in which humans and machines work in concert to quickly and accurately assess and provide feedback on student problems, where data is integrated from very diverse sources, and where data is collected longitudinally.18
With this shift we have, for the first time, data about virtually all aspects of students' skills, including the complex abilities that higher education attempts to foster—abilities that, in the modern economy, are more important than simple factual knowledge.19 We have the potential to assess postsecondary learners in ways that can improve depth, frequency, and response time, possibly expanding the scope with which students and instructors can monitor learning, including assessment of higher-level skills, and providing personalized feedback based on those assessments. However, the tools for understanding this data (e.g., edX ORA, Insights, EASE, and Discern) are still in their infancy. The grand challenge in data-intensive research and analysis in higher education is to find the means to extract such knowledge from the extremely rich data sets being generated today and to integrate these understandings into a coherent picture of our students, campuses, instructors, and curricular designs.
The grand challenge in data-intensive research and analysis in higher education is to find the means to extract knowledge from the extremely rich data sets being generated today and to distill this into usable information for students, instructors, and the public.
Technological leaders must draw on the strengths of both the proponents and the skeptics in our communities to ensure that institutional mechanisms are in place to examine the overall efficacy of learning analytics systems.
If I were into scrying (the art of predicting the future by gazing into a crystal ball), I would prophesy that EDUCAUSE Review readers will have two equal and opposite reactions on seeing an issue devoted to predictive analytics. The first reaction might be: "Are we still talking about how to use predictive analytics?" And the second reaction might be: "I wonder what predictive analytics we are using on our campus." We are all accustomed to tracking technologies that are emerging or that may seem to be more hype than substance, but what do we make of technologies like analytics? Here is a combination of tools and practices whose fundamental value is rarely questioned but that have not achieved the traction we might have expected by now. This issue of EDUCAUSE Review is a timely consideration of the state of predictive (and other) analytics across higher education: How are these tools and practices being used, how can they be better used, and how can institutions understand their own progress with analytics?
A specific type of adaptive learning technology, intelligent adaptive learning technology, is able to assess what students know and seeks to offer what they need to achieve content mastery. This technology “learns the learner” as the learner engages with it, which enables nuanced, personalized and relevant data collection and reporting that gives teachers insight into how the learner is solving problems and thinking. In addition to tracking achievement, the technology identifies trends in student motivation and engagement so that teachers can optimize learning efficiency, learning tenacity and ultimately learning outcomes. While students are working through lessons, the technology analyzes every click, hesitation and answer in order to direct students based on what they need in the moment to ensure deep understanding of key concepts, develop fluency with important skills, and cultivate critical thinking.
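A heavily simplified sketch of the kind of mastery loop such a system runs on each answer. The update rule, thresholds, and function names below are illustrative stand-ins for the proprietary statistical models real adaptive engines use:

```python
def update_mastery(mastery, correct, step=0.2):
    """Nudge the mastery estimate toward 1 on a correct answer
    and toward 0 on an incorrect one (a crude stand-in for the
    models real adaptive engines fit to click and answer data)."""
    target = 1.0 if correct else 0.0
    return mastery + step * (target - mastery)

def next_activity(mastery):
    """Direct the student based on the current estimate."""
    if mastery < 0.4:
        return "reteach concept"
    if mastery < 0.8:
        return "more practice"
    return "advance to next concept"

# A student starts at an uncertain 0.5 estimate and answers:
# wrong, right, right, right.
m = 0.5
for correct in (False, True, True, True):
    m = update_mastery(m, correct)
```

Even this crude loop shows the "learns the learner" idea: every answer shifts the estimate, and the estimate decides what the student sees next.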
But as more colleges experiment, they're facing complex questions about what to do with the findings the data-crunching reveals.
What, if anything, should students be told about the judgments institutions are making about them from the data footprints they’re leaving behind? Should companies be able to profit from that data? And should students have the right to opt out of being monitored?
Just as a new medical finding can create standards by which doctors provide care to their patients, does having such information establish a new standard of care for colleges?
"We are entering a new era of data and data responsibility," says Mitchell Stevens, an associate professor in Stanford University’s Graduate School of Education who has long pushed for ethical standards around educational data that go beyond legal issues of privacy or security. In an era of ubiquitous data, he says, colleges need to decide: "Are we acting responsibly as educators? What values are we trying to pursue and preserve?"
Those were also some of the questions Mr. Stevens put front and center this month at a private convening of several dozen academics and a smattering of ed-tech company and foundation leaders.
Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement

Executive Summary

The analysis of data from user interactions with technologies is changing how organisations function, prioritise and compete in an international market. All industries have been influenced or impacted by the so-called digital revolution and the associated analysis of user data. In the higher education (HE) sector this wave of data analytics has flowed through to the concept of learning analytics (LA). This field of research has been touted as a game changer for education, whereby the outcomes of LA implementations will address core education challenges. These include concerns regarding student retention and academic performance, demonstration of learning and teaching quality, and developing models of personalised and adaptive learning. While there is broad consensus across the sector as to the importance of LA, there remain challenges in how such endeavours are effectively and efficiently rolled out across an organisation. The lack of institutional exemplars and resources that can guide implementation and build institutional capacity represents a significant barrier for systemic adoption.
This report seeks to unpack these challenges to institutional adoption and provide new insights that can aid future implementations of LA and help advance the sophistication of such deployments. The study does so by interrogating the assumptions underpinning the adoption of LA in the Australian University sector and contrasting this with the perspectives of an international panel of LA experts. The findings and recommendations highlight the need for a greater understanding of the field of LA including the diversity of LA research and learning and teaching applications, alongside the promotion of capacity building initiatives and collaborations amongst universities, government bodies and industry.
Organizations can break free of the restrictions that learning management systems dictate and release learners to experience learning beyond the classroom. Learning doesn't need to take place solely in front of a computer screen and keyboard. A system that tracks learning can capture informal learning, social learning, and real-world experiences and record those events alongside formal ones. When organizations can extend learning beyond an LMS and browser, employers can drive their employees' learning opportunities and harness untapped potential.
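Recording learning events from outside the LMS is the idea behind activity-statement standards such as the Experience API (xAPI), which log each event as an actor-verb-object statement. A minimal sketch of that shape; the real xAPI JSON schema is much richer, and the field names and sample events below are pared down for illustration:

```python
def statement(actor, verb, obj, context=None):
    """Build a minimal actor-verb-object record in the spirit of
    xAPI statements (simplified; a real Learning Record Store
    expects fuller identifiers and metadata)."""
    record = {"actor": actor, "verb": verb, "object": obj}
    if context:
        record["context"] = context
    return record

# Formal and informal events can sit side by side in one store.
store = [
    statement("maria", "completed", "Safety Training Module 3"),
    statement("maria", "attended", "peer mentoring session",
              context="informal"),
    statement("maria", "shared", "troubleshooting tip",
              context="social"),
]

# The informal events are first-class records, not an afterthought.
informal = [s for s in store if s.get("context") in ("informal", "social")]
```

Because every event has the same shape, a classroom quiz and a hallway mentoring session can be queried and reported together.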
The ScotXed Unit is part of Education Analytical Services Division within the Learning and Justice Directorate of the Scottish Government.
We support and develop a significant number of secure, efficient and effective electronic data exchanges between partners in the Scottish Government and wider service communities. ScotXed initially collected data solely for the Learning and Justice Directorate but is now being used more widely within the Scottish Government and its agencies. Data collections include the pupil and staff censuses, Looked After Children, Mental Health Benchmarking and Drug Treatment and Testing Orders to name a few. To view our extensive range of surveys, please see the links within the Data Collection Topics section below or the topics menu on the left hand side of this page.
We provide data to our analytical colleagues within the Scottish Government for national and international statistical publications. Analytical products contribute to the evidence base for policy development and policy making. We adhere to the principles of the National Code of Statistics and the Data Protection Act (1998) when collecting data for research and statistical purposes.
Kim Flintoff's insight:
System-wide use of student data and learning analytics offers the potential for more coordination of support and resources, assistance for students with progress and transitions, and other very positive social outcomes, with all the usual caveats around ethical use and data security.