New research at The University of Texas at Arlington will analyze massive scale data traces from online work and learning communities to create new designs for networked learning and next generation knowledge building on the internet.
The internet today is characterized by the convergence of ubiquitous connectivity, networked computing, and growing intelligence through machine learning and artificial intelligence. The data sets used range from social networking sites, medical devices, telescopes, and satellites to emails, streaming data, and financial and commercial transactions.
Earlier this week my Ithaka S+R colleagues and I published “Student Data in the Digital Era: An Overview of Current Practices,” in which we review how institutions of higher education are currently using student data, and some of the practical and ethical challenges they face in doing so. As we conducted research for this report, part of our Responsible Use of Student Data in Higher Education project with Stanford University, we heard recurring concerns about the growing role of for-profit vendors in learning analytics. These third-party vendors, the argument goes, operate without the ethical obligations to students that institutions have, and design their products at a remove from the spaces where learning happens.
Students who frequently check their grades throughout the semester tend to get better marks than do those who look less often.
That’s one of the findings from a new study by Blackboard, a company that sells course-management software to hundreds of colleges. It’s probably one of the deepest data dives ever done on student clicks on college web systems, analyzing aggregate data from 70,000 courses at 927 colleges and universities in North America during the spring 2016 semester.
Colleges and universities are doubling down on learning analytics. They're trying to figure out how to better use the rich data they're increasingly capturing about their students and how to improve our collective understanding of the impact of analytics on teaching and learning.
CT covered Hobsons' January 2016 announcement of its acquisition of the PAR (Predictive Analytics Reporting) Framework. Here, we talk with Ellen Wagner, Chief Research Officer for the PAR Framework and VP for Research at Hobsons, for a brief update on PAR's work after its first eight months with Hobsons.
Digital assessments have long been an effective means for freeing up instructors' time, particularly in blended learning settings, as well as for providing immediate formative feedback.17 Building on this work is the move to authentic assessment, to approaches in which humans and machines work in concert to quickly and accurately assess and provide feedback on student problems, where data is integrated from very diverse sources, and where data is collected longitudinally.18
With this shift we have, for the first time, data about virtually all aspects of students' skills, including the complex abilities that higher education attempts to foster—abilities that, in the modern economy, are more important than simple factual knowledge.19 We have the potential to assess postsecondary learners in ways that can improve depth, frequency, and response time, possibly expanding the scope with which students and instructors can monitor learning, including assessment of higher-level skills, and providing personalized feedback based on those assessments. However, the tools for understanding this data (e.g., edX ORA, Insights, EASE, and Discern) are still in their infancy. The grand challenge in data-intensive research and analysis in higher education is to find the means to extract such knowledge from the extremely rich data sets being generated today and to integrate these understandings into a coherent picture of our students, campuses, instructors, and curricular designs.
The grand challenge in data-intensive research and analysis in higher education is to find the means to extract knowledge from the extremely rich data sets being generated today and to distill this into usable information for students, instructors, and the public.
Technological leaders must draw on the strengths of both the proponents and the skeptics in our communities to ensure that institutional mechanisms are in place to examine the overall efficacy of learning analytics systems.
If I were into scrying (the art of predicting the future by gazing into a crystal ball), I would prophesy that EDUCAUSE Review readers will have two equal and opposite reactions on seeing an issue devoted to predictive analytics. The first reaction might be: "Are we still talking about how to use predictive analytics?" And the second reaction might be: "I wonder what predictive analytics we are using on our campus." We are all accustomed to tracking technologies that are emerging or that may seem to be more hype than substance, but what do we make of technologies like analytics? Here is a combination of tools and practices whose fundamental value is rarely questioned but that have not achieved the traction we might have expected by now. This issue of EDUCAUSE Review is a timely consideration of the state of predictive (and other) analytics across higher education: How are these tools and practices being used, how can they be better used, and how can institutions understand their own progress?
Do you want to be more reflective in your teaching practice and wonder if there are technologies that can help? Are you curious about how data-driven, evidence-based teaching practices can improve your students’ learning? This is the course for you!
Analytics for the Classroom Teacher is an introduction to the emerging field of teaching and learning analytics from the perspective of a classroom teacher.
Experts from all over the world will provide an overview of the current state-of-the-art in teaching and learning analytics. You’ll learn how teachers, curriculum developers and policy makers are collecting and analysing data from the classroom to help guide decisions at all levels.
The course will then focus on the school teacher, and how data analytics can help you to make improvements in your classroom.
You’ll learn to use analytics to improve your lesson plans and your delivery of those plans, and discover more about your students' learning.
No previous knowledge of data-driven instruction or teaching and learning analytics is needed. Join us and a large community of innovative teachers from around the globe and become a pioneer of teaching and learning analytics in your school.
Four years after the launch of edX, the data generated by massive open online courses still mystifies many institutions. Could inter-university collaboration unlock the secrets to better course delivery?
As college students click, swipe and tap through their daily lives – both in the classroom and outside of it – they're creating a digital footprint of how they think, learn and behave that boggles the mind.
In short, we want educational predictions to be wrong. If our predictive model can tell that a student is going to fail, we want that to be true only in the absence of intervention. If the student does in fact fail, that should be seen as a failure of the system. A predictive model should be part of a prediction-and-response system that (1) makes predictions that would be accurate in the absence of a response and (2) enables a response that renders the prediction incorrect (e.g., to accurately predict that, given a specific intervention, the student will succeed). In a good prediction-and-response system, all predictions would ultimately be negatively biased. The best way to empirically demonstrate this is to exploit random variation in the assignment of the system—for example, random assignment of the prediction-and-response system to some students but not all. This approach is rarely used in residential higher education but is newly enabled by digital data.
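The randomized-assignment design described above can be sketched in a small simulation. Everything here is synthetic: the pass probabilities and the intervention effect are invented assumptions, chosen only to show how random assignment reveals that the response renders the "will fail" prediction negatively biased.

```python
import random

random.seed(0)

def simulate_student(at_risk: bool, intervened: bool) -> bool:
    """Return True if the student passes. Synthetic model: an intervention
    raises an at-risk student's pass probability from 0.35 to 0.65."""
    p = 0.9 if not at_risk else (0.65 if intervened else 0.35)
    return random.random() < p

# Randomly assign the prediction-and-response system to roughly half the
# at-risk students; the other half serve as a comparison group.
treated, control = [], []
for _ in range(10_000):
    if random.random() < 0.5:
        treated.append(simulate_student(at_risk=True, intervened=True))
    else:
        control.append(simulate_student(at_risk=True, intervened=False))

pass_treated = sum(treated) / len(treated)
pass_control = sum(control) / len(control)

# If the response works, treated students pass more often than the model
# predicted: the prediction is "wrong" in exactly the way we want.
print(f"pass rate with intervention:    {pass_treated:.2f}")
print(f"pass rate without intervention: {pass_control:.2f}")
```

The control group estimates what would have happened without the system, which is what makes the comparison credible; without random assignment, the effect of the intervention is confounded with whatever drove students into the treated group.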
We sacrifice control in the name of convenience. As we become more like cyborgs, we should expect more control over our technology. Tech has long aimed to provide additional conveniences for modern living, with the idea that a gadget would take care of something for us. The premise is that our lives are made easier when we worry less about the small stuff, stepping aside to allow technology to do the grunt work. But the more we step aside, the less involved we are, and the less we control our environment, our information, our lives. We are giving algorithms control over increasingly complex aspects of our lives.
The idea of using an algorithm to care for humans has received popular attention recently with the case of a driver who died when his Tesla Model S drove underneath a semi that was crossing his lane. The car was in autopilot mode, with assistive radar and cameras activated; the driver died when the top of his car was sheared off by the underside of the semi trailer. Now begins the blame-aversion game that will become increasingly common as automation takes over automobiles: The car maker says autopilot is an assist feature and that the fault lies with the driver. Consumer Reports says the name “autopilot” suggests autonomy and that the fault lies with the software system. The driver — the one person directly affected by the incident — cannot share his take on things.
Although nudging in small doses makes a difference, nudging is no panacea for all of the complex problems found in higher education. There are few studies that evaluate the overall effectiveness of nudging in changing behaviors and sustaining impact.6 Some studies even note the adverse effects of nudging.7 Like anything else in life, knowing when to use nudging — and when enough is enough — can be a challenge.
The answer is not simple. Perhaps the deepest concern lies in the definition of the problem and in who decides the direction of nudges. Nudging can easily become shoving or smacking. Obviously, the intentions behind most higher education practices are pure, but with new technologies, we need to know more about the intentions and remain vigilant so that the resulting practices don’t become abusive. The unintended consequences of automating, depersonalizing, and behavioral exploitation are real. We must think critically about what is most important: the means or the end.
With the transformative nature of new capabilities, we should explore both the opportunities and the threats associated with nudging in higher education. This is especially true at a time when academic credentials beyond the high school diploma are needed to acquire entry-level jobs, when colleges and universities are experiencing retention challenges, and when funding for higher education is decreasing. Nudging, used wisely, offers a promising opportunity to redirect students’ decisions and to contribute to the success of those students facing the steepest barriers.
This article is drawn from the recent research by the EDUCAUSE Center for Analysis and Research (ECAR) and Gartner researchers on the state of analytics in higher education. This research explores the analytics trends as well as future predictions for the deployment of analytics technologies. Publications include The Analytics Landscape in Higher Education, 2015; Institutional Analytics in Higher Education; and Learning Analytics in Higher Education. More information about the analytics maturity index and deployment index can be found in the EDUCAUSE Core Data Service (participating) and the EDUCAUSE Benchmarking Service.
Even though the work required to yield significant results in institutional analytics is hard and the journey long, doing nothing is no longer an option for most higher education institutions.