New analytics tools from uClass aim to give users a better picture of which instructional materials their teachers are using, and whether those resources are effective.

Learning analytics have become a key feature of many school software programs. These tools can help educators understand trends and patterns in student learning, allowing them to target their instruction more effectively and improve achievement.
Most of these tools focus on analyzing student performance—but what if educators had tools that could measure the effectiveness of the instructional resources they’re using as well?
That’s the idea behind new analytics tools developed by an ed-tech company called uClass.
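To make the idea concrete, here is a minimal, hypothetical sketch of how a tool might rank instructional resources by student outcomes. The data, field names, and gain-based scoring are illustrative assumptions, not uClass's actual method.

```python
# Hypothetical sketch: rank instructional resources by the average
# pre/post score gain of students who used them. All data is invented.
from collections import defaultdict

records = [
    # (resource_used, pre_test_score, post_test_score)
    ("video_a", 55, 70),
    ("video_a", 60, 78),
    ("worksheet_b", 58, 62),
    ("worksheet_b", 50, 55),
]

gains = defaultdict(list)
for resource, pre, post in records:
    gains[resource].append(post - pre)

# Average gain per resource, sorted from most to least effective.
effectiveness = {r: sum(g) / len(g) for r, g in gains.items()}
for resource, avg_gain in sorted(effectiveness.items(), key=lambda kv: -kv[1]):
    print(f"{resource}: average gain {avg_gain:.1f} points")
```

A real system would need far more care (controlling for prior ability, sample size, and which students were assigned which resource), but the core computation is a comparison of outcome deltas grouped by resource.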
Analytics is the use of data, statistical analysis, and explanatory and predictive models to gain insights and to act on complex issues. Analytics has already transformed other sectors. In “10 Predictions for What the CIO Role Will Look Like in 2020,” John Brandon quotes CIO Oliver T. Bussmann: “[CIOs need] to earn a place in the C-suite by looking into the future of where the business is going, and determining the right technology roadmap to enable new business models, to improve margin, or otherwise give their company a leg up on the competition. In many cases this will mean leveraging in-memory computing and analytics to react to and capitalize on trends in real-time.”
Efforts to boost the nascent field of learning analytics could bring about a sea change in education, making it possible to personalize students' learning, on a massive scale, according to their individual interests and needs. That is the conclusion of a comprehensive report involving experts from academia, business, nonprofits, foundations, and government, written by Roy Pea, the David Jacks Professor of Education and Learning Sciences at the Stanford Graduate School of Education.
Designing a textbook or lecture with the average student in mind may sound logical. But L. Todd Rose, who teaches educational neuroscience at Harvard University’s Graduate School of Education, argues that doing so means that the lesson is designed for nobody.
Utilitarian concerns increasingly drive machine ethics, particularly the few ethical discussions in learning analytics. Determining the most good for the most people is a responsible way forward, but perhaps not the most comprehensive, especially when assessing potential outcomes. Innovations in learning analytics may occur more rapidly and with better outcomes if informed ethical discussions occur at every step of development.
Gamification, the idea that game mechanics can be integrated into ostensibly "non-game" contexts, has gained ascendance among champions of marketing, behavior change, and efficiency. Ironically, some of the most heated critique of gamification has come from the broader community of "traditional" videogame developers. Connecting broadly to projects surrounding "big data" and algorithmic surveillance, the project of gamification continues to expand and intensify. This paper examines the complex relationship between game designers and the rise of arguments in support of gamification. I analyze the various actors and interests mobilizing those arguments, deconstructing their underlying assumptions about the relationship between games and social phenomena. Turning to an analytic framework rooted in the Assemblage of Play (Taylor 2009) and emergent coercive forms of (played) control (Taylor 2006), the essay critiques assumptions on either side of the debate about the role of games and play. The strained connections between debates on gamification and broader interest in serious games offer an important moment to explore algorithmic surveillance.
Stanford calls for education data science field; presents national road map to analytics success
In a truly comprehensive report, with advice and suggestions from over 800 teachers and administrators, a Stanford-led national Workgroup calls for a new movement in learning analytics—redefining the field and offering a seemingly scalable road map to success.
The Workgroup, report, and road map were developed because, as Roy Pea, the David Jacks Professor of Education and Learning Sciences at the Stanford Graduate School of Education and lead investigator of the report, explains, the technology behind analytics has progressed faster than education's ability to use it.
Concern was also expressed about the metrics being used for analytics: how accurate and appropriate are they, and could it be dangerous to base tutor interventions on metrics that portray an incomplete picture of student activity?
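The risk of incomplete metrics can be illustrated with a small, hypothetical example. The student records, thresholds, and rules below are invented for illustration; they are not drawn from any real analytics product.

```python
# Hypothetical illustration: an "at risk" flag based on login counts
# alone can mislabel a student who does strong work offline.
# All data and thresholds are invented.
students = {
    "alice": {"logins": 2, "assignment_avg": 88},   # studies offline, does well
    "bob":   {"logins": 25, "assignment_avg": 41},  # logs in often, struggles
}

LOGIN_THRESHOLD = 5
GRADE_THRESHOLD = 60

def at_risk_logins_only(s):
    # Single-metric rule: few logins means "at risk".
    return s["logins"] < LOGIN_THRESHOLD

def at_risk_combined(s):
    # Combining signals gives a fuller (still imperfect) picture.
    return s["logins"] < LOGIN_THRESHOLD and s["assignment_avg"] < GRADE_THRESHOLD

print(at_risk_logins_only(students["alice"]))  # True: flagged despite strong work
print(at_risk_combined(students["alice"]))     # False: grades tell another story
```

The point is not that the combined rule is correct, but that any intervention triggered by a single proxy metric inherits that metric's blind spots.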
A number of the participants had already been thinking in detail about how to tackle these issues. There was a consensus that learning analytics should be carried out primarily to improve learning outcomes and for the students’ benefit. Analytics should be conducted in a way that would not adversely affect learners based on their past attainment, behaviour or perceived lack of chance for success. The group felt that the sector should not engage with the technical and logistical aspects of learning analytics without first making explicit the legal and ethical issues and understanding our obligations towards students.
This special issue of the eLearning Papers is based on the contributions made to the EMOOCS 2014 conference, jointly organized by the École Polytechnique Fédérale de Lausanne (EPFL) and P.A.U. Education. The success of this conference, with more than 450 participants, demonstrates that MOOCs are the beginning of a wave and a first step toward opening up education.
An introduction to the logic and methods of analysis of data to improve teaching and learning.
About this Course
Capturing and analyzing data has changed how decisions are made and resources are allocated in businesses, journalism, government, and military and intelligence fields. Through better use of data, leaders are able to plan and enact strategies with greater clarity and confidence. Data drives increased organizational efficiency and a competitive advantage. Simply put, analytics provides new insight and actionable intelligence.
The Student Success System develops intervention strategies to provide students with personalised support, leading to higher student retention and achievement.
Melbourne, Australia, September 02, 2014 – D2L (Desire2Learn), the learning technology company that created Brightspace, the world’s first truly integrated learning platform (ILP), takes student engagement and achievement rates at the University of Tasmania to a new level with the Student Success System (S3), a flagship solution in the D2L analytics portfolio.
Valid arguments exist for students to control data about themselves, and similarly plausible arguments suggest that the institution can claim ownership. This article explores both perspectives. To avoid win-lose solutions, institutions acting as "information fiduciaries" can reap the benefits of analyzing student data while respecting student rights.
At the CT 2014 conference this week in Boston, one former CIO told attendees that education is not "about gathering terabytes of data and asking it to tell me the patterns." Instead, he argued for the potential of "small data" to create personalized learning experiences that cut down on student frustration and confusion.