At the CT 2014 conference this week in Boston, one former CIO told attendees that education is not "about gathering terabytes of data and asking it to tell me the patterns." Instead, he argued for the potential of "small data" to create personalized learning experiences that cut down on student frustration and confusion.
Student Learning and Analytics at Michigan (SLAM) is a speaker series sponsored by the Provost's Task Force on Learning Analytics. Presenters will focus on the analysis and use of data about students, courses, and academic programs for the purpose of improving teaching and learning.
Learning analytics holds increasing potential for student agency and autonomy, highlighting a need for ethical discourse at all levels of higher education institutions. Topics central to this dialogue include student awareness of analytics, the future of algorithms and learning analytics, and the redefinition of failure.
Did you know that 86% of organizations are focused on reporting, but only 15% of HR functions have strong analytics capabilities? This leads to a disconnect between Learning and Development (L&D) and Corporate within organizations.
Course redesign can be a major undertaking, but utilizing the data derived from your existing course can inform your decisions on which areas need to be targeted. When you combine the four factors I mentioned and use them to form a holistic, summative picture of your course redesign project, you can be certain that what is currently working in your course remains, and what is not working is revised.
MOOCs should be the Holy Grail of student data, but they aren't there yet.
One of the great promises of massive open online courses, besides making education more accessible for more students, is the treasure trove of student data collected on a grand scale.
Large amounts of data are exactly what higher education needs to stay relevant in this era of disruptive change, as Arizona State University's Adrian Sannier pointed out in his keynote at last year's Campus Technology annual conference. The only way to make sure colleges and universities are continually boosting student success, he said, is evidence-based pedagogy. And that requires scale: "You can't take evidence one class at a time, one person at a time — it takes too long, you don't get a broad enough sample…. I'm not sure you can do it at a university, at a single institution. You may not have enough scale, you may not have enough size."
Gamification, the idea that game mechanics can be integrated into assumed “non-game” circumstances, has gained ascendance amongst champions of marketing, behavior change and efficiency. Ironically, some of the most heated critique of gamification has come from the broader community of “traditional” videogame developers. Connecting broadly to projects surrounding “big data” and algorithmic surveillance, the project of gamification continues to expand and intensify. This paper examines the complex relationship between game designers and the rise of arguments in support of gamification. I analyze the various actors and interests mobilizing arguments, deconstructing their underlying assumptions about the relationship between games and social phenomena. Turning to an analytic framework rooted in the Assemblage of Play (Taylor 2009) and emergent coercive forms of (played) control (Taylor 2006), the essay critiques assumptions on either side of the debate on the role of games and play. The strained connections between debates on gamification and broader interest in serious games offer an important moment to explore algorithmic surveillance.
Universities have been recording data digitally about their students for decades. No one would seriously question the necessity of collecting facts for administrative purposes, such as a student’s name…
The University of Wisconsin-Green Bay's CIO is pursuing a track that marries institutional data, predictions and business rules to create prescriptions for student and campus success. Here are his 12 best practices for prescriptive analytics.
Massive open online course providers are collecting troves of data about their students, but what good is it if researchers can't use the information?
The MOOC Research Initiative formally released its results on Monday, six months after researchers met in Arlington, Texas, to brief one another on initial findings. The body of research -- 22 projects examining everything from how social networks form in MOOCs to how the courses can be used for remedial education -- can perhaps best be described as the first chapter of MOOC research, confirming some widely held beliefs about the medium while casting doubt on others.
Massive Open Online Courses (MOOCs) collect valuable data on student learning behavior; essentially complete records of all student interactions in a self-contained learning environment, with the benefit of large sample sizes. […]
• […] 76% of all participants were browsers who collectively accounted for only 8% of time spent in the course, whereas the 7% of certificate-earning participants averaged 100 hours each and collectively accounted for 60% of total time.
• Students spent the most time per week interacting with lecture videos and homework, followed by discussion forums and online laboratories;
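The percentages above imply a striking skew in per-student effort. As a back-of-envelope sketch, assuming a hypothetical cohort of 10,000 participants (the enrollment figure is an assumption, not from the study), the reported shares can be turned into implied hours:

```python
# Hypothetical illustration of the engagement skew reported above.
# The cohort size is assumed; the percentages come from the findings.
n = 10_000                            # assumed enrollment
earner_hours = (0.07 * n) * 100       # 7% earned certificates, ~100 hours each
total_hours = earner_hours / 0.60     # those hours were 60% of all course time
browser_hours = 0.08 * total_hours    # browsers' collective 8% share
avg_browser = browser_hours / (0.76 * n)  # per-browser average

print(round(total_hours))       # implied total time spent in the course
print(round(avg_browser, 1))    # implied hours per browser
```

Under these assumptions, certificate earners put in roughly 100 hours apiece while browsers averaged on the order of an hour, which is the disparity the bullet describes.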
While written and oral language dominate instruction, the explosion of visual information has created new opportunities to represent complexity, reveal themes, explore data, and communicate information in powerful ways. Here is an overview of some of my favorite examples of visual data representation for education.