Learning Analytics, like analytics in business, the web, or healthcare, can give the people in your organization's training department the information they need. Learning managers no longer have to rely on educated guesses to determine the most useful components of a training module; incorporating Learning Analytics tools within your enterprise LMS provides that insight far more easily and efficiently.
UK universities are monitoring students’ information to help them to improve their academic performance but are giving insufficient thought to the effectiveness of the technology they use and the rights of those they track.
This is according to Sharon Slade, senior lecturer in the Faculty of Business, Management and Law at The Open University, which is believed to have become the first institution in the UK to produce a publicly available written policy on the ethical use of student data for learning analytics – the practice of collecting and analysing student data with the intention of optimising their educational experience.
Work on a transparent ethical policy at The Open University highlighted concerns about technology and privacy
The initiative will be led by researchers at Carnegie Mellon University, who propose to construct a new data-sharing infrastructure that is distributed across multiple institutions, including third-party and for-profit vendors.
Stanford calls for education data science field; presents national road map to analytics success
In a truly comprehensive report, with advice and suggestions from over 800 teachers and administrators, a Stanford-led national Workgroup calls for a new movement in learning analytics—redefining the field and offering a seemingly scalable road map to success.
The Workgroup, its report, and the road map were developed because, as Roy Pea, the David Jacks Professor of Education and Learning Sciences at the Stanford Graduate School of Education and lead investigator of the report, explains, the technology behind analytics has progressed faster than education's ability to use it.
Concern was also expressed about the metrics being used for analytics – how accurate and appropriate are they and could it be dangerous to base interventions by tutors on metrics which portray an incomplete picture of student activity?
A number of the participants had already been thinking in detail about how to tackle these issues. There was a consensus that learning analytics should be carried out primarily to improve learning outcomes and for the students’ benefit. Analytics should be conducted in a way that would not adversely affect learners based on their past attainment, behaviour or perceived lack of chance for success. The group felt that the sector should not engage with the technical and logistical aspects of learning analytics without first making explicit the legal and ethical issues and understanding our obligations towards students.
Sensors are cheap and abundant. They’re already in our devices, and soon enough, many of us may elect to carry sensors in and on our bodies, and embed them in our homes, offices, and cities. This terrifies people, Jason Silva says in a new video.
Who hasn’t heard of Big Brother or feared the rise of the surveillance state? But Silva says there’s an upside.
As the world is reduced to “algorithmic cascades of data” he thinks we’ll get what Steven Johnson calls the “long view,” like a microscope or telescope for previously invisible information and datasets.
Billions of sensors measuring location, motion, orientation, pressure, temperature, vital signs and more—each of these will be like a pixel. Seen up close, a modestly flashing primary color. But at a distance, individual pixels dissolve. Discrete points will smooth out into a contiguous image no one could have guessed by looking at each pixel alone.
Understanding how an educational intervention is implemented is essential to evaluating its effectiveness. With the increased use of digital tools in classrooms, however, traditional methods of measuring implementation fall short. Fortunately, new approaches are emerging.
“Using data to drive learning outcomes isn’t a new concept, really. For as long as teachers have been giving students assessments, the assessments and results have been used by both students and teachers (even if only loosely) to determine how to move forward. What needs to be reviewed more? What was covered and studied well? Learning analytics takes this concept and kicks it up a notch. Well, more like a thousand notches, especially if you’re considering things like adaptive computer-based testing that changes as students use it.”
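The feedback loop behind adaptive testing can be illustrated with a minimal sketch: the next item's difficulty rises after a correct answer and falls after an incorrect one. This is an illustrative assumption, not any particular vendor's engine; real computerized adaptive tests typically estimate ability statistically (e.g. with item response theory) rather than stepping difficulty up and down.

```python
# Minimal sketch of an adaptive-testing feedback loop (illustrative only):
# difficulty increases after a correct response and decreases otherwise,
# clamped to a fixed range.

def next_difficulty(current, was_correct, step=1, lo=1, hi=10):
    """Raise difficulty after a correct response, lower it after an
    incorrect one, keeping the result within [lo, hi]."""
    proposed = current + step if was_correct else current - step
    return max(lo, min(hi, proposed))

def run_adaptive_session(responses, start=5):
    """Replay a sequence of correct/incorrect responses and return the
    difficulty level before and after each item."""
    difficulty = start
    trace = [difficulty]
    for was_correct in responses:
        difficulty = next_difficulty(difficulty, was_correct)
        trace.append(difficulty)
    return trace
```

For example, `run_adaptive_session([True, True, False])` returns `[5, 6, 7, 6]`: two correct answers push difficulty up, one miss pulls it back down.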
Adaptive learning technologies may pave the way to a "pedagogical renaissance." Although still in the early stages, these technologies have already shown some of their potential, as fewer students are failing or dropping out of classes that use these programs.
I believe the move to large-scale adoption of learning analytics, with the attendant rise in institution-level decisions, should motivate us to spend some time thinking about how concepts such as validity and reliability apply in this practical setting. The motivation is twofold: large-scale adoption has "upped the stakes", and non-experts are now involved in decision-making. This article is a brief look at some of the issues with where we are now, and some of the potential pitfalls going forward.
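Reliability, one of the concepts mentioned above, has concrete estimators. A common one for assessment data is Cronbach's alpha, sketched here from first principles; the scores are hypothetical, and the article does not prescribe this particular statistic.

```python
# Sketch of Cronbach's alpha, an internal-consistency (reliability)
# estimate for a set of assessment items:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))

def cronbach_alpha(item_scores):
    """item_scores: one inner list per item, aligned by student,
    e.g. [[item1 scores...], [item2 scores...], ...]."""
    k = len(item_scores)           # number of items
    n = len(item_scores[0])        # number of students

    def variance(xs):
        # Sample variance (n - 1 denominator).
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

Perfectly consistent items yield alpha of 1.0; values conventionally below about 0.7 (a rule of thumb, not a law) would suggest the metric is too noisy to base tutor interventions on.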
The search for data that impacts student success may lead to more questions than answers.
When we first sketched out a topic for this month's feature on the quest for data that impacts student success, we had a slightly different story in mind. We wanted to come up with a handful of data types that any higher ed institution pursuing learning analytics should have on its list — say, five key data points that really impact student outcomes.
New analytics tools from uClass aim to give users a better picture of what instructional materials their teachers are using—and whether these resources are effective. Learning analytics have become a key feature within many school software programs. These tools can help educators understand trends and patterns in student learning, helping them target their instruction more effectively to improve achievement.
Most of these tools focus on analyzing student performance—but what if educators had tools that could measure the effectiveness of the instructional resources they’re using as well?
That’s the idea behind new analytics tools developed by an ed-tech company called uClass.
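The simplest form of the resource-effectiveness analysis described above is to group assessment scores by the instructional resource each student used and compare group means. The record shape and data below are hypothetical assumptions for illustration; they do not describe uClass's actual implementation.

```python
# Sketch: compare mean assessment scores across instructional resources.
# Records are hypothetical (resource_name, score) pairs, one per student.

from collections import defaultdict

def mean_score_by_resource(records):
    """records: iterable of (resource_name, score) pairs.
    Returns {resource_name: mean score} for a rough effectiveness
    comparison (no controls for prior ability or selection effects)."""
    buckets = defaultdict(list)
    for resource, score in records:
        buckets[resource].append(score)
    return {r: sum(s) / len(s) for r, s in buckets.items()}
```

For example, `mean_score_by_resource([("video", 80), ("video", 90), ("worksheet", 70)])` returns `{"video": 85.0, "worksheet": 70.0}`. A real analysis would also need to account for which students chose which resource, since raw mean differences conflate resource quality with student selection.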