This post was spurred by my attendance at the ELI Focus Session on Learner Analytics. Below are some takeaways and thoughts on how this impacts our research.
I strongly believe that we need to evaluate technological implementations and conduct research to guide decision making on our campus. We should not spend resources (time, money, etc.) diffusing technologies that lack evidence of effectiveness. If we want to call what we have been doing for years "learner analytics" and bring attention to it, great! It seems that we are still defining learner analytics, but I am not yet buying the new terminology or its separation from the research that has been done on campus for years.
As George Siemens mentioned in his talk, learner analytics are not new (thank you, George!). We just usually call them something else. However, there are new components. See http://www.slideshare.net/gsiemens for George's slides.
For more on what learner analytics are and why we should use them, read George's piece in EDUCAUSE Review (full citation below).
I am more interested in learning what is inside the black box than in looking at input variables (demographics, GPA, etc.) and output variables (grades, retention, performance, success, achievement) in isolation, unless of course those findings draw attention to courses that can provide further explanation. Researchers in the social sciences, and in communication specifically, have noted for years the problems with models that do not examine process variables.
The black box is the social interaction that takes place within a course: the process variables that ultimately impact the outcome variables (learning, performance, grades, success, achievement, satisfaction, etc.). The evidence we collect usually needs to include the perspectives of students and faculty if we are to understand what is in the black box. However, learner analytics surrounding LMS use can help us understand social behavior, or at least interactions with technology, up to a point.
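To make the input-process-output distinction concrete, here is a minimal sketch in Python. All of the data is simulated, and the variable names (GPA as an input, forum posts as a process variable, final grade as an outcome) are purely illustrative; the point is just that adding a process variable can explain outcome variance that inputs alone miss.

```python
# Illustrative sketch only: simulated data, not real student records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
gpa = rng.normal(3.0, 0.5, n)                    # input variable
forum_posts = 5 + 2 * gpa + rng.normal(0, 2, n)  # process variable: interaction within the course
final_grade = 60 + 4 * gpa + 1.5 * forum_posts + rng.normal(0, 5, n)  # outcome variable

# Input-output model: ignores what happens inside the course.
io_model = sm.OLS(final_grade, sm.add_constant(np.column_stack([gpa]))).fit()

# Input-process-output model: adds the "black box" process variable.
ipo_model = sm.OLS(final_grade, sm.add_constant(np.column_stack([gpa, forum_posts]))).fit()

print(f"Input-only R^2:      {io_model.rsquared:.3f}")
print(f"Input + process R^2: {ipo_model.rsquared:.3f}")
```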
I appreciate the national and institutional discussions surrounding learner analytics because they not only potentially help us understand the black box, but also highlight the importance of research on the use of technology and its impact on students. This provides us with the evidence needed to support our pedagogical practices and faculty development recommendations.
"learning is a complex social activity (technical methods do not capture the full scope of nuanced nature of learning)"
George proposed a model to help us consider our strategies for learning initiatives.
This model applied to my unit:
For those looking for a network analysis tool, check out snappvis.org!
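Tools in this space typically visualize who replies to whom in course discussion forums. Here is a rough sketch of that kind of analysis in Python with networkx; the reply data is made up, and this only illustrates the general idea, not any particular tool's implementation.

```python
# Sketch of a forum-reply network; the reply data is invented for illustration.
import networkx as nx

# Each tuple is (replier, original_poster) from a hypothetical discussion forum.
replies = [
    ("alice", "bob"), ("carol", "bob"), ("bob", "alice"),
    ("dave", "alice"), ("carol", "alice"), ("erin", "carol"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# In-degree centrality: who draws the most replies (potential discussion hubs).
centrality = nx.in_degree_centrality(G)
for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")

# Students no one replies to may be peripheral and warrant outreach.
no_replies = [s for s in G.nodes if G.in_degree(s) == 0]
print("Never replied to:", no_replies)
```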
In listening to the presentations, we were looking for guidance on what sorts of questions we could answer by building a data warehouse for our LMS, Desire2Learn (D2L), and acquiring an analytics tool. We are attempting to identify our needs for the data warehouse as well as our needs in a learner analytics tool, and possibly even a student success tool (loosely defined).
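As one concrete example of the kind of question a warehouse could answer, here is a small sketch against an invented event table. The schema (an lms_events table with student_id, course_id, event_type, event_time) is entirely hypothetical and is not the actual D2L data model.

```python
# Hypothetical LMS event table; the schema and rows are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE lms_events (
        student_id TEXT, course_id TEXT, event_type TEXT, event_time TEXT
    )
""")
conn.executemany(
    "INSERT INTO lms_events VALUES (?, ?, ?, ?)",
    [
        ("s1", "BIO101", "login",      "2012-02-01"),
        ("s1", "BIO101", "forum_post", "2012-02-01"),
        ("s2", "BIO101", "login",      "2012-02-02"),
        ("s1", "BIO101", "login",      "2012-02-03"),
    ],
)

# Example question: how active is each student in each course?
for row in conn.execute("""
    SELECT student_id, course_id, COUNT(*) AS events
    FROM lms_events
    GROUP BY student_id, course_id
    ORDER BY events DESC
"""):
    print(row)
```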
In listening to the presentations from Roush, Little, Fritz, and Macfadyen, we started developing the following ideas:
Also, we found Macfadyen's presentation and published research very useful in identifying LMS areas of interest for learner analytics.
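The Macfadyen and Dawson paper cited below built a proof-of-concept "early warning system" from LMS tracking variables such as discussion posts, mail messages, and completed assessments. Here is a toy sketch, loosely in that spirit, using simulated activity counts and a logistic regression; none of the numbers or coefficients come from the actual study.

```python
# Toy early-warning sketch; all data below is simulated, not from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300
discussion_posts = rng.poisson(10, n)
mail_messages = rng.poisson(5, n)
assessments_done = rng.integers(0, 8, n)

# Simulated "at risk" label: lower activity makes risk more likely.
risk_score = 3 - 0.2 * discussion_posts - 0.1 * mail_messages - 0.4 * assessments_done
at_risk = (rng.random(n) < 1 / (1 + np.exp(-risk_score))).astype(int)

X = np.column_stack([discussion_posts, mail_messages, assessments_done])
model = LogisticRegression().fit(X, at_risk)

# Flag students whose predicted probability of being at risk exceeds 0.5.
flagged = model.predict_proba(X)[:, 1] > 0.5
print(f"Flagged {flagged.sum()} of {n} students for early outreach.")
```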
Open Learning Analytics whitepaper by @gsiemens and SoLAR (the Society for Learning Analytics Research)
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5). George Siemens is with the Technology Enhanced Knowledge Research Institute at Athabasca University; Phil Long is a Professor in the Schools of ITEE and Psychology and Director of the Centre for Educational Innovation & Technology at the University of Queensland.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588–599.