Learning Analytics

A field I’ve enjoyed exploring in the MEITE program has been Learning Analytics, taught by Dr. Matt Bernacki. Before starting the course, I chose to write my final paper in Learning Sciences on Learning Analytics and Educational Data Mining, which you can check out below. After an integrative review exploring the themes connecting LA/EDM with formative assessment, learning progressions, and scaffolding, I discuss practical implications and walk through two examples of adaptive online assessment items that I built in two learning environments (Canvas LMS Mastery Paths and Qualtrics if-then logic).

Final Project: Learning Analytics Solution Product & Demo

In the video below, I present and demonstrate my proposed learning analytics product: an adaptive checklist and rubric that can be used in LMS discussion forums to prompt learners to self-assess, ultimately promoting self-regulated learning and self-efficacy. The proposal draws on Panadero et al.’s (2017) meta-analysis of the effects of self-assessment on self-efficacy and self-regulated learning, with gender considered as a moderator.

In the documentation below, you can see the full written proposal.

Guiding Questions and Discussion Themes

We delved into themes around big data, privacy, institutional and structural discrimination, the limitations of data analysis, and the implementation of Learning Analytics design in educational ecosystems. Below are just a few guiding questions I enjoyed intellectually exploring with the class.

“Which domains of the Knowledge-Learning-Instruction (KLI) framework might be feasible to implement, especially considering all the conditions and nuances mentioned in Koedinger et al.’s (2012) work? Beyond its contributions to research settings, how can this framework be operationalized in real-world instructional design?”

“Are instructors who are capable of using data and learning analytics responsible for informing learners ahead of time that they will use this data, and that the data might inform their perception of the student’s work? How could knowing this affect learners’ behaviors in a closed environment?”

“In instances where there is no student self-reported data to help confirm a learner’s elusive intention, what are some valid concerns about an instructor’s biases regarding what certain log data patterns could mean?”

“While cognitive tutoring systems offer opportunities to see these patterns, what are other online learning systems where we could similarly gather and use data to understand learner behaviors, both the expected and the unexpected? Also, how does this learner behavior-centered approach change the way we might design, again to welcome the expected and unexpected?”

“Yes, data analytics can help or harm, but I think these effects are in the shadow of institutional barriers. Moreover, we’ve talked about how learning analytics works well with ‘well-defined’ domains and not ‘ill-defined’ ones. But if we continue to see development in learning analytics only in these well-defined domains (which seem to coincide with the ‘challenging majors’), how are we adding to the gap? How can we leverage data analytics not just to think about how to divert a student’s trajectory, but to help support students no matter the trajectory they choose? How can we ensure that educators are optimally using data within their domain instead of perpetuating the notion that data and learning analytics just might ‘work better’ in some domains than others?”

Learning Takeaways and Products

Presentation skills: This course refined my presentation skills. Where I had previously focused my energy on preparing scripts, this course taught me to focus more on the learner’s cognitive load. In particular, we used Richard Mayer’s multimedia learning principles to consider how we regulate the auditory and visual channels to effectively communicate new information to the learner. While it can be tempting to juice up presentations with Canva and other accessible graphic design platforms, optimizing animations in PowerPoint, minimizing text and images, and aligning visuals with narration can more powerfully reduce extraneous cognitive load.

Video & Design Collaboration: Because ours was the largest MEITE cohort, we presented projects in groups rather than individually. This also meant learning to collaborate in a hybrid environment on products that needed to be cohesive, such as videos. Remote working environments have changed the way we need to present: body language, gestures, and audience feedback such as laughter no longer buffer a weak presentation. As such, we were challenged not merely to present but to produce quick-turnaround video demonstrations of our products of interest and designs.

Promoting Effective Help-seeking & Hints in CMU’s CS Academy

In this video, my colleagues and I focus on how to promote effective help-seeking behavior by structuring trace log data to capture learners’ minimal code changes (defined in the video), the time between lines of code written, and potential guess-and-check habits.

In the video above, Emily Trabert (MEITE, Learning Engineer and DevCred Founder), another colleague, and I walk through Carnegie Mellon University’s Computer Science Academy “Learning to Code” activity, in which the learner writes lines of code in an IDE to re-create the Ebbinghaus illusion. By structuring the trace log data around minimal code changes, time between lines of code, and guess-and-check patterns, analysts can more readily identify and respond to learners who are “gaming” the platform by guessing and checking rather than learning to code with the resources and hints laid out for them.
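To make the idea concrete, here is a minimal, hypothetical sketch of the kind of rule an analyst might run over such trace logs. The event fields, thresholds, and the character-diff heuristic are all assumptions for illustration; they are not CS Academy’s actual logging schema, and our precise definition of a “minimal code change” lives in the video.

```python
# Hypothetical sketch: flagging potential guess-and-check behavior from IDE trace-log events.
# Field names, thresholds, and the diff heuristic are illustrative assumptions only.
from dataclasses import dataclass
from typing import List

@dataclass
class RunEvent:
    timestamp: float      # seconds since the session started
    code: str             # snapshot of the learner's code at each run attempt

def char_diff(a: str, b: str) -> int:
    """Rough size of the edit between two snapshots (a crude proxy for 'minimal code change')."""
    return abs(len(a) - len(b)) or sum(x != y for x, y in zip(a, b))

def flag_guess_and_check(events: List[RunEvent],
                         min_change_chars: int = 3,
                         max_gap_seconds: float = 20.0,
                         min_streak: int = 4) -> bool:
    """Flag a session if the learner repeatedly re-runs with tiny edits in rapid succession."""
    streak = 0
    for prev, curr in zip(events, events[1:]):
        small_edit = char_diff(prev.code, curr.code) <= min_change_chars
        quick_retry = (curr.timestamp - prev.timestamp) <= max_gap_seconds
        streak = streak + 1 if (small_edit and quick_retry) else 0
        if streak >= min_streak:
            return True
    return False

# Example: five near-identical runs a few seconds apart would be flagged for follow-up,
# e.g., surfacing a hint or prompting the learner to revisit the activity resources.
session = [RunEvent(t * 5.0, f"Circle(200, 200, {r})") for t, r in enumerate([10, 11, 12, 13, 14])]
print(flag_guess_and_check(session))  # True
```

In practice, the thresholds would need tuning against real learner data so that productive tinkering is not mistaken for gaming.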

In this video, we focus on Blackboard’s Discussion Analysis feature.

Rethinking Critical Thinking “Levels” in Blackboard’s Discussion Analysis

A second video, which I worked on with Sanji Datar (MEITE, Innovation Specialist), included a walk-through of Blackboard’s Discussion Analysis feature. In our conversation, we found it fascinating how often courses assign discussion board posts as pre-work, and how easy it is, from a student’s perspective, to write a post without really saying anything substantive. Reflecting on the criteria that Dr. Bernacki outlined to help us meet a certain standard of post quality, we developed a feature concept for Blackboard.

Blackboard includes a Discussion Analysis panel that helps rate the quality of a learner’s post, predominantly through lexical measures. Interestingly, it also reports a Critical Thinking “Level,” which we found problematic. Our response was to reframe the level as a point system: rather than assigning students a level that invites gamification, we suggested a three-point rubric that prioritizes reminding students of the criteria for a thoughtful discussion post. Within this design, students can also earn self-regulated learning (SRL) points for checking the rubric routinely and using it to check off criteria as they accomplish them. Finally, we suggested a simple data visualization to track the relationship between SRL and Critical Thinking points, with the expectation of a positive correlation between the two and a potential opportunity for intervention depending on the trend.
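As a rough illustration of the summary our proposed visualization would plot, the sketch below computes the relationship between SRL check-ins and Critical Thinking points. The point values and data are invented, and this does not use any actual Blackboard API.

```python
# Illustrative sketch only: invented per-student totals, not Blackboard's Discussion Analysis API.
# It summarizes the relationship our proposed visualization would plot.
from statistics import correlation  # requires Python 3.10+

# Hypothetical per-student totals over a semester of discussion posts.
srl_points = [2, 5, 1, 7, 4, 6, 3]                 # rubric check-ins logged per student
critical_thinking_points = [1, 3, 1, 3, 2, 3, 2]   # 0-3 rubric scores per student

r = correlation(srl_points, critical_thinking_points)
print(f"Pearson r between SRL and Critical Thinking points: {r:.2f}")

# A weak or negative r for a course section could signal an opportunity to intervene,
# e.g., revisiting how the rubric is introduced or how check-ins are prompted.
```

The expectation in our design is a positive correlation; the visualization’s value is in surfacing the sections or time periods where that expectation breaks down.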
