Trip Report: Thoughts Coming Out of Learning Analytics 2012

My thoughts coming out of Learning Analytics 2012 in Vancouver, BC. (Hint: They’re very similar to my thoughts going into the conference.)

Brandon’s Introduction

Use of data by academic institutions is not new. Institutions have done it for years in the admissions process, student services, libraries, and so on. Some of the “newer” interpretations focus more on data about student learning and student activity for the purpose of informing curriculum and pedagogy. There is also a resurgence in using data to inform administrative and academic services at universities. A few terms for this float around the education space: “learning analytics”, “educational data mining”, “academic analytics”, and, in the world at large, “big data”.

Going into Learning Analytics 2012, my experience to date with “learning analytics” had been one of:

  • Lack of a coherent definition. While lurking in the Learning Analytics and Knowledge 2012 MOOC (http://lak12.mooc.ca/), following the readings, and chatting with a colleague, I found that much of the “course” was a survey of the different ways people are thinking about the use of data in education. In the end I view the names simply as labels.
  • “Religious wars”: strong opinions about the language, activity, and value around the application of data, whether that data is applied to improve student learning or to provision services. At ELI 2012 there was a very strong tension between the fear of “learning analytics” driving academia toward behaviorism (if student x does y, then we know or do z) and a more nuanced interpretation of what is possible: building on what’s easy to gather without treating it as the be-all and end-all.

Coming out of Learning Analytics 2012, my opinion is still one of… “it depends”. It depends on what data one collects, why one collects it, and how one analyzes and acts on the results. While it’s useful to see examples of what others are doing with data in this “newer” interpretation, I think institutions and faculty ultimately need to “get their hands dirty”: talk about what problems they face, what they think data might help them understand, and implement experiments to explore those questions. (If this sounds similar to the things we’ve heard about business intelligence, data warehousing, and so on, that’s good. I think they’re basically the same thing.)

Some things are clear (and important) to me. We need:

  • Conversations to discuss which questions we’re interested in,
  • Systems that will allow us to extract data, and
  • Individuals to analyze and interpret the data.

Also, it’s important to me that we be prepared to share the data (with acknowledgement of privacy issues) and not lock it up; openness and transparency are important and play a role in the “ethics” discussion around learning analytics. I think many institutions run the risk of giving away the rest of the kingdom to LMS vendors and publishers if they cede the management and analysis of data about their students to those companies (a thought that resonated with a number of others at the conference).

Notes on Sessions

Wellman Keynote Summary

People function more as networked individuals; families function as networks; networks have grown, not shrunk; more Internet use is correlated with more interpersonal contact; more work is done at home; networks are sparsely knit and fragmented.

Some questions he raised:

  • How do online and in-the-flesh meetings integrate?
  • Is reciprocity always tit for tat? If I help you with x, you’ll help me with x’ not y
  • How do people navigate multiple networks?

Sheila MacNeill from JISC/CETIS Suggests: http://blog.questionmark.com/use-a-survey-with-feedback-to-aid-student-retention

“Self-assessment surveys to help increase student retention. Students answer questions about aspects of their studying and receive feedback to help them improve. These assessments provide useful data that the University has used to improve student retention…

Some key reasons why these self-assessment surveys seem successful:

  • They take only 10 minutes or so to complete
  • They are well promoted
  • They provide short actionable feedback
  • They are genuinely anonymous
  • They remain similar year on year, making them easy to maintain and useful for identifying trends
  • They focus on specific issues the University has learned are important for retention and student success”

Dan Suthers

  • Levels of learning in sociotechnical networks
  • Agency: who or what is the agent that learns (individuals, small groups, networks)?
  • Epistemologies: what is the process of learning?
  • Process: build an abstract transcript that pulls together a logfile; replace the logfile with people and actions (r1, r2, r3, w1, w2, w3, as in reading and writing); identify relationships between events (contingencies, replies, semantic/lexical similarity, etc.) that lead to interactions; create graphs and associations; and see if one can interpret those as communities (see the sketch below).
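
A minimal sketch of that pipeline on a toy event log; the event format, the contingency rule (reads and replies), and the use of networkx are my assumptions, not Suthers’ implementation.

```python
# Toy version of the pipeline: abstract transcript -> event graph ->
# actor graph -> candidate communities. All data here is invented.
import networkx as nx

# Abstract transcript: (event_id, actor, action, contingent_on)
events = [
    ("w1", "alice", "write", None),
    ("r1", "bob",   "read",  "w1"),
    ("w2", "bob",   "write", "w1"),   # reply: contingent on w1
    ("r2", "carol", "read",  "w2"),
    ("w3", "carol", "write", "w2"),
]

# Event graph: an edge wherever one event is contingent on another.
g = nx.Graph()
for eid, actor, action, target in events:
    g.add_node(eid, actor=actor)
    if target is not None:
        g.add_edge(eid, target)

# Project onto actors: two people interact if their events are linked.
actors = nx.Graph()
for a, b in g.edges():
    pa, pb = g.nodes[a]["actor"], g.nodes[b]["actor"]
    if pa != pb:
        actors.add_edge(pa, pb)

# Interpret connected clusters of actors as candidate communities.
for community in nx.connected_components(actors):
    print(sorted(community))
```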

Jim Gaston and Bob Bramucci/ Sherpa (via Tweet)

  • At the South Orange County Community College District in California
  • Intervene at just the right time with “nudges” // timing of intervention is SO impt and hard to get at #Lak12
  • When a student tries to register for a full class, SHERPA tells them their other options for the same course. #LAK12
  • Point of nudges is to be relevant and timely – make the most sense to e’one involved #Lak12
  • #Sherpa’s map for how data is handled. #lak12 http://pic.twitter.com/IsngBsQ2
  • SHERPA can deliver nudges via email, text, and phone. students are hired as part of the design team. #LAK12
  • Students can rate the usefulness of the nudges they receive from the Sherpa system. #lak12
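
To make the “full class” nudge from the tweets above concrete, here is a toy illustration; the data model and function names are invented, not SHERPA’s actual design.

```python
# Hypothetical illustration of the "full class" nudge: when a section
# is full, suggest open sections of the same course. In SHERPA the
# message would go out via email, text, or phone, and students could
# rate its usefulness; none of that plumbing is modeled here.
from dataclasses import dataclass

@dataclass
class Section:
    course: str
    number: str
    seats_left: int

CATALOG = [
    Section("MATH101", "001", 0),
    Section("MATH101", "002", 5),
    Section("MATH101", "003", 2),
]

def full_class_nudge(wanted):
    if wanted.seats_left > 0:
        return None  # registration succeeds; no nudge needed
    open_sections = [s.number for s in CATALOG
                     if s.course == wanted.course and s.seats_left > 0]
    return (f"{wanted.course} section {wanted.number} is full; "
            f"open sections: {', '.join(open_sections)}")

print(full_class_nudge(CATALOG[0]))
```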

For MIT (and based on what we’ve seen with iCampus projects) we should consider how we can use analytics (probably “academic analytics”) to improve students’ planning of their class schedules: linking through social networks, access to OCW, links to degree requirements, and so on.

Bieke Schreurs and Maarten de Laat/Network Awareness Tool

  • Research on professional development of teachers
  • Looking at informal workplace learning (related to professional development of teachers)
  • Could their work be deployed for students on top of Piazza or other course discussion tools?

Rong Yang/Cyberlearners and Learning Resources

  • Notion of nearest neighbors and use of resources (me: could be used to identify experts, as they implemented it; rough sketch below)
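
A rough sketch of the nearest-neighbor idea over resource-use vectors; the data, the cosine measure, and the “expert” interpretation are my own illustration, not their implementation.

```python
# Learners who access similar resources are "neighbors"; heavy users
# among a learner's neighbors are candidate experts. Data is invented.
import math

# learner -> counts of accesses per resource id
usage = {
    "ana": {"r1": 9, "r2": 4, "r3": 0},
    "ben": {"r1": 8, "r2": 5, "r3": 1},
    "cam": {"r1": 0, "r2": 1, "r3": 7},
}

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def nearest_neighbors(who, k=2):
    """Learners whose resource use most resembles `who`'s."""
    scores = [(cosine(usage[who], v), name)
              for name, v in usage.items() if name != who]
    return sorted(scores, reverse=True)[:k]

print(nearest_neighbors("ana"))  # ben should rank first
```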

Simon Buckingham Shum and Ruth Deakin Crick/Learning Dispositions

  • What does it mean to validate analytics?
  • While most folks are talking about outcomes and trajectories toward those outcomes, Simon posits that 21st-century learners are being asked to learn without knowing the outcome: to take a learning journey into the unknown (through which they construct their knowledge)
  • How do we support learning dispositions for intentional learners, or student-led inquiry as the norm?
  • Qualities of most effective learners: Resilience, strategic awareness, critical curiosity, creativity, meaning making, changing and learning, learning relationships
  • Combine data streams with data sets (a self-reflective view of your qualities) to enable system simulations, recommendation engines, and collective intelligence
  • ELLI WordPress plugin: learners can tag posts with these qualities to describe themselves; used as part of learning blogging

Don Norris and Linda Baer Panel/Building Organizational Capacity in Analytics

  • “Optimizing student success in and outside the classroom”
  • Proven means for optimizing student success
    • Managing the student pipeline, at-risk students
    • Eliminating impediments to success
    • Dynamic query, alert and intervention, at-risk behavior
    • Learner relationship management systems and process
    • Personalized learning systems and environments
    • Data mining/big data
    • Extend success to include employability/work
    • Preliminary Insights (from their research funded by Gates, leading to Toolkit for Organizational Change)
      • Some institutions are achieving strong ROI on student success analytics (the “killer app”)
      • Case studies on successful execution and capacity building, although different patterns for different types of institutions
      • Leadership commitment is critical to move beyond departmental solutions and to changing behaviors
      • Examples of all levels of optimizing student success techniques
      • Wide variety of buy/build/mashup strategies
      • Emphasis on analytics: new functionalities, applications, solutions
      • Expanded student success solutions from BI/ERP/LMS
      • New vendors in many categories
      • Cloud-based applications demonstrate the cloud’s potential to leverage vendor infrastructure
      • Strong marketplace incentives for continuing vendor innovations
      • Analytics Capacity Gap
        • Talent gap
        • Need for professional development, including for “free-range learners”
        • Need for continuing enhancements in tools, applications, solutions and services
        • Need for consulting
        • Vendor perspective (D2L): Which institutions will succeed with analytics?
          • Technology: Capacity of IT environment to deal with complexity
          • Culture: Viewing analytics as leadership
          • Business: For-profits get it, non-profits are still waffling
          • Facilitate staff on campus to innovate, starting with just getting access to the data
          • For-profits: Embedding predictive analytics in their day-to-day activities, and understanding what to do with it

Stephanie D. Teasley and Stephen Lonn/Analytics in the Hands of Academic Advisors

  • Criteria for use of their analytics tool (not rocket science, but nice of them to explain it; see the sketch after this list)
    • Absolute grade threshold
    • Difference from course average
    • Presence cutoff
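
A hedged sketch of those three criteria as a flagging rule; the thresholds, argument names, and the login-based presence measure are assumptions for illustration, not their tool’s logic.

```python
def flag_student(grade, course_avg, logins_last_two_weeks,
                 grade_floor=73.0, avg_gap=10.0, presence_min=3):
    """Return the reasons (if any) this student should be flagged."""
    reasons = []
    # Absolute grade threshold
    if grade < grade_floor:
        reasons.append("below absolute grade threshold")
    # Difference from course average
    if course_avg - grade > avg_gap:
        reasons.append("well below course average")
    # Presence cutoff (recent LMS logins, an assumed proxy)
    if logins_last_two_weeks < presence_min:
        reasons.append("low presence")
    return reasons

print(flag_student(grade=68, course_avg=82, logins_last_two_weeks=1))
```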

David Garcia-Solorzano/Educational Data Monitoring Based on Faceted Browsing

  • By “faceted browsing” they essentially mean a visualization of the educational data
  • Complex (and colorful) visualizations of data at the UOC (Universitat Oberta de Catalunya), part of his Ph.D. dissertation

Rebecca Ferguson/E-Mentoring in SocialLearn

  • Outcomes: Multiple overlapping relationships with respect to e-mentoring; it can be effective without face-to-face contact
  • Key data: Connections with people, to content, and through content

Johann Ari Larusson and Brandon White/Point of Originality

  • Reads left to right with two inputs: writing samples and manually entered key query terms (core concepts within the course); uses WordNet to determine lexical relationships
  • Take the core concept and the synset terms for that concept, then match against the same term relations in the writing sample (see the sketch after this list)
  • Two patterns emerged: students with the highest average originality in the lead-in period, and those with an extremely high burst of originality
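
A rough sketch of the matching step as I understood it: expand each core concept into its WordNet synset terms, then count how often a writing sample touches them. This is not the authors’ code; it assumes nltk with the WordNet corpus downloaded (nltk.download("wordnet")).

```python
from nltk.corpus import wordnet as wn

def expand(concept):
    """Core concept -> related lemma names from its WordNet synsets."""
    terms = {concept.lower()}
    for synset in wn.synsets(concept):
        for lemma in synset.lemma_names():
            terms.add(lemma.lower().replace("_", " "))
    return terms

def concept_hits(core_concepts, writing_sample):
    """Count how often the sample touches each concept's term set."""
    words = writing_sample.lower().split()
    return {c: sum(1 for w in words if w in expand(c))
            for c in core_concepts}

sample = "The experiment tests whether memory and recall decay over time"
print(concept_hits(["memory", "experiment"], sample))
```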

Does Length of Time Off-Task Matter?

  • Literature: findings range from a direct correlation between off-task behavior and poorer learning, to evidence that short breaks may improve cognition while longer breaks are worse
  • Result: it is the act of BEING off task that negatively affects learning; the length of the break did not seem to matter