Scalable learning analytics and feedback tools for large undergraduate classrooms
Org: Lindsey Daniels (University of British Columbia) [PDF]
- ADEN CHAN, University of British Columbia
A framework for utilizing online grading software to deliver efficient assessment and feedback to students [PDF]
In large undergraduate classrooms, efficient assessment and feedback mechanisms are critical for supporting diverse student needs and maintaining instructional quality. However, collecting and analyzing such assessment data often requires substantial effort and insight into established data collection methods. This presentation will explore the use of home-grown, open-source software that facilitates on-paper assessments while enabling online marking, streamlining the marking process and generating detailed learning analytics. The software automatically tracks key metrics, including individual and aggregate student grades by question, marking times, question discrimination, and question difficulty, providing valuable insights into student performance, question efficacy, and marking practices. Specifically, we will explore the use of the software in analyzing data for a second-year engineering course, and discuss the scalability of the methodology to large enrolment courses.
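The abstract's software is home-grown, but the two item statistics it names are standard classical test theory quantities. As a hedged illustration only (this is not the talk's code; the function names and sample data below are invented), they can be computed as follows:

```python
# A sketch of the two classical test theory statistics the abstract
# names: item difficulty and item discrimination.
import numpy as np

def item_difficulty(item_scores: np.ndarray, max_points: float) -> float:
    """Difficulty index: mean score on the question as a fraction of full
    marks. Higher values indicate an easier question."""
    return float(item_scores.mean() / max_points)

def item_discrimination(item_scores: np.ndarray, total_scores: np.ndarray) -> float:
    """Discrimination index: correlation between a question's scores and the
    rest of the assessment. Values near zero (or negative) suggest the
    question does not separate stronger from weaker students."""
    rest = total_scores - item_scores  # exclude the item from the total
    return float(np.corrcoef(item_scores, rest)[0, 1])

# Invented example: six students, one 4-point question on a 40-point exam.
item = np.array([4.0, 3.0, 4.0, 1.0, 0.0, 2.0])
total = np.array([38.0, 30.0, 35.0, 18.0, 12.0, 22.0])
print(item_difficulty(item, 4.0))        # ~0.58
print(item_discrimination(item, total))  # strongly positive (~0.99)
```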
- MATT COLES AND KELLY PATON, University of British Columbia
Student experience of group work in a large first-year calculus course: measuring, facilitating, improving [PDF]
We describe in-class group work and group homework assignments in a large (~4000-student) first-year course spanning both Calc 1 and 2. The scale of the course imposes constraints and challenges, both in delivering the course and in gathering data. We will explore how we frame, guide, and try to capture the student experience, and share what students have to say. Now in the third iteration of the course, we discuss improvements and look to future directions.
- LINDSEY DANIELS, The University of British Columbia
Utilizing text analytics, data visualizations, and regression to inform teaching and feedback in large enrollment courses [PDF]
Diagnostic tools are often used to gauge student mastery of prerequisite skills and preparedness for a given course. These tools typically take the form of a multiple-choice assessment, where information can be gleaned from both correct and incorrect choices. However, they do not capture more nuanced information about students' thought processes or the mathematical tools activated in their solutions. At the same time, large enrollment courses are highly heterogeneous: students arrive with a variety of backgrounds, skill sets, and motivations, and providing individualized, action-oriented feedback is challenging. In this project, we propose a framework for a diagnostic tool designed to provide nuanced information about a student cohort's preparedness in a scalable way that can be leveraged to inform both teaching and student feedback.
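The abstract does not specify the framework's implementation. Purely as a hypothetical sketch of how the text analytics and regression named in the title might fit together (the responses, grades, and model choices below are invented, not taken from the project), one could vectorize free-text diagnostic responses and regress a later course outcome on them:

```python
# A hypothetical text-analytics-plus-regression pipeline; all data and
# modeling choices here are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Free-text diagnostic responses paired with a later course outcome.
responses = [
    "factored the quadratic and applied the zero product property",
    "guessed and checked values until one worked",
    "used the quadratic formula with the correct discriminant",
]
final_grades = [88.0, 61.0, 92.0]

# Turn the text into TF-IDF features, then fit a regularized linear model.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(responses, final_grades)

# Predict an outcome for a new student's diagnostic response.
print(model.predict(["applied the quadratic formula correctly"]))
```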
- KENNETH G. MONKS, University of Scranton
Proof Verification with Lurch [PDF]
Would your students benefit from an easy-to-use, open-source, web-based word processor that could check their assigned mathematical proofs? In this talk we introduce Lurch, our software project designed specifically for this purpose. We will explain how you can use this software and its accompanying course materials, and customize it for your own purposes. While existing proof verification tools like Lean, Isabelle, Coq, and Mizar are powerful and effective, they often come with steep learning curves of their own and can be difficult to customize. We will explain how the custom Lurch validation algorithm overcomes these challenges, and pose some questions for future work. Additional information is available at lurch.plus.
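For readers unfamiliar with the systems Lurch is compared against, here is a toy example of a machine-checked proof, written in Lean 4 rather than Lurch's own notation (the theorem name is invented for illustration):

```lean
-- A minimal machine-checked proof in Lean 4: commutativity of natural
-- number addition, discharged by the standard library lemma Nat.add_comm.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```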