NUS MOOC Corpus: Crowdsourcing annotations to study instructor intervention

Abstract

Massive Open Online Courses (MOOCs) scale up class size by eliminating synchronous teaching and the need for students and instructors to be co-located. Yet the very characteristics that enable MOOCs to scale also pose significant challenges to teaching them. In particular, the high student-to-instructor ratio in MOOCs restricts instructors' ability to facilitate student learning by intervening in discussion forums, as they do in face-to-face classrooms; the resulting lack of interaction and feelings of isolation have been cited as reasons why enrolled students drop out of MOOCs.

Using a large sample of forum data culled from Coursera MOOCs, we propose to annotate the corpus of instructor-intervened threads using crowdsourced human annotators (i.e., global workers from Amazon Mechanical Turk) and annotators recruited on-site. Annotating a large corpus of discussion forum content will enable supervised machine learning to automatically identify interventions that promote student learning. Such machine learning models may allow the building of dashboards that automatically prompt instructors on when and how to intervene in discussion forums. In the longer term, it may be possible to automate these interventions, relieving instructors of this effort. Such automated approaches are essential for allowing good pedagogical practices to scale in the context of MOOC discussion forums.
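To make the intended use of the annotations concrete, below is a minimal sketch of the kind of supervised classifier such a corpus would support: predicting whether a thread warrants instructor intervention from its text. This is an illustration only; the file annotated_threads.csv and its thread_text and intervened columns are hypothetical placeholders, not part of the released corpus, and the TF-IDF plus logistic-regression baseline stands in for whatever model is ultimately used.

    # Sketch: predict instructor intervention from thread text,
    # trained on hypothetical crowdsourced annotations.
    import csv

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Hypothetical annotated corpus: one row per thread, with the
    # concatenated post text and a binary crowdsourced label
    # (1 = instructor intervened, 0 = no intervention).
    threads, labels = [], []
    with open("annotated_threads.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            threads.append(row["thread_text"])
            labels.append(int(row["intervened"]))

    X_train, X_test, y_train, y_test = train_test_split(
        threads, labels, test_size=0.2, random_state=42, stratify=labels
    )

    # Bag-of-words baseline: TF-IDF features feeding a linear classifier.
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),
        LogisticRegression(max_iter=1000),
    )
    model.fit(X_train, y_train)

    print(classification_report(y_test, model.predict(X_test)))

A model of this shape could back the envisioned dashboard by scoring unanswered threads and surfacing those with the highest predicted need for intervention.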

MOOC Wikification is a spinoff from this project with similar objectives.

Publications


Members

Resources

Meeting Minutes