Learning in Entire Courses

Studying learning in entire courses lets us evaluate the effectiveness of one course against another, show whether reforms actually improved the (supposedly improved) course, provide evidence on the relative merits of remote and on-campus learning, and test the widespread prejudice that younger or less well prepared students learn less than the young, college-educated professionals who predominantly take MOOCs.  Our measurements showed that prejudice to be false.

Typical introductory physics courses – especially those using blended learning – assign students a multitude of tasks (textbook reading, online pre-lecture preparation, lectures, online homework, written homework, recitations, and quizzes and examinations).  We studied which course elements were most responsible for student learning, as a guide to what to emphasize, revise, or eliminate in future courses.

MoP09 What course elements correlate with improvement on tests in introductory Newtonian mechanics?  Elsa-Sofia Morote and David E. Pritchard, Am. J. Phys. 77, 746 (2009)

We found that myCyberTutor.com (the precursor to MasteringPhysics.com) was by far the most effective course element for improving scores on the MIT final, and it was joined by in-recitation group problems at the top for effectiveness on standard conceptual inventories.  Recitation attendance was uncorrelated with improvement.
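To picture the kind of analysis behind such findings, here is a minimal sketch (not the paper's actual statistical pipeline, and with entirely hypothetical element names and synthetic data): it ranks course elements by the correlation between per-student time on each element and an improvement measure.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_students = 200  # hypothetical class size

# Hypothetical hours spent on each course element (synthetic data).
elements = pd.DataFrame({
    "online_homework": rng.gamma(2.0, 5.0, n_students),
    "written_homework": rng.gamma(2.0, 4.0, n_students),
    "group_problems": rng.gamma(2.0, 3.0, n_students),
    "recitation_attendance": rng.integers(0, 12, n_students).astype(float),
})

# Hypothetical improvement measure (e.g., gain from pretest to final exam),
# constructed so that some elements matter and others do not.
improvement = (
    0.03 * elements["online_homework"]
    + 0.02 * elements["group_problems"]
    + rng.normal(0.0, 0.3, n_students)
)

# Rank elements by Pearson correlation with improvement.
print(elements.corrwith(improvement).sort_values(ascending=False))
```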

CCL14B Learning in an Introductory Physics MOOC: All Cohorts Learn Equally, Including an On-Campus Class, Kimberly F. Colvin, John Champaign, Alwina Liu, Qian Zhou, Colin Fredericks, and David E. Pritchard, Int. Rev. of Research in Open and Distance Learning, irrodl.org/index.php/irrodl/article/view/1902/3009

This research uses actual measurements of learning to dispel the widespread notion that MOOCs benefit mostly the young professionals who form a large segment of their students, at the expense of less well prepared students.
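As a minimal sketch of the kind of comparison involved (hypothetical cohorts and synthetic scores, not the paper's data or full analysis), one can compute each student's normalized gain, the fraction of possible improvement actually achieved, and average it within cohorts:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 300  # hypothetical number of students

df = pd.DataFrame({
    "cohort": rng.choice(
        ["little_preparation", "college_educated", "teacher"], size=n
    ),
    "pretest": rng.uniform(20, 70, n),  # percent correct before the course
})

# Synthetic post-test scores in which every cohort closes a similar fraction
# of the gap between its pretest score and the maximum, mirroring the
# "all cohorts learn equally" finding.
df["posttest"] = np.clip(
    df["pretest"] + 0.4 * (100 - df["pretest"]) + rng.normal(0, 5, n), 0, 100
)

# Normalized gain: (post - pre) / (max - pre).
df["normalized_gain"] = (df["posttest"] - df["pretest"]) / (100 - df["pretest"])

print(df.groupby("cohort")["normalized_gain"].mean())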

AZP15 Discovering the Pedagogical Resources that Assist Students in Answering Questions Correctly – A Machine Learning Approach, Giora Alexandron, Qian Zhou, and David E. Pritchard, Proceedings of Educational Data Mining 2015

We apply machine learning to determine which resources in a course help students answer the questions and problems in that course.  Ultimately this knowledge can allow us to recommend resources at each point of need, weed out or improve resources that don’t help, and order the material so helpful resources are covered before they’re needed.
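One simple way to frame this (a hypothetical sketch, not the model used in the paper) is to predict whether a student answers an item correctly from which resources they consulted beforehand, and then inspect which resources carry positive weight:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
resources = ["textbook_section", "prelecture_video", "worked_example", "wiki_page"]
n_attempts = 500

# X[i, j] = 1 if resource j was consulted before attempt i (synthetic data).
X = rng.integers(0, 2, size=(n_attempts, len(resources))).astype(float)

# Synthetic outcomes in which the worked example and video genuinely help.
log_odds = -0.5 + 1.2 * X[:, 2] + 0.4 * X[:, 1]
y = (rng.random(n_attempts) < 1.0 / (1.0 + np.exp(-log_odds))).astype(int)

# Fit a simple classifier and read off which resources predict correct answers.
model = LogisticRegression().fit(X, y)
for name, coef in sorted(zip(resources, model.coef_[0]), key=lambda t: -t[1]):
    print(f"{name:>18s}  {coef:+.2f}")
```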

CCL14 Comparing Learning in a MOOC and a Blended, On-Campus Course, K. Colvin, J. Champaign, A. Liu, C. Fredericks, and D. Pritchard, Proceedings of the 7th Educational Data Mining Conference, 2014, p. 343

BPD13 Studying Learning in the Worldwide Classroom: Research into edX’s First MOOC, Lori Breslow, David E. Pritchard, Jennifer DeBoer, Glenda S. Stump, Andrew D. Ho, and Daniel T. Seaton, Research & Practice in Assessment 8, 13 (2013)

CCL14A Correlating Skill and Improvement in 2 MOOCs with a Student’s Time on Tasks, John Champaign, Kimberly F. Colvin, Alwina Liu, Colin Fredericks, Daniel Seaton, and David E. Pritchard, ACM Digital Library, 2014, http://dl.acm.org/citation.cfm?id=2566250, http://dx.doi.org/10.1145/2556325.2566250
