Navigating the Challenges and Opportunities of Online Education
Mats Selen and Tim Stelzer, University of Illinois at Urbana-Champaign
Students’ unprecedented access to content online is dramatically and irreversibly changing higher education. The opportunities are fantastic, but so are the risks. As in whitewater rafting, standing still is not an option, and simply letting the current dictate our path would be disastrous. In this article we argue that physics has a special role to play in navigating these changes, describe steps we have taken at the University of Illinois, and begin a discussion of the much larger challenges we face in ensuring that our efforts at efficiency don’t come at the price of turning higher education into online training programs.
The physics community can be proud of the impact it has had on higher education. The development and dissemination of ideas to improve student learning in lectures, recitations, and labs have certainly improved the quality of education in physics,1 and many other disciplines are not only adopting educational strategies developed in physics but are also gaining appreciation for the type of discipline-based education research we have established. Our past accomplishments give us the opportunity, and the responsibility, to help navigate at this exciting time in education.
The changes we have made to our introductory courses at the University of Illinois provide a nice illustration of the opportunities and challenges technology is providing. Fifteen years ago our colleagues contributed an article, “Parallel Parking an Aircraft Carrier,”2 describing what turned out to be the first phase in transforming our introductory courses. This transformation included adding Peer Instruction3 to our lectures, group problem solving of context-rich problems4 to our recitation sections, and “predict-observe-explain” labs.5 These course enhancements measurably improved our students’ experience. Perhaps even more important was the change in infrastructure, which both institutionalized the changes and provided an environment for continued evolution. Indeed, the success of our course transformation did not result in our taking a break from innovation, but inspired our faculty to continue improving the courses.
One challenge we faced in implementing Peer Instruction was designing “clicker questions” that probe conceptual understanding at the appropriate level of difficulty. Fortunately, Just in Time Teaching (JiTT)6 provided an elegant solution. Having the students answer questions online before lecture allowed us to tailor the lecture and clicker questions based on their responses. One unanticipated benefit was that students really liked having their responses to the JiTT questions incorporated into the lecture. Indeed, it became a badge of honor for students to have their response displayed in class.
One clear and somewhat disappointing result from examining the JiTT responses was that the students were learning very little from the textbook prior to lecture. Hence, a large amount of lecture time was devoted to delivering content rather than addressing student misconceptions and promoting deep conceptual understanding using Peer Instruction, frustrating both students and instructors.
Our solution was to replace the textbook with online multimedia content, and to require that students view this before coming to class. In this way we could spend class time discussing the material they just viewed rather than delivering the material itself. Placing the content online had the benefit of making it easy to deliver and gave us the ability to give credit to students for participating.
Using multimedia techniques that combine carefully designed audio and visual elements to deliver content is not a new idea. Indeed, we were inspired by significant existing research indicating that multimedia can be a very effective method for people to learn complex ideas.7 Guided by that research, we designed web-based “prelectures” for our introductory Mechanics and E&M courses. Our own clinical studies confirmed results in the literature.8 Figure 1 shows that students using the multimedia prelectures scored 13% higher on standard exam problems both immediately after seeing the material (Units 1-4) and two weeks later (Retention) than students reading from the book.
Figure 1. Results from a clinical study at the University of Illinois, showing that students using the multimedia prelectures scored 13% higher on standard exam problems both immediately after seeing the material (Units 1-4) and two weeks later (Retention) than students reading from the book
Figure 2. Results from surveys of students taking introductory Electricity and Magnetism courses at the University of Illinois. Adding prelectures and devoting more lecture time to Peer Instruction resulted in students finding the course less difficult, having a more positive attitude toward physics, and finding lecture more valuable in helping them learn the material.
Introducing prelectures into our physics course qualitatively changed the lecture experience. Although we didn’t use the phrase, we had effectively “flipped” the course. Students came to lecture better prepared, and we essentially doubled the class time devoted to Peer Instruction activities. The positive impact of these changes is best illustrated in Figure 2. It shows that students find the course easier, have a more positive attitude toward physics, and find lecture more valuable in helping them learn the material than before we made these changes.9
Our results on the positive impact of adding online prelectures to an introductory physics course may appear to support the case for developing effective fully online courses. As compelling as this may seem on the surface, there are important risks and challenges to consider.
At the heart of these challenges is our ability to validate the impact of transformations as massive as moving a course completely online. Our research on the impact of prelectures is similar to most physics education research in that the intervention is aimed at improving, not replacing, the student interactions and experiences that have defined physics courses for generations. These relatively modest changes give us confidence that their impact can be evaluated using the traditional instruments used to assess student performance in the course (e.g., exams and surveys). It is important to note that although these instruments are often validated, they are necessarily incomplete. The Force Concept Inventory10 provides an excellent illustration.
Prior to the introduction of the FCI, many physics professors assumed that a student’s ability to solve difficult problems in mechanics implied that they understood the concepts behind Newton’s laws. The FCI demonstrated that we had become so efficient at teaching students to solve problems that they could now do so without understanding some very fundamental principles. The ability of the FCI to quantify the lack of understanding of agreed-upon learning objectives inspired and validated many important reforms to physics education.3,11 Indeed, most of these reforms engaged students in “higher order” reasoning activities to help them develop this conceptual understanding. The result was that, in addition to showing larger gains on the FCI, our students were developing critical thinking skills that transcend the course. In hindsight, the importance of directly testing conceptual understanding seems clear; however, one should not underestimate the extraordinary efforts of the physicists who championed the reforms to explicitly teach and assess students’ conceptual understanding.
Advances in online technology, combined with social and economic pressures, present us with a new opportunity and an even greater challenge. Soon there will be “efficient” online activities that can train students to perform well on FCI-like exams as well as on traditional exam calculations. If our learning objectives for our introductory courses are limited to mastering the FCI and solving a set of “difficult” mechanics problems, then the path forward is clear. However, if we also want our courses to help students develop higher-level critical thinking skills, then it is important that we identify and clearly articulate these goals, and that, in the context of our courses, we use assessments that accurately reflect students’ progress toward those goals. Relying solely on traditional exam problems and FCI-like assessments to guide the development of fully online courses will inevitably prune the critical thinking aspects from these courses, transforming them from educational experiences into training programs.
Higher education is undergoing dramatic and irreversible changes. We have a unique opportunity and responsibility to guide these changes so that they enhance the quality of our students’ educational journey and preserve the learning experiences that enable our graduates to meet the challenges facing society. Using technology to develop new educational activities is an important and exciting component of this journey and will undoubtedly move forward quickly, so it is critical that our ability to assess the impact of these developments advance at the same rate. Our challenge is to engage in active discussion, to clearly articulate the goals of our courses, and to develop and implement accurate and reliable assessments to guide our journey.
1. D. E. Meltzer and R. K. Thornton, “Resource Letter ALIP–1: Active-Learning Instruction in Physics,” Am. J. Phys. 80, 478 (2012); J. Docktor and J. Mestre, “A synthesis of discipline-based education research in physics,” Physical Review Special Topics (in review).
2. Campbell, Elliott, and Gladding, “Parallel-Parking an Aircraft Carrier,” Forum on Education Newsletter, American Physical Society, Summer (1997). Available online: http://www.aps.org/units/fed/newsletters/aug97/articles.html#campbell
3. E. Mazur, Peer Instruction: A User’s Manual (Prentice Hall, Upper Saddle River, NJ, 1997).
4. P. Heller, R. Keith, and S. Anderson, “Teaching problem solving through cooperative grouping: Group versus individual problem solving,” Am. J. Phys. 60, 627-636 (1992); P. Heller and M. Hollabaugh, “Teaching problem solving through cooperative grouping: Designing problems and structuring groups,” Am. J. Phys. 60, 637-641 (1992).
5. R. K. Thornton and D. R. Sokoloff, “Learning motion concepts using real-time microcomputer-based laboratory tools,” Am. J. Phys. 58, 858-867 (1990).
6. G. M. Novak, E. T. Patterson, A. D. Gavrin, and W. Christian, Just-In-Time Teaching (Prentice Hall, Upper Saddle River, NJ, 1999).
7. R. E. Mayer, Multimedia Learning (Cambridge U. P., Cambridge, 2001); The Cambridge Handbook of Multimedia Learning, edited by R. E. Mayer (Cambridge U. P., Cambridge, 2005).
8. T. Stelzer, G. Gladding, J. P. Mestre, and D. T. Brookes, “Comparing the efficacy of multimedia module with traditional textbooks for learning introductory physics content,” Am. J. Phys. 77, 184–190 (2009).
9. T. Stelzer, G. Gladding, J. P. Mestre, and D. T. Brookes, “Impact of multimedia learning modules on an introductory course on electricity and magnetism,” Am. J. Phys. 78, 755 (2010).
10. D. Hestenes, M. Wells, and G. Swackhamer, “Force Concept Inventory,” The Physics Teacher 30, 141-158 (1992).
11. R. R. Hake, “Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses,” Am. J. Phys. 66, 64–74 (1998); D. Sokoloff and R. Thornton, “Using interactive lecture demonstrations to create an active learning environment,” The Physics Teacher 35, 340-347 (1997); D. Sokoloff and R. Thornton, “Assessing student learning of Newton’s Laws: The force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula,” Am. J. Phys. 66, 338-352 (1998).
Mats Selen is a Professor of Physics at the University of Illinois at Urbana-Champaign. In 2006, he was made a Fellow of the American Physical Society for his work in experimental high-energy physics. He is one of the developers of the iClicker system. He has won numerous teaching awards including the 2013 APS Excellence in Physics Education Award, which was awarded to him along with his collaborators Gary Gladding and Timothy Stelzer.
Tim Stelzer is an Associate Professor of Physics at the University of Illinois at Urbana-Champaign. He is active in the fields of theoretical high-energy physics and physics education research. He is one of the developers of the iClicker system. He has won numerous teaching awards including the 2013 APS Excellence in Physics Education Award, which was awarded to him along with his collaborators Gary Gladding and Mats Selen.
Disclaimer – The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.