FORUM ON EDUCATION
Web-Delivered Interactive Lecture Demonstrations: Creating an active science learning environment over the Internet
In this article we describe a project whose purpose is to extend a pedagogical procedure, Interactive Lecture Demonstrations (ILDs) [1,2], that already works well in a proven difficult learning environment, the physics lecture hall or classroom, to what is probably an even more difficult environment, the Internet. Educators often compare student learning from web-delivered materials to that in traditional science courses, where research shows few students learn. Concluding from such a comparison that web-delivery is better may not serve education. If instead we compare web-delivery learning results to classroom and laboratory techniques that actually result in most students understanding the material, we are using an authentic standard. We have preliminary evidence that the extension of Interactive Lecture Demonstrations to Internet delivery (WebILDs) results in student conceptual learning that is many times better than standard instruction and comparable to in-class delivery of ILDs.
Most in-class Interactive Lecture Demonstrations (ILDs) make use of real-time data collection and display, or MBL (in our case LoggerPro with a LabPro interface from Vernier). Each individual demonstration in a sequence of 6-8 demonstrations follows an eight-step procedure. Students are given two sheets describing the demonstrations: a “prediction sheet,” which they hand in, and a “results sheet,” which they may keep.
Interactive Lecture Demonstration Procedure
1. Describe the demonstration and do it for the class without MBL measurements.
2. Ask students to record individual predictions.
3. Have the class engage in small group discussions with nearest neighbors.
4. Ask each student to record a final prediction on the handout sheet (which will be collected).
5. Elicit predictions & reasoning from students.
6. Carry out the demonstration with MBL measurements displayed.
7. Ask a few students to describe the result. Then discuss results in the context of the demonstration. Ask students to fill out “results sheet” (which they keep).
8. Discuss analogous physical situations with different "surface" features. (That is, a different physical situation that is based on the same concept.)
To deliver ILDs over the Internet we needed to develop web-aware software that:
1. can present in proper order the many short video sequences that replace the actual presentation of demonstrations in a classroom;
2. is able to present and replay results as graphs and data synchronously with video sequences;
3. is able to present questions and collect student responses in a database;
4. provides mechanisms to facilitate real-time Internet textual discussions of predictions and results by small groups composed of students in different physical locations;
5. allows students to draw graphs and share them with others in the group;
6. provides administrative functions for monitoring students, collecting data for evaluation, and constructing ILDs.
We have created such software and tested it with students at Tufts and the University of Oregon.
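The project's actual implementation is not published here, but the first three requirements — an ordered sequence of short demonstration videos, each paired with a prediction question and a results video, with student responses collected to a database — can be sketched as a simple data model. This is a purely illustrative sketch; all class and field names are hypothetical, not the project's own.

```python
from dataclasses import dataclass, field

@dataclass
class Demo:
    """One demonstration in a WebILD sequence (hypothetical model)."""
    video_url: str         # short clip replacing the live classroom demonstration
    question: str          # prediction question shown before the result
    result_video_url: str  # clip replayed with MBL graphs and data displayed

@dataclass
class ILDSequence:
    """A sequence of 6-8 demonstrations, presented in order."""
    title: str
    demos: list = field(default_factory=list)
    # student id -> {demo index -> recorded prediction}; in the real
    # system these responses would be written to a database
    responses: dict = field(default_factory=dict)

    def record_response(self, student_id: str, demo_index: int, answer: str) -> None:
        self.responses.setdefault(student_id, {})[demo_index] = answer

# Usage: build a one-demo sequence and record a student's prediction.
seq = ILDSequence("Newton's Third Law")
seq.demos.append(Demo("third_law_1.mov",
                      "Which cart exerts the larger force during the collision?",
                      "third_law_1_mbl.mov"))
seq.record_response("student_a", 0, "The forces are equal and opposite")
```

A structure like this also gives the administrative functions (requirement 6) something to monitor: each student's recorded predictions are available per demonstration.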
Student Testing and Learning Results (Academic Year 2002-2003)
We began testing the software prototype with students in the introductory non-calculus physics classes at Tufts University and the University of Oregon in September 2002.
At Tufts, students were assigned the Third Law Sequence WebILD as homework. The other three ILD sequences in the Motion, Force, and Energy series were delivered in class. These ILD sequences teach concepts in kinematics, Newton's laws, and energy. To do the WebILD the students had to find at least one partner to collaborate with who could be on the Web at the same time (though not in the same place). Tufts servers delivered the ILD and collected the results using the prototype software. Learning results are shown in Figure 1.
We are using questions from the Force and Motion Conceptual Evaluation (FMCE) [3] to evaluate student learning. Figure 1 shows the typical learning gain of approximately 10% in a well-taught traditional university classroom. For in-class ILDs in a previous year at Tufts, the average gain for both categories was 89%. The average gain for the Third Law Sequence WebILDs was 72% using our first prototype software. This is better than expected, with WebILDs producing about 81% of the learning that in-class ILDs produce and about 7 times the learning produced by standard instruction.
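Normalized gain, as defined in the figure captions, is the fraction of students who did not know a concept on the pre-test who learn it by the post-test. As a minimal sketch (the pre/post percentages below are illustrative, not the study's raw scores), the calculation is:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Fraction of the students who missed a concept on the pre-test
    who answer it correctly on the post-test:
        g = (post - pre) / (100 - pre)
    """
    if pre_pct >= 100:
        raise ValueError("pre-test already at ceiling")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Illustrative numbers only: a class moving from 20% to 28% correct
# shows the ~10% gain typical of traditional instruction, while a
# class moving from 20% to 91.2% correct shows an 89% gain.
traditional = normalized_gain(20, 28)    # 0.10
in_class_ild = normalized_gain(20, 91.2) # 0.89
```

Note that the same raw improvement yields different normalized gains depending on where the class starts, which is why the measure is preferred for comparing groups with different pre-test scores.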
At the University of Oregon, we divided the students in the introductory algebra-based physics course into two groups: one group received the Motion, Force, and Energy ILDs in class and the other received the Motion, Force, and Energy WebILDs. (The short introductory ILD sequence on the kinematics of walking was given in class to both groups, since we did not have a Web version.) All other instruction was the same for the two groups, and the groups scored similarly on the FMCE pre-test. The students taking the WebILDs were scheduled for a class period so they could be supervised. The only communication among the students in the groups they formed was over the Web; they were not allowed to talk to one another. The WebILDs were delivered from a Tufts server.
The results of this extended test are shown in Figure 2. Again the results were more than gratifying: the Web results are about 125% of the in-class results and about 4.6 times better than standard instruction. However, the in-class delivery showed a normalized gain of only 45%, while in previous years it has been closer to 80%. The ILDs were delivered by the same professor in the same way, and we have no explanation for the drop; the students may be changing.
Figure 1: Normalized gain of students on Third Law questions (divided into collisions and contact forces) from the Force and Motion Conceptual Evaluation (FMCE). Normalized gain is the percentage of students who did not know a concept beforehand who learn it. The first bars show gains in a traditional calculus-based physics course at SUNY Albany. (Gains of 10% are typical in well-taught traditional courses.) The second bars show results at Tufts in a previous year for in-class delivered Third Law ILDs. The final bars show the result for Third Law WebILDs.
We would expect an even better result for the WebILDs if the students had been allowed an appropriate amount of time. They were scheduled for a 50-minute period, which is enough time for an in-class ILD sequence, but we estimate that students require 60 to 70 minutes when the ILDs are Web-delivered, due to typing and figuring out what to do. Consequently, not all students finished.
Current and Future Plans
We have revised the student interface and are testing again at Tufts and Oregon. In the next few months there will be an opportunity to try the WebILDs on-line: a link will be established at the web site of the Center for Science and Math Teaching (ase.tufts.edu/csmt/). There will be opportunities for testing for those who are interested.
This work has been funded by FIPSE of the U.S. Department of Education. Ronald Thornton of Tufts University is the Project Director and David Sokoloff of the University of Oregon is a Principal Investigator. Academic Technology at Tufts University has been responsible for the web-based software implementation.
Figure 2: Normalized gain of students on the Force and Motion Conceptual Evaluation (FMCE). Normalized gain is the percentage of students who did not know a concept beforehand who learn it. The first bars show gains in a traditional calculus-based physics course at SUNY Albany. (Gains of 10% are typical in well-taught traditional courses.) The second bars show results at the University of Oregon for in-class delivered Force, Motion, and Energy ILD sequences. The final bars show the result for students who experienced the Force, Motion, and Energy WebILDs.
1. R. K. Thornton and D. R. Sokoloff, Microcomputer-Based Interactive Lecture Demonstrations (ILDs): Motion, Force and Energy with Teachers’ Guide, Video, Set-up Files, Electronic Version (Vernier Software, Portland, 1999).
2. D. R. Sokoloff and R. K. Thornton, “Using Interactive Lecture Demonstrations to Create an Active Learning Environment,” Phys. Teach. 35(6), 340-347 (1997).
3. R. K. Thornton and D. R. Sokoloff, “Assessing Student Learning of Newton’s Laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula,” Am. J. Phys. 66(4), 338-352 (1998).
4. R. K. Thornton, “Why Don’t Physics Students Understand Physics? Building a Consensus, Fostering Change,” in The Thirteenth Labor: Improving Science Education, edited by E. J. Chaisson and T.-C. Kim (Gordon and Breach Publishers, Amsterdam, 1999).
Ronald Thornton is professor of physics at Tufts University.