Interactive Engagement In MIT Introductory Physics
John W. Belcher
Over the last three years, the MIT Physics Department has been introducing major changes in the way that Mechanics I (8.01) and Electromagnetism I (8.02) are taught. These changes are the result of the TEAL (Technology Enhanced Active Learning) Project. The TEAL format is centered on an "interactive engagement" approach, and merges lecture, recitations, and desktop laboratory experience. The format is similar to the Studio Physics format at Rensselaer Polytechnic Institute and to NCSU's SCALE-UP format. We have expanded on the work of others by adding a large component centered on active and passive visualizations of electromagnetic phenomena. Most of these visualizations are online and freely available at http://web.mit.edu/8.02t/www/802TEAL3D/.
Why is MIT moving to this model for teaching introductory physics? First, the traditional lecture/recitation format for teaching 8.01 and 8.02 had a 40-50% attendance rate, even with spectacular lecturers, and a 10% or higher failure rate. Second, a range of educational innovations in teaching freshman physics at universities other than MIT over the last few decades have demonstrated that pedagogies using "interactive engagement" methods produce higher learning gains than the traditional lecture format, usually accompanied by lower failure rates. Finally, the mainline introductory physics courses at MIT have not had a laboratory component for over 30 years, a major pedagogical disadvantage when teaching physics. The motivations for moving to the TEAL format were therefore to increase student engagement with the course by using teaching methods that have been successful at other institutions, and to reintroduce a laboratory component into the mainline physics courses after a 30-year absence.
After teaching prototype classes of about 150 students in fall 2001 and fall 2002, we taught 8.02 (Electromagnetism I) in the TEAL format in the mainline course for the first time in spring 2003, with 550 students. In TEAL, students sit together at twelve round tables in a classroom especially designed for this purpose, the d'Arbeloff Studio Classroom. Each table accommodates nine students, with one laptop for every three students. Students are assigned to groups of three and remain in those groups for the entire term. Grades in the TEAL courses are not curved. Because collaboration is a central element of the course, it is important that the class not be graded on a curve, either in fact or in appearance, so that students with stronger backgrounds are encouraged to help students with weaker backgrounds. Also, the cut-lines in the course are set in such a way that a student who consistently does not attend class cannot get an A. This is a deliberate policy to encourage attendance.
We have had a robust assessment and evaluation effort underway since the inception of the TEAL project, under the leadership of Professor Yehudit (Judy) Dori, a faculty member in the Department of Education in Technology and Science at the Technion in Haifa, Israel. We use a variety of assessment techniques, including traditional in-class exams, focus groups, questionnaires (in addition to MIT's course evaluation questionnaire), and pre- and post-instruction conceptual testing. Based on the conceptual testing, the learning gains in TEAL in spring 2003 were about twice those in the traditional lecture/recitation format (for detailed statistics, see http://web.mit.edu/jbelcher/www/802TEAL.pdf). These assessment results were consistent with the impression of the physics faculty teaching the course that students were learning more with this method of instruction than they had in the traditional lecture/recitation format. The fact that interactive-engagement teaching methods produce about twice the average normalized learning gains of traditional instruction replicates the results of many studies performed at other universities. It is also consistent with the much lower failure rate for the spring 2003 8.02 (a few percent) compared to 8.02 failure rates in recent years (from 7% to 13%).
In contrast to this overall increase in learning gains, student satisfaction with the spring 2003 course was mixed to negative. The MIT course evaluation overall course score for spring 2003 was 3.7/7.0, a low ranking. In hindsight, we made a number of missteps that contributed to this. For example, almost all of the students in the prototype courses had seen the material before at some level, and thus had some comfort with it. This was not the case in spring 2003, when many students entering the course had never seen the material before. We use group work extensively in this class. Unfortunately, although we grouped according to background in the prototype courses (that is, every group had a range of prior knowledge based on the pre-test), in spring 2003 we simply assigned students to groups randomly. The result was that some of our groups consisted entirely of students who had never seen the material before. A frequent student complaint in our focus groups and in the course surveys was that "the blind can't lead the blind" in group work. We believe this homogeneous grouping contributed to that reaction.
Another factor that may have impacted student reaction to the course was the instructors' limited familiarity with the new methods and materials. We did train the six faculty members who were new to teaching the course in spring 2003 in its teaching methods. In hindsight, however, that training was not thorough enough to prepare them for the very new environment of the d'Arbeloff Classroom, both in terms of the technology in the room and the teaching methods used in "interactive engagement." Moreover, we feel that we did not provide enough instruction in collaborative work to the student groups themselves. Finally, many students said that they did not find the experiments useful: they were unsure of what they were supposed to learn from them, and the experiments were long enough that students frequently did not have a chance to finish them.
We are teaching the mainline course again in spring 2004. The changes we are making this term in response to our experience in spring 2003 are: (1) heterogeneous grouping, and more training of students in collaborative methods; (2) more extensive training for the course teaching staff, including section leaders, graduate student TAs, and undergraduate TAs; (3) an increase in the number of course teaching staff (students felt we were understaffed during class); (4) fewer experiments, better explained and better integrated into the course material; (5) better planning of individual classes, breaking active learning sessions into smaller units that the teaching staff can oversee more closely.
The lessons of the TEAL experience thus far for educational innovation at MIT are many. First, any serious educational reform effort must be accompanied by a robust assessment effort. One needs some quantitative measure of the effectiveness of instruction to gauge whether the innovation is actually producing results superior to, or at least equal to, what it is replacing. Second, as is well known in educational circles, the most perilous part of any innovation is the attempt to move from small-scale innovation to large-scale implementation. In hindsight, we feel that our major misstep in this transition was not training course personnel and students adequately to prepare them for this new method of teaching.