**Michael Loverude, California State University Fullerton**

This work is part of an NSF-funded collaboration with John Thompson (University of Maine), Joe Wagner (Xavier University), and Warren Christensen (North Dakota State University). The project investigates student learning and application of mathematics in the context of upper-division physics courses: we seek to study student conceptual understanding in those courses, investigate models of transfer, and develop instructional interventions to assist student learning.

Historically, PER focused primarily on introductory-level courses, but in recent years there has been increasing attention to upper-division courses taken mostly by physics majors.[1] One key course taught by most departments remains under-researched (with a few exceptions[2]): the so-called “math methods” course (called MM for this article).

An MM course is generally taught between introductory physics and the core theory courses in the physics major (electricity and magnetism, classical mechanics, and quantum mechanics), focusing on mathematical techniques that students will encounter in these later courses. Typically MM courses cover a vast list of topics, including differential and integral calculus, series, complex numbers, vector calculus, differential equations, and linear algebra. As if this were not sufficient, MM often has additional implicit goals: students are expected to ‘think like a physicist’ when solving problems. Despite its seeming importance, this phrase is rarely operationally defined. For this project we have sought to articulate measurable aspects of this idea and to study the extent to which students develop appropriate skills in their lower-division coursework. While instructors value these skills, and there has been some previous discussion of them,[3] they are not often explicitly taught or assessed.

A key goal of our larger project is to develop a series of tasks suitable for use in MM, focusing on skills including dimensional / unit analysis, applying limiting cases, using approximations, identifying errors, and predicting the effects of problem changes on the resulting solution.[4] (We do not claim that this list is complete.)
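To make the first of these skills concrete, the sketch below (our own illustration here, not one of the project's research tasks) shows how a dimensional-analysis check can be carried out mechanically on the familiar massless-pulley Atwood result, (m_{1} - m_{2})g / (m_{1} + m_{2}), by tracking each quantity's dimensions as exponents of mass, length, and time:

```python
# Illustrative sketch (not a project task): a minimal dimensional-analysis
# check. Dimensions are exponent triples (mass, length, time); quantities
# may be added or subtracted only when their triples match.

MASS = (1, 0, 0)            # dimensions of m1 and m2
ACCELERATION = (0, 1, -2)   # dimensions of g (length / time^2)

def mul(a, b):
    """Dimensions of a product: exponents add."""
    return tuple(x + y for x, y in zip(a, b))

def div(a, b):
    """Dimensions of a quotient: exponents subtract."""
    return tuple(x - y for x, y in zip(a, b))

def add(a, b):
    """Addition is legal only between like dimensions."""
    assert a == b, "dimensionally inconsistent sum"
    return a

# Check (m1 - m2) * g / (m1 + m2) term by term:
numerator = mul(add(MASS, MASS), ACCELERATION)   # mass * acceleration
denominator = add(MASS, MASS)                    # mass
result = div(numerator, denominator)

assert result == ACCELERATION  # the expression has units of acceleration
```

A candidate expression that failed this check could be rejected without any algebra, which is exactly the kind of non-procedural evaluation the tasks are meant to elicit.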

Every research project chooses appropriate lenses through which it views data and derives conclusions. For this project, we have focused on the cognitive: students’ mental processes, reasoning, and models. Within this realm, there are many appropriate lenses from which to choose. This portion of our project is driven by practice; we are seeking to learn what is difficult for students and develop instructional interventions.[5] A growing body of work in PER has examined student use of mathematics in physics, and researchers have chosen (and developed) a variety of theoretical frameworks in order to interpret their findings.[6] In particular, we refer to two models. Redish described stages of *modeling*, *processing*, *interpreting*, and *evaluating*.[7] For the specific case of upper-division physics courses, Wilcox et al. proposed the ACER framework, in which students must *activate* the appropriate mathematical tool, *construct* a model, *execute* the mathematics, and *reflect* on results.[8] In each model, successfully executing a mathematical procedure is only one element, but physics courses often focus almost exclusively on processing or executing.

For this forum, we focus on a single sample task, the *Evaluate the Expressions* task, which is illustrative of the non-procedural skills that we are describing. This task involves the evaluation of mathematical expressions for correctness. The task describes an Atwood’s machine (see figure below), an example that students might have encountered in introductory mechanics. Students are shown three expressions for the acceleration of one of the two blocks and asked to determine whether the expressions could be correct. (All three expressions are incorrect.) The problem is posed on the first day of the MM course on an ungraded quiz and subsequently explored in a group discussion. The problem has been administered in three sections of MM (*N* = 47) at our university before any instruction and used as the basis of six interviews with students from a different MM course.

Unlike many problems that students have encountered, this task asks for evaluation (per Redish) or reflection (per Wilcox). Students are not asked to solve the problem and are given no numerical values. Instead they might check limiting values or identify cases in which the expression is unphysical (e.g., in the second expression, if M/2 = m_{1}+m_{2} the acceleration would be infinite).
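Such limiting-case checks can be stated very compactly. In the sketch below (our illustration, not a reproduction of the actual quiz), `correct_a` is the standard result for an Atwood machine whose pulley is a uniform disk of mass M, and `candidate_a` is a *hypothetical* incorrect expression of the sort described above, whose denominator vanishes when M/2 = m_{1} + m_{2}:

```python
# Illustrative sketch: limiting-case checks of the kind the task invites.
# correct_a: standard Atwood result with a uniform-disk pulley of mass M.
# candidate_a: a hypothetical wrong expression (its denominator vanishes
# when M/2 = m1 + m2, an unphysical divergence).

G = 9.81  # m/s^2

def correct_a(m1, m2, M):
    return (m1 - m2) * G / (m1 + m2 + M / 2)

def candidate_a(m1, m2, M):
    return (m1 - m2) * G / (m1 + m2 - M / 2)

# 1. Equal hanging masses: the system should not accelerate.
assert correct_a(1.0, 1.0, 0.5) == 0.0

# 2. Massless pulley: recover the introductory result (m1 - m2)g/(m1 + m2).
assert abs(correct_a(2.0, 1.0, 0.0) - G / 3) < 1e-12

# 3. The candidate diverges for perfectly ordinary masses, so it can be
#    rejected without ever solving the problem.
try:
    candidate_a(1.0, 1.0, 4.0)
except ZeroDivisionError:
    print("candidate expression is unphysical: reject")
```

None of these checks requires solving the full problem; each interrogates the structure of the expression, which is precisely the evaluative stance the task is designed to probe.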

This task is challenging for students. Only one student offered a completely correct solution. Ten others identified all three solutions as incorrect but with incomplete or incorrect explanations. Many students (10-20%) gave no response, despite ample time to complete the task. The approaches used by students varied considerably, and while many did attempt to reason with the mathematical expression, others seemed to respond as though this task were a more typical end-of-chapter problem. About 10% of the students solved the problem directly, and others performed algebraic manipulations of the given expressions. A few students mentioned partially remembered results: “my very rusty memory only recalls subtracting from the bottom.” These responses suggest an epistemological stance quite different from the one the problem intends.

A few responses reflected an attempt to reconcile the mathematical form of the expression with a sense of physical mechanism. Several students referred explicitly to the presence or absence of a term with the difference in masses: “Correct: m_{2} is countering m_{1} so m_{1} is accelerating at a portion of g.” Such responses are reminiscent of the work of Sherin on students’ reading of equations.[9] A few students gave similar explanations but with respect to other quantities: “this expression raises the value of acceleration as the mass of the pulley increases leading me to believe this is incorrect.” Both cases reflect good reasoning upon which we can build.

The interview responses were particularly illustrative of the issues described above. One student struggled with the task and grew increasingly unhappy with her inability to explain: “sorry that I’m saying I’m not sure for so many of these.” The interviewer then asked what he thought would be a final question: “If I look at the structure of the mathematical expression, the greater the mass of the pulley, what would I expect? What impact would that have on the expression?” The student took up this line of reasoning, revising her previous responses and generating additional examples. When prompted to use reasoning skills instead of procedural knowledge, the student was able to do so very productively, but it did not occur to her that this sort of reasoning would be useful or even allowed.

This portion of the project is in its initial stages, but we feel that our preliminary results offer some insights into the MM course. Many students entering the MM course do not successfully reason quantitatively, even when explicitly prompted to do so. Few of the students spontaneously considered special cases of the variables in the problem or related the given expressions to a sense of physical mechanism. Instruction focused on procedures does not necessarily lead students to develop these reasoning skills that physicists value. There is a need for tasks that can be used in instruction and assessment; developing such materials is one of our ongoing goals.

This research is supported in part by NSF grants PHY 1406035, 1405616, and 1405726.

*Michael Loverude is Professor of Physics and Director of the Catalyst Center at California State University Fullerton. He was a plenary speaker at FFPER: Puget Sound 2016.*

**Endnotes**

[1] M. E. Loverude and B. S. Ambrose, “Editorial: Focused Collection: PER in Upper Division Physics Courses,” *Physical Review Special Topics - Physics Education Research* 11, 020002 (2015).

[2] For an example of studies in a combined mathematical methods / classical mechanics course, see SEI: Science Education Initiative at University of Colorado Boulder, http://www.colorado.edu/sei/departments/physics.htm

[3] E. F. Redish, “Problem solving and the use of math in physics courses.” In Conference World View on Physics Education in 2005, Delhi, August 21–26, 2005.

[4] M. E. Loverude, “Assessment to complement research-based instruction in upper-level physics courses,” 2011 PERC Proceedings [Omaha, NE, August 3-4, 2011], edited by N. S. Rebello, P. V. Engelhardt, and C. Singh [AIP Conf. Proc. 1413, 51-54 (2012)]; M. E. Loverude, “‘Surprisingly, there is an actual physical application’: Student understanding in Math Methods,” 2013 PERC Proceedings [Portland, OR, July 17-18, 2013], edited by P. V. Engelhardt, A. D. Churukian, and D. L. Jones; and M. E. Loverude, “Quantitative reasoning skills in math methods,” 2015 PERC Proceedings.

[5] P. R. L. Heron, “Empirical investigations of learning and teaching, Part I: examining and interpreting student thinking,” *Proceedings of Enrico Fermi Summer School on Physics Education Research*. (pp. 341-350). Italian Physical Society (2003).

[6] T. J. Bing and E. F. Redish, “Analyzing problem solving using math in physics: Epistemological framing via warrants,” *Physical Review Special Topics - Physics Education Research* 5, 020108 (2009); J. Tuminaro and E. F. Redish, “Elements of a cognitive model of physics problem solving: Epistemic games,” *Physical Review Special Topics - Physics Education Research* 3, 020101 (2007).

[7] E. F. Redish and E. Kuo, “Language of physics, language of math: Disciplinary culture and dynamic epistemology,” *Science & Education* 24, 561-590 (2015).

[8] B. R. Wilcox et al., “Analytic framework for students’ use of mathematics in upper division physics,” *Physical Review Special Topics - Physics Education Research* 9, 020119 (2013).

[9] B. L. Sherin, “How students understand physics equations,” *Cognition and Instruction* 19, 479–541 (2001).

Disclaimer – The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.