AAPT Recommendations for the Undergraduate Physics Laboratory Curriculum: Implications for Assessment

Benjamin Zwickl, Rochester Institute of Technology

The AAPT Recommendations for the Undergraduate Physics Laboratory Curriculum [1] are the first step in an iterative revitalization of the undergraduate lab curriculum. In order to make meaningful progress, the recommendations must be translated into classroom curricula and assessable outcomes. Currently, there are few available assessment tools and limited physics education research aligned with the scientific practices discussed throughout the recommendations. Compared to the extensive resources and research base for areas such as problem-solving and conceptual understanding, laboratory assessment is still in its early stages. The National Research Council’s 2012 report on Discipline-Based Education Research [2] highlights labs and scientific practices as emerging areas with a significant need for research on characterizing, measuring, and studying the development of expertise across all STEM disciplines.

Assessment and research on labs are unique for a few reasons. First, labs typically involve hands-on interaction with lab equipment, so cognitive, perceptual, and motor skills are intertwined. Second, labs provide an intersection of theoretical and conceptual understanding with real-world experiences, which often do not satisfy the tidy idealizations of theoretical courses. Finally, laboratory courses prioritize scientific practices (e.g., experimental design or data analysis) rather than an understanding of key disciplinary content (e.g., Newton’s laws).

Developing the needed array of assessments requires the creativity of our entire community. These assessments need to be rigorous and classroom-relevant, a combination that is best achieved through the combined efforts of instructors and education researchers. While talk of more assessment often evokes negative associations among instructors (e.g., with standardized testing or faculty course evaluations), assessments can guide positive long-term improvements in our curriculum. It is impossible to create a one-size-fits-all assessment for labs; rather, our community requires a wide range of tools developed for a range of purposes. What follows is a framework for organizing the diversity of assessment options.

Focus area: With which of the laboratory recommendations is the assessment aligned?

Scale: Like any physical measurement, the spatial and temporal scale of a phenomenon and the interacting constituents guides the choice of measurement tool. Quarks are probed with different instruments than living cells. Within educational assessment, there are at least three relevant kinds of scale. (1) Scale of students: Is the assessment intended to assess progress and abilities of individual students, a team, an entire class, or a nationwide cohort of students? (2) Scale of activity: Are we assessing the impact of a single lab activity, a semester-long course, or a 4-year curriculum? (3) Time scale: Is this a one-shot assessment that verifies a benchmark is met, a pre/post measurement of growth during a course, or a longitudinal measurement tracking students throughout several years?

Feedback: Does the assessment provide usable feedback for improving student performance (formative assessment) or document satisfactory completion of an outcome (summative assessment)?

Who benefits? Do the results provide insight to students, instructors, departments, institutions, or education researchers? While it is possible to design an assessment that benefits multiple stakeholders, assessments are typically optimized for a particular audience.

Format: Written or oral? Paper-based or online? Does it involve a hands-on component? Are the questions multiple choice or open-ended?

The assessment “phase space” in this framework is quite large. Not every assessment can accomplish every purpose. Instructors, with their close interaction with students and access to the laboratory classroom, are well-suited to create and administer assessments focused on individual students, including assessments with a hands-on component. For example, technical and practical skills are commonly assessed through a lab practicum. The rigor of physics education research (PER) is needed to develop assessments that can be applied across a variety of classes in a consistent way. Ensuring the validity and reliability of widely disseminated assessments requires additional testing and refinement that is typically not needed when instructors assess individual students in their own classes. Additionally, PER researchers will find many fruitful areas of study, particularly around sophisticated practices such as modeling. Research is needed to characterize student thinking, understand the development of expertise, and develop assessments that can measure students’ development over time.

Although there is a need to develop new assessments, some examples are ready to use now. The Concise Data Processing Assessment [3] (focused on data analysis and visualization) and the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) [4] (aligned with constructing knowledge) are easy-to-administer tools intended for pre/post use in lab classes. Some curricula, such as the Investigative Science Learning Environment [5] developed at Rutgers, have activities with matched rubrics for several of the scientific practices in the AAPT Recommendations. Lab notebooks also document students’ progress throughout an extended activity and can assess a range of scientific practices. Even traditional laboratory reports, sometimes exalted (as the cousin of a PRL) and sometimes maligned (as a fake genre of writing used only in school), can serve as summative or formative assessments of students’ ability to communicate technical details of their investigations and report their results.

The need for more and better assessments is a challenge worthy of our dedicated community of instructors and researchers in physics education. These efforts will flourish through collaboration between everyone involved in laboratory instruction and PER. The AAPT Committee on Laboratories is exploring ways to promote the sharing of ideas, curricula, and assessments through conferences, workshops, and online communities.

1. Subcommittee of the AAPT Committee on Laboratories, J. Kozminski (Chair). AAPT Recommendations for the Undergraduate Physics Laboratory Curriculum (2014). https://www.aapt.org/Resources/upload/LabGuidlinesDocument_EBendorsed_nov10.pdf
2. National Research Council. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: The National Academies Press, 2012. doi:10.17226/13362 http://www.nap.edu/openbook.php?record_id=13362
3. Day, J., & Bonn, D. (2011). Development of the Concise Data Processing Assessment. Physical Review Special Topics - Physics Education Research, 7(1), 010114. doi:10.1103/PhysRevSTPER.7.010114, https://www.physport.org/assessments/assessment.cfm?I=55&A=CDPA
4. Zwickl, B. M., Hirokawa, T., Finkelstein, N., & Lewandowski, H. J. (2014). Epistemology and expectations survey about experimental physics: Development and initial results. Physical Review Special Topics - Physics Education Research, 10(1), 010120. doi:10.1103/PhysRevSTPER.10.010120, http://tinyurl.com/ECLASS-physics
5. E. Etkina and A. Van Heuvelen, "Investigative Science Learning Environment - A Science Process Approach to Learning Physics," in Research-Based Reform of University Physics, edited by E. F. Redish and P. J. Cooney (American Association of Physics Teachers, College Park, MD, 2007), Reviews in PER Vol. 1. http://www.per-central.org/document/ServeFile.cfm?ID=4988, http://www.islephysics.net

Benjamin Zwickl is an Assistant Professor of Physics at the Rochester Institute of Technology. He completed his PhD in physics at Yale University, and spent three years at the University of Colorado Boulder as a postdoctoral scholar. His research focuses on how students develop experimental and research skills throughout the undergraduate curriculum, and how they leverage these skills in a variety of professions after graduation.

Disclaimer – The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.