Exposing and Assessing the Hidden Agenda of Laboratory Instruction

Natasha Holmes, Stanford University, and MacKenzie Stetzer, University of Maine

The 2015 conference season served as an excellent vehicle for a rich discussion of the role of labs in physics education, as well as the implications for instructors and physics education researchers. From a dedicated laboratory working group at the 2015 Foundations and Frontiers in Physics Education Research Conference, to the Second Conference on Laboratory Instruction Beyond the First Year of College (BFY II), to the lab-themed 2015 Physics Education Research Conference, laboratory enthusiasts and education researchers collaboratively focused on learning and teaching in the laboratory. At the heart of all of these conferences was an introduction to, and discussion of, the AAPT recommendations for the goals of the undergraduate physics lab curriculum [1], which are described in detail in another article in this newsletter. The AAPT document focuses on the development of skills in six core areas: modeling, designing experiments, constructing knowledge, technical and practical skills, analyzing and visualizing data, and communicating physics.

While the goals of laboratory instruction have been widely debated for years, the development of conceptual understanding has persisted as a key focus of undergraduate lab courses, particularly at the introductory level. This view often stems from an assumption that students will learn concepts better by seeing the impact of those concepts in the real world. Recent work [2], however, has shown that this is not necessarily the case. One possible reason is the extensive set of hidden and implicit goals, concepts, and tasks involved in experimentation. In order to make sense of the physics concepts in an experiment, one must design, set up, and carry out the experiment, take data, make sense of and interpret those data, and engage in any necessary troubleshooting and iteration. The AAPT goals help make this otherwise hidden and implicit curriculum explicit.

A particular benefit of these laboratory learning goals is that they are not unique to physics. While the list will surely help prepare future physicists, competence in these areas will also transfer readily to other disciplines and to the extensive set of career paths open to physics and non-physics majors alike.

Conversations at these conferences, therefore, have centered on whether institutions and instructors are adopting these goals, what difficulties students have in these areas, how to structure labs to achieve these learning goals, and how to develop assessments to ascertain the extent to which these goals are being met. These in-depth conversations between physics instructors, lab coordinators, and physics education researchers are essential, and it is critical that they continue.

While research on physics labs and associated skills has been relatively sparse in comparison to the extensive body of work on student conceptual understanding and problem solving, these recent conferences also served to highlight some of the emerging research directions in the context of laboratory instruction. At the 2015 PERC, the plenary talks reflected the diversity of this work, including presentations on introductory undergraduate lab courses, undergraduate research experiences, and K-12 adoption of the Next Generation Science Standards. Taken together, the talks highlighted the increased focus on scientific practice goals at all levels.

At the same time, it is clear that more research is needed on, in the broadest sense, what students are (or are not) learning in lab courses, how to improve student learning within the context of these courses, and which structures and pedagogies best support which kinds of learning. While there has been extensive research on student difficulties related to measurement and uncertainty, there are many learning goals articulated in the AAPT guidelines for which student performance (including, for example, specific difficulties) has not yet been investigated (e.g., troubleshooting, understanding of common lab equipment such as oscilloscopes or multimeters, use of lab notebooks, experimental modeling of a measurement system, designing experiments, and identifying research questions). For each of these areas of interest, there are also questions of generalizability. Given the resource-intensive nature of labs, the facilities at every university typically differ. How generalizable, then, are findings from a given pedagogical approach in one context to another context with different experiments or equipment?

We believe that these conferences have set the stage for an increased focus on laboratory instruction within the physics education research landscape, and we encourage instructors and researchers alike to join in this important conversation. We all need to think critically about what students are really learning in labs and how we are measuring that learning. One example from our own experience is troubleshooting. While students often engage in troubleshooting in electronics lab courses, we don’t necessarily know whether they are learning and developing expert-like troubleshooting skills or simply becoming frustrated and relying on novice tactics (e.g., rebuilding the entire circuit or securing assistance from the instructor or TA). Answering these sorts of questions will require the development of validated assessments and other research-based measures of learning in labs. (For a broader discussion of laboratory assessment, see Zwickl, “AAPT Recommendations for Undergraduate Labs: Implications for assessment,” in this newsletter.) Moreover, as instructors, we should look carefully at our existing course assessments to identify what they are actually measuring. Because students’ behaviors are often guided by course assessment practices, aligning grading rubrics with these desirable learning outcomes is crucial. Too often, we focus on whether students have labeled the axes on their graphs and properly propagated their uncertainties, rather than evaluating whether they have thought critically about their experimental design, interpreted the data in their graphs, and understood what those uncertainties actually mean.

We encourage instructors and researchers who are passionate about laboratory instruction to share their experiences and their work with the physics teaching and research community. In particular, the AAPT Committee on Laboratories is hosting sessions on each of the broader laboratory goals outlined in the recommendations at each of the upcoming national meetings (summer and winter).  We hope to see you there.

[1] Subcommittee of the AAPT Committee on Laboratories, J. Kozminski (Chair), AAPT Recommendations for the Undergraduate Physics Laboratory Curriculum, 2014.
https://www.aapt.org/Resources/upload/LabGuidlinesDocument_EBendorsed_nov10.pdf
[2] C. Wieman and N. G. Holmes, “Measuring the impact of an instructional laboratory on the learning of introductory physics,” Am. J. Phys. 83(11), 972–978 (2015).

Natasha Holmes is currently a postdoctoral researcher at Stanford University. She completed her PhD in 2015 at the University of British Columbia, where her dissertation focused on using structured quantitative inquiry labs to develop critical thinking skills in introductory physics labs. She is a member of the Physics Education Research Leadership Organizing Council (PERLOC).

MacKenzie Stetzer is an Assistant Professor of Physics at the University of Maine. He is active in the field of Physics Education Research, and has a particular research focus on student learning of content and skills in upper-division laboratory courses on analog electronics. He was part of the organizing committee for the 2015 Physics Education Research Conference, which focused on the topic of laboratory instruction.


Disclaimer – The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.