Student Expectations in Introductory Physics: Part 1

Edward F. Redish, Jeffery M. Saul, and Richard N. Steinberg

Department of Physics, University of Maryland, College Park, MD 20742

Abstract

Students' understanding of what science is about and how it is done, together with their expectations as to what goes on in a science course, can play a powerful role in what they get out of introductory college physics. In this paper, we describe the Maryland Physics Expectations (MPEX) Survey, a 34-item Likert-scale (agree-disagree) survey that probes student attitudes, beliefs, and assumptions about physics. We report on the results of pre- and post-instruction delivery of this survey to 1500 students in introductory calculus-based physics at six colleges and universities. We note a large gap between the expectations of experts and novices and observe a tendency for student expectations to deteriorate rather than improve as a result of the first term of introductory calculus-based physics.

*Supported in part by NSF grant RED-9355849.


I. INTRODUCTION

What students expect will happen in their introductory calculus-based (university) physics course plays a critical role in how they respond to the course. It affects what they listen to and what they ignore in the firehose of information provided during a typical course by professor, teaching assistant, laboratory, and text. It affects which activities students select in constructing their own knowledge base and in building their own understanding of the course material. The impact could be particularly strong when there is a large gap between what the students expect to do and what the instructor expects them to do.

This paper explores student attitudes and beliefs about university physics and how those attitudes and beliefs change as a result of physics instruction. In this paper, we present the Maryland Physics Expectations (MPEX) Survey, a Likert-style (agree-disagree) questionnaire we have developed to probe some aspects of what we will call student expectations. We have used this survey to measure the distribution of student views at the beginning and end of the first semester of calculus-based physics at six colleges and universities. Our survey is included as an appendix.[1]

Because so little is known about the distribution, role, and evolution of student expectations in the university physics course, many questions can be asked. To limit the scope of this paper, we restrict ourselves to three questions.

  1. How does the initial state of students in university physics differ from the views of experts?
  2. To what extent does the initial state of a class vary from institution to institution?
  3. How are the expectations of a class changed as the result of one semester of instruction in various learning environments?

Other questions, such as what happens over the longer term and how items of the various clusters correlate with each other and with success in the course, are left for future publications.

We begin by reviewing previous work on the subject in section II. The structure and validation of the survey are described in section III. Section IV contains the results of the survey for five calibration groups, ranging from novice to expert. The results of our survey with students are presented in section V, and section VI discusses the implications of our work.

II. BACKGROUND AND REVIEW OF PREVIOUS WORK

A. Recent Progress in Physics Education: Concepts

In the past fifteen years, there has been a momentous change in what we know about teaching and learning in the introductory calculus-based physics course. Beginning about 1980, research began to show that the traditional class leaves most students confused about the basic concepts of mechanics.[2] Subsequent work extended those observations to other areas including optics, heat and thermodynamics, and electricity and magnetism.[3] In studying student understanding of the basic concepts of physics, much has been revealed about what students know and how they learn. The crucial element is that students are not "blank slates." Their experience of the world (and of school) leads them to develop many concepts of their own about how the world functions. These concepts are often not easily matched with those that are being taught in physics courses, and students' previous conceptions may make it difficult for them to build the conclusions the teacher desires. However, it has been demonstrated that if this situation is taken into account, it is often possible to provide activities that induce most of the students to develop a good functional understanding of many of the basic concepts.[4]

Success in finding ways to teach concepts is an excellent start (even though the successful methods are not yet widespread), but it does not solve all of our teaching problems with physics. We want our students to develop a robust knowledge structure, a complex of mutually supporting skills and attitudes, not just a patchwork of ideas (even if correct). We want them to develop a strong understanding of what science is and how to do it. We want them to develop the skills and confidence needed to do science themselves.

B. Student Expectations

It is not only physics concepts that a student brings into the physics classroom. Each student, based on his or her own experiences, brings to the physics class a set of attitudes, beliefs, and assumptions about what sorts of things they will learn, what skills will be required, and what they will be expected to do. In addition, their view of the nature of scientific information affects how they interpret what they hear. In this paper, we will use the phrase expectations to cover this rich set of understandings. We focus on what we might call students' cognitive expectations - expectations about their understanding of the process of learning physics and the structure of physics knowledge rather than about the content of physics itself.

Our model of learning is a growth model rather than a knowledge-transfer model.[5] It concentrates on what happens in the student, rather than what the teacher is doing. We therefore have chosen to focus our study on cognitive attitudes that have an effect on what it is students choose to do, such as whether they expect physics to be coherent or a loose collection of facts. The specific issues our survey covers are discussed in detail in the next section. Other issues, such as students' motivation, preferences, feelings about science and/or scientists, etc. are important but have been probed extensively elsewhere.[6]

Although we don't often articulate them, most physics instructors have expectation-related goals for their students. In our university physics course for engineers and other scientists, we try to get students to make connections, understand the limitations and conditions on the applicability of equations, build their physical intuition, bring their personal experience to bear on their problem solving, and see connections between classroom physics and the real world. We refer to this kind of learning goal, one not listed in the course's syllabus or the textbook's table of contents, as part of the course's "hidden curriculum." We are frustrated by the tendency many students have to seek "efficiency", achieving a satisfactory grade with the least possible effort, often at a severe unnoticed cost to how much they learn. They may spend a large amount of time memorizing long lists of uninterpreted facts or performing algorithmic solutions to large numbers of problems without giving them any thought or trying to make sense of them. Although some students consider this efficient, it is only efficient in the short term. The knowledge thus gained is superficial, situation dependent, and quickly forgotten. Our survey is one attempt to cast light on the hidden curriculum and on how student expectations are affected by instruction.

C. Previous Research on Cognitive Expectations

There are a number of studies of student expectations in science in the pre-college classroom that show that student attitudes towards their classroom activities and their beliefs about the nature of science and knowledge affect their learning. Studies by Carey,[7] Linn,[8] and others have demonstrated that many pre-college students have misconceptions both about science and about what they should be doing in a science class. Other studies at the pre-college level indicate some of the critical items that make up the relevant elements of a student's system of expectations and beliefs. For example, Songer and Linn studied students in middle schools and found that they could already categorize students as having beliefs about science that were either dynamic (science is understandable, interpretive, and integrated) or static (science knowledge is memorization-intensive, fixed, and not relevant to their everyday lives).[9] Alan Schoenfeld has described some very nice studies of the assumptions high school students make about learning mathematics.[10] He concludes that "Students' beliefs shape their behavior in ways that have extraordinarily powerful (and often negative) consequences."

Two important large-scale studies that concern the general cognitive expectations of adult learners are those of Perry[11] and Belenky et al. (BGCT).[12] Perry tracked the attitudes of Harvard and Radcliffe students throughout their college careers. Belenky et al. tracked the views of women in a variety of social and economic circumstances. Both studies found evolution in the expectations of their subjects, especially in their attitudes about knowledge.[13] Both studies frequently found their young adult subjects starting in a "binary" or "received knowledge" stage in which they expected everything to be true or false, good or evil, etc., and in which they expected to learn "the truth" from authorities. Both studies observed their subjects moving through a "relativist" or "subjective" stage (nothing is true or good, every view has equal value) to a "consciously constructivist" stage. In this last, most sophisticated stage, the subjects accepted that nothing can be perfectly known, and accepted their own personal role in deciding what views were most likely to be productive and useful for them.

Although these studies both focused on areas other than science,[14] most professional scientists who teach at both the undergraduate and graduate levels will recognize a binary stage, in which students just want to be told the "right" answers, and a constructivist stage in which the student takes charge of building his or her own understanding. Consciously constructivist students carry out their own evaluation of an approach, equation, or result, and understand both the conditions of validity and the relation to fundamental physical principles. Students who want to become creative scientists will have to move from the binary to the constructivist stage. This is the transition that we want to explore.

An excellent introduction to the cognitive issues involved is given by Reif and Larkin[15] who compare the spontaneous cognitive activities that occur naturally in everyday life with those required for learning science. They pinpoint differences and show how application of everyday cognitive expectations in a science class causes difficulties. Another excellent introduction to the cognitive literature on the difference between everyday and in-school cognitive expectations is the paper by Brown, Collins, and Duguid, who stress the artificiality of much typical school activity and discuss the value of cognitive apprenticeships.[16]

All the above-cited works stress the importance of expectations in how teens and young adults make sense of their world and their learning. If inappropriate expectations play a role in the difficulties our students commonly have with introductory calculus-based physics, we need to find a way to track and document them.

III. CONSTRUCTING THE SURVEY

A. Why a Survey?

Our interactions with students in the classroom and in informal settings have provided us with preliminary insights into student expectations. As is usual in physics education research, repeated, detailed, taped and transcribed interviews with individual students are clearly the best way of confirming or correcting informal observations and finding out what a student really thinks. The education literature contains particularly relevant transcripts of student interviews, especially in the work of David Hammer. In his Ph.D. thesis at Berkeley, Hammer followed six students throughout the first semester of their university physics course, tracking their progress through detailed problem-solving interviews. Each student was interviewed for approximately 10 hours. The interviews were taped and transcribed, and students were classified according to their statements and how they approached the problems. However, conducting interviews with large numbers of students would be prohibitively expensive, and they are unlikely to be repeated at many institutions. Interviews therefore cannot yield information about the distribution of student expectations in a large population. In order to study larger populations, a reliable survey is needed which can be completed by a student in less than half an hour and analyzed by a computer. We developed the Maryland Physics Expectations (MPEX) survey to meet this need.

B. The Development of the MPEX Survey

We began to develop the MPEX survey in the autumn of 1992 at the University of Washington. Students in the introductory calculus-based physics class were given a variety of statements about the nature of physics, the study of physics, and their relation to it. They rated these statements on a five-point Likert scale from strongly disagree (1) to strongly agree (5). Items for the survey were chosen as a result of a detailed literature review, discussions with physics faculty, and our combined 35 years of teaching experience. The items were then validated in a number of ways: by discussion with other faculty and physics education experts, through student interviews, by giving the survey to a variety of "experts", and through repeated delivery of the survey to groups of students.
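To illustrate how responses on such a five-point scale are typically handled, the sketch below collapses 1-5 ratings into agree/neutral/disagree categories and computes a class's response distribution for one item. This is an illustrative sketch only; the function names and sample ratings are hypothetical and are not taken from the MPEX study's analysis.

```python
# Illustrative sketch: collapsing 5-point Likert ratings
# (1 = strongly disagree ... 5 = strongly agree) into three categories.
# All names and data here are hypothetical, not from the MPEX study.
from collections import Counter

def collapse(rating: int) -> str:
    """Map a 1-5 Likert rating onto agree/neutral/disagree."""
    if rating <= 2:
        return "disagree"
    if rating == 3:
        return "neutral"
    return "agree"

def distribution(ratings):
    """Return the fraction of ratings in each collapsed category."""
    counts = Counter(collapse(r) for r in ratings)
    n = len(ratings)
    return {cat: counts.get(cat, 0) / n
            for cat in ("disagree", "neutral", "agree")}

# Hypothetical ratings for one survey item from a small class:
ratings = [1, 2, 2, 3, 4, 4, 5, 5, 5, 4]
print(distribution(ratings))  # fractions disagreeing, neutral, agreeing
```

A class's distribution computed this way can then be compared item by item with the responses of an expert calibration group.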

The MPEX survey has been iteratively refined and implemented through testing in more than 15 universities and colleges during the last four years. The final version of the survey presented here has 34 items and typically takes twenty to thirty minutes to complete. We report here on the results of the MPEX survey given at six colleges and universities to more than 1500 students. These institutions are listed in Table 1. All students were asked to complete the survey during the first week of the term[17] (semester or quarter) and at the end of the term.

Table 1

Institution | Instructional Characteristics | N
University of Maryland, College Park (UMCP) | traditional lectures, some classes with group-learning tutorials instead of recitation, no lab | 445
University of Minnesota, Minneapolis (UMN) | traditional lectures, with research-designed group-learning problem-solving and labs | 467
Ohio State University, Columbus (OSU) | traditional lectures, research-designed group-learning problem-solving and labs | 445
Dickinson College (DC) | Workshop Physics | 115
a small public liberal arts university (LA) | Workshop Physics | 12
a medium-sized public two-year college (TYC) | traditional | 44

Table 1: Institutions from which first-semester or first-quarter pre- and post-instruction survey data were collected. All data are matched, i.e., all students included in the reported data completed both the pre- and post-instruction surveys.
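The matching described in the table caption, keeping only students who completed both the pre- and post-instruction surveys, can be sketched as follows. The student IDs, responses, and function name are hypothetical, intended only to illustrate the idea of a matched data set.

```python
# Illustrative sketch of "matched" survey data: retain only students who
# completed both the pre- and post-instruction surveys.
# IDs and response lists are hypothetical, not the study's actual data.

def match_surveys(pre: dict, post: dict) -> dict:
    """Given {student_id: responses} dicts for the pre- and post-surveys,
    return {student_id: (pre_responses, post_responses)} for students
    who appear in both."""
    common = sorted(pre.keys() & post.keys())
    return {sid: (pre[sid], post[sid]) for sid in common}

pre = {"s01": [4, 5, 2], "s02": [3, 3, 4], "s03": [5, 4, 1]}
post = {"s02": [4, 2, 4], "s03": [5, 5, 2], "s04": [1, 2, 3]}

matched = match_surveys(pre, post)
print(sorted(matched))  # only students present in both surveys remain
```

Working with matched data in this way ensures that any pre/post shift reflects changes in the same population of students rather than differences between the groups who happened to take each survey.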

In the rest of this section, we describe how we chose the items of the survey and how we validated it.


Notes

  1. Various versions of the survey are available on the WWW.
  2. L. Viennot, "Spontaneous reasoning in elementary dynamics," Eur. J. Sci. Educ. 1, 205-221 (1979); D. E. Trowbridge and L. C. McDermott, "Investigation of student understanding of the concept of velocity in one dimension," Am. J. Phys. 48, 1020-1028 (1980); D. E. Trowbridge and L. C. McDermott, "Investigation of student understanding of the concept of acceleration in one dimension," Am. J. Phys. 49, 242-253 (1981); A. Caramazza, M. McCloskey, and B. Green, "Naive beliefs in 'sophisticated' subjects: Misconceptions about trajectories of objects," Cognition 9, 117-123 (1981).
  3. For a review and references, see L.C. McDermott, "What we teach and what is learned - Closing the gap," Am. J. Phys. 59, 301-315 (1991); Arnold B. Arons, Teaching Introductory Physics, (John Wiley & Sons Inc., New York NY, 1997).
  4. For example, see (a) R. K. Thornton and D. R. Sokoloff, "Learning motion concepts using real-time microcomputer-based laboratory tools," Am. J. Phys. 58, 858-867 (1990); (b) P. S. Shaffer, and L. C. McDermott, "Research as a guide for curriculum development: An example from introductory electricity. Part II: Design of an instructional strategy," Am. J. Phys. 60, 1003-1013 (1992); (c) L. C. McDermott, P. S. Shaffer, and M. D. Somers, "Research as a guide for teaching introductory mechanics: An illustration in the context of the Atwoods's machine," Am. J. Phys 62 46-55 (1994); (d) P. Laws, "Promoting active learning based on physics education research in introductory physics courses," Am. J. Phys. 65, 14-21 (1997); (e) E. F. Redish, J. M. Saul, and R. N. Steinberg, "On the Effectiveness of Active-Engagement Microcomputer-Based Laboratories," Am. J. Phys. 65, 45-54 (1997).
  5. E. F. Redish, "Implications of Cognitive Studies for Teaching Physics", Am. J. Phys. 62, 796-803 (1994).
  6. R. W. Moore and R. L. H. Foy, "The scientific attitude inventory: A revision (SAI II)," J. Res. Sci. Teaching 34, 327-336 (1997); J. Leach, R. Driver, R. Millar, and P. Scott, "A study of progression in learning about 'the nature of science': issues of conceptualisation and methodology," Int. J. Sci. Ed. 19, 147-166 (1997).
  7. S. Carey, R. Evans, M. Honda, E. Jay, and C. Unger, " 'An experiment is when you try it and see if it works': a study of grade 7 students' understanding of the construction of scientific knowledge," Int. J. Sci. Ed. 11, 514-529 (1989).
  8. M. C. Linn, and N. B. Songer, "Cognitive and conceptual change in adolescence," Am. J. of Educ., 379-417 (August, 1991).
  9. N. B. Songer and M. C. Linn, "How do students' views of science influence knowledge integration?", J. Res. Sci. Teaching 28 (9), 761-784 (1991).
  10. A. H. Schoenfeld, "Learning to think mathematically: problem solving, metacognition, and sense-making in mathematics", in Handbook of Research on Mathematics Teaching and Learning, Ed. D. A. Grouws (MacMillan Publishing, 1992), pp. 334-370.
  11. W. G. Perry, Forms of Intellectual and Ethical Development in the College Years (Holt, Rinehart & Winston, NY, 1970).
  12. M. F. Belenky, B. M. Clinchy, N. R. Goldberger, and J. M. Tarule, Women's Ways of Knowing (Basic Books, 1986).
  13. This brief summary is an oversimplification of a complex and sophisticated set of stages proposed in each study.
  14. Perry specifically excludes science as "the place where they do have answers."
  15. F. Reif and J. H. Larkin, "Cognition in scientific and everyday domains: Comparison and learning implications," J. Res. Sci. Teaching 28, 733-760 (1991).
  16. J. S. Brown, A. Collins, and P. Duguid, "Situated cognition and the culture of learning," Educ. Researcher 18 (1), 32-42 (Jan-Feb 1989).
  17. Whenever possible, we have tried to have the survey given as the first item in the class. However, this was not always possible. In the cases where the survey was given after the instructor's description of the class on the first day, there was sometimes a small but noticeable effect on some student responses to particular items.

