Proceedings of the 15th Annual Conference on Research in Undergraduate Mathematics Education
Editors: Stacy Brown, Sean Larsen, Karen Marrongelle ...
Teaching Methods Comparison in a Large Introductory Calculus Class

Warren Code, David Kohler, Costanza Piccolo, Mark MacLean
University of British Columbia

Keywords: Calculus, design experiment, classroom experiment

Abstract
We have implemented a classroom experiment similar to a recent study in Physics (Deslauriers, Schelew, & Wieman, 2011): each of two sections of the same Calculus 1 course at a research-focused university was subjected to an "intervention" week in which a less-experienced instructor encouraged, by design, a much higher level of student engagement. We employed a modified pseudoexperiment structure for our methods comparison with a Calculus 1 student population, with further steps taken to improve validity. Our instructional choices encouraged active learning (answering "clicker" questions, small-group discussions, worksheets) during a significant amount of class time, building on assigned pre-class tasks. The lesson content and the analysis of the assessments were informed by existing research on student learning of mathematics, in particular the APOS framework.
Introduction and Research Questions
Our work is motivated by a demand for empirical study of less-traditional but evidence-based instructional methods for introductory Calculus at the undergraduate level. We gleaned structural ideas from the Physics Education Research (PER) community, though instructional decisions in our study were based on research on students in mathematics, with an attempt to situate our analysis in the Action-Process-Object-Schema (APOS) framework (Dubinsky & McDonald, 2001). Our research questions are not unlike those of Deslauriers et al. (2011):
Question 1: Compared to more traditional lecture-based instruction, will students demonstrate more sophisticated reasoning on an immediate test of learning when high-engagement instruction is implemented for a single topic (100-150 minutes of class time)?
Question 2: Will any effects persist to later, more standard tests of learning in the course?
Theoretical Perspective
Our framework for the pseudoexperimental design follows that of Deslauriers et al. (2011). To our knowledge, and supported by a recent survey article (Speer, Smith III, & Horvath, 2010), no study of this kind has been reported for a college-level mathematics classroom of this size.
Our lesson structures borrowed ideas from Peer Instruction (Crouch & Mazur, 2001) and general principles about learning that are now available (National Research Council, USA, 2000) but are not known to many university mathematics faculty, particularly at research-focused institutions.
The key components of the instructional intervention were:
• Pre-class activities: reading and structured exploration done individually, with some items submitted online for the instructor to review.
• High-engagement class time: group discussion and activities using structured notes and worksheets, driven in part by pre-class results; clicker questions with follow-up discussion among students and/or whole-class discussion directed by the instructor; reactive lecture, with a small portion of the time reserved for (traditional) exposition.
Identical standard exercises were assigned to both sections after the instructional period, similar to previous course years and non-intervention topics. Student exposure in the interventions was thus largely compatible with the Activities, Class, Exercises (ACE) cycle; for previous research on implementation of this cycle, we consulted Weller et al. (2003).
In designing material for the two classroom intervention topics, we considered sources in the literature for APOS-based study of both topics. For the first topic, Related Rates, we considered the work of Engelke (2007) and especially the recent thesis of Tziritas (2011) where a genetic decomposition for related rates problems was performed and tested; our own decomposition is compatible though our data also permit some extension. For the second topic, Linear Approximation, we considered literature on covariational reasoning (Carlson et al., 2002).
Methodology
The setting for our study is a research-focused university, in a multi-section (11 instructors) Calculus 1 course primarily aimed at business majors, though the course shares most core material with the science Calculus 1 courses at the same institution. For our interventions, we chose sections with 150 and 200 students taught by two tenured faculty with strong teaching records in terms of length of experience, student evaluations, and anecdotal department opinion.
Both instructors used "clicker" personal response devices to enhance classroom interactivity, asking 1-2 such questions per hour on average. Otherwise, class time was primarily spent on relatively traditional lecture (concepts introduced at the blackboard, worked examples) with some directed whole-class discussion. Both were receptive to student questions during class.
For our instructional intervention, we employed similar elements as Deslauriers et al. (2011):
• Natural setting of two similar sections in the same course, during the same semester.
• Classroom intervention by an instructor with less experience but recent training in theories of learning and non-lecture pedagogy. In our case, this was a graduate student (the second author) who had taught three courses in total, including this course once.
• Single topic intervention over approximately one week of classes.
We extended the experimental design in the following ways:
• Introducing a "crossover" by applying two single-topic interventions, one to each course section in a different week, to account for differing student populations. We claim that the two topics chosen, Related Rates and Linear Approximation, are relatively independent items in the course; in our context, the former draws on the notion of the derivative as a rate, implicit differentiation, and word problems with geometric objects, while the latter is more closely connected to graphical interpretation and estimation.
• Removing the primary investigator (the first author) further from the classroom intervention: though assisting in the development of the instructional materials, the primary investigator was not the instructor (the second author).
• Having the initial post-tests of learning based on agreed-upon learning objectives but written by someone (the third author) not involved in the instructional design.
• Tracking student performance with respect to the two topics on subsequent course exams.
• Using the Teaching Dimensions Observation Protocol (TDOP) instrument (Hora & Ferrare, 2010), developed as part of an NSF-funded project at multiple institutions of higher education, in which an in-class observer codes instructor behavior and the (expected) cognitive demands upon the students in 5-minute intervals. This has permitted a characterization of classroom activity in the control and experimental sections.
We have established a baseline of student abilities using three instruments, chosen based on their predictive value for course grades in recent years:
• A calculus diagnostic: a 20-minute in-class test of prior calculus knowledge mixing "standard" procedure-based problems and conceptual problems, developed for this project.
• An attitudes survey: online, based on the CLASS Physics survey (Adams et al., 2006), measuring expert-like orientation to the discipline.
• A precalculus quiz: online, based on a local placement exam, found in the previous year to have the same statistical power as high-school mathematics grades in predicting final grades.
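The predictive-power comparison underlying the third instrument can be sketched as follows. This is a minimal illustration only, with entirely synthetic data and hypothetical variable names; it is not the study's analysis, and simply compares the variance in final grades explained by each of two predictors via simple linear regression.

```python
# Hypothetical sketch: comparing the predictive power of two baseline
# measures (a precalculus quiz score vs. a high-school mathematics grade)
# for final course grades, using R^2 (variance explained) from a simple
# linear fit. All data below are synthetic, not from the study.

import random

def r_squared(x, y):
    """Proportion of variance in y explained by a linear fit on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return (sxy ** 2) / (sxx * syy)

random.seed(0)
# Synthetic cohort of 200 students: both predictors track an underlying
# ability signal, so both correlate with the final grade.
hs_grade = [random.gauss(75, 10) for _ in range(200)]
quiz = [g + random.gauss(0, 8) for g in hs_grade]          # quiz score
final = [0.6 * g + random.gauss(30, 6) for g in hs_grade]  # final grade

print(f"R^2 (HS grade -> final): {r_squared(hs_grade, final):.2f}")
print(f"R^2 (quiz     -> final): {r_squared(quiz, final):.2f}")
```

Two predictors would be judged to have "the same statistical power" when their R^2 values (or standardized regression coefficients) for the outcome are comparable; in practice one would also check this with held-out data.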
Figure 1 shows a timeline, including the positions of the common assessments.
Figure 1: Sequence of the pseudoexperiment: instructional interventions (Xn) took place in Week 8 in Section A, Week 11 of Section B; assessments were: attitudes (attn), precalculus quiz and calculus diagnostic (D), quizzes for Related Rates (QRR) and Linear Approximation (QLA), common midterm question on related rates (MTRR) and the common final exam (FE).
Results of the Research
Our attitude and precalculus assessments indicated that the student populations were similar to those of the previous year. On these and on the new calculus diagnostic, the students in both sections achieved similar score distributions. Because of the "crossover", we were not concerned about identical baselines, but these data establish both as typical sections of this course.
The data from our immediate assessments support a positive answer to our first research question, and the follow-up assessment for the Related Rates material supports a positive answer to the second. In particular, for the higher-engagement section in each case, we saw better performance on the conceptual parts of the Related Rates assessments (roughly 5-15% more of the students demonstrated an Action or Process understanding of the various concepts), and a larger number of students able to produce the correct picture for Linear Approximation (66% versus 48% of the class could draw the correct tangent line, while 42% versus 21% could do so and label the relevant points). Performance in the two sections was very close on computational items and on concepts more strongly tied to earlier parts of the course. At the time of writing, data had not yet been collected from the common final exam, which measures both topics.
For Discussion
• Do the enhancements to the similar PER study offer improvement? Are they sufficient?
• How broadly convincing are studies involving week-long interventions by "novices"?
• Recommendations on the scope of reporting for this type of study would be much appreciated: how much detail on the various assessments, instructors, lessons, theory, and results is desirable/feasible?
Acknowledgements
The authors would like to thank Matthew Thomas (University of Arizona) and the conference referees for their extremely valuable feedback.
References
Adams, W., Perkins, K., Podolefsky, N., Dubson, M., Finkelstein, N., & Wieman, C. (2006). New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Physical Review Special Topics - Physics Education Research, 2. doi:10.1103/PhysRevSTPER.2.010101
Carlson, M., Jacobs, S., Coe, E., Larsen, S., & Hsu, E. (2002). Applying covariational reasoning while modeling dynamic events: A framework and a study. Journal for Research in Mathematics Education, 33(5), 352-378.
Crouch, C. H., & Mazur, E. (2001). Peer Instruction: Ten years of experience and results. American Journal of Physics, 69, 970. doi:10.1119/1.1374249
Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(6031), 862-864. doi:10.1126/science.1201783
Dubinsky, E., & McDonald, M. A. (2001). APOS: A constructivist theory of learning in undergraduate mathematics education research. In D. A. Holton (Ed.), The teaching and learning of mathematics at university level: An ICMI study (pp. 273-280). Dordrecht, Netherlands: Kluwer Academic Publishers.
Engelke, N. (2007). Students' understanding of related rates problems in calculus. Doctoral dissertation, Arizona State University.
Hora, M., & Ferrare, J. (2010). The Teaching Dimensions Observation Protocol (TDOP).
Madison, WI: University of Wisconsin-Madison, Wisconsin Center for Education Research.
National Research Council (USA) (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academy Press.
Speer, N. M., Smith III, J. P., & Horvath, A. (2010). Collegiate mathematics teaching: An unexamined practice. The Journal of Mathematical Behavior, 29, 99-114. doi:10.1016/j.jmathb.2010.02.001
Tziritas, M. (2011). APOS theory as a framework to study the conceptual stages of related rates problems. Master's thesis, Concordia University.
Weller, K., Clark, J., Dubinsky, E., Loch, S., McDonald, M. and Merkovsky, R. (2003). Student performance and attitudes in courses based on APOS Theory and the ACE Teaching Cycle, in A. Selden, E. Dubinsky, G. Harel, and F. Hitt (eds.), Research in Collegiate Mathematics Education V, American Mathematical Society, Providence, 97-131.
Keywords: abstract algebra, ring and field theory, developmental research, Realistic Mathematics Education, guided reinvention

Research Problem
The struggles of undergraduate students with their first course in abstract algebra are well-documented (Dubinsky, Dautermann, Leron, & Zazkis, 1994; Hazzan & Leron, 1996;