Proceedings of the 15th Annual Conference on Research in Undergraduate Mathematics Education. Editors: Stacy Brown, Sean Larsen, Karen Marrongelle ...
Subjects/Methods

The first phase of data collection occurred during the Fall 2010 and Summer 2011 semesters at a large, public research university. The participants were second-semester calculus students: 40 in Fall 2010 and 57 in Summer 2011. After learning about applications of integration, and in particular using integration to find volumes of solids, the students were tested on the material, and their written responses were analyzed for common mistakes and misconceptions. Each relevant exam question required that students: (a) set up (and possibly evaluate) an integral representing the volume of a particular solid, (b) sketch the 2-dimensional region being rotated about a line to form the solid, and (c) sketch a typical approximating cylinder on the same graph as the 2-dimensional region.
Currently, we are recruiting volunteers to participate in video-taped, task-based interview sessions. We will examine participants’ written work and identify those whose mistakes fall into the categories that emerge from the phase one data. This subset of participants will be interviewed about their thought processes and problem-solving strategies with respect to their written work, and they will also be asked to complete some non-routine definite integral application problems.
Preliminary Results

Approximately one-fourth of the students were able to correctly construct volume integrals and sketch the corresponding approximating cylinders for each solid. The remaining students made numerous and varied mistakes, reflecting misconceptions in many different aspects of the problem-solving process. The errors most pervasive in student solutions were:
--incorrect variable of integration,
--incorrect bounds of integration,
--incorrect integrand,
--inability to sketch an approximating cylinder,
--inability to connect the integral set-up with the visualization of the solid, and
--failure to understand the ways in which the two “methods” for finding volumes (slicing vs. shell) differ.
No obvious patterns emerged with respect to misconceptions in students’ written work. There were instances of correct sketches with incorrect integral set-ups, instances of incorrect sketches with correct integral set-ups, and, of course, instances where both the sketch and the integral were incorrect.
One problem asked students to find the volume of the same solid in two different ways (via the slicing and the shell methods). Many students chose the same variable of integration for both methods (some choosing x, some choosing y), indicating a lack of appreciation for and understanding of the inherent differences between the two methods.
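To make the contrast between the two methods concrete, consider a hypothetical example of the kind of problem described (the report does not reproduce the actual exam item): the region bounded by $y = x^2$, $y = 0$, and $x = 1$, rotated about the $y$-axis. The two methods require different variables of integration yet yield the same volume:

```latex
% Shell method: integrate in x; the shell at x has radius x, height x^2
V = \int_0^1 2\pi x \cdot x^2 \, dx
  = 2\pi \left[ \frac{x^4}{4} \right]_0^1
  = \frac{\pi}{2}

% Washer (slicing) method: integrate in y; the washer at height y has
% outer radius 1 and inner radius \sqrt{y}
V = \int_0^1 \pi \left( 1^2 - (\sqrt{y})^2 \right) dy
  = \pi \left[ y - \frac{y^2}{2} \right]_0^1
  = \frac{\pi}{2}
```

A student who uses the same variable of integration for both set-ups cannot produce both of these integrals, which is what makes this error diagnostic.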
The broad range of errors and the lack of connection between students’ sketches and integrals indicate that students may be thinking pseudo-analytically while solving these problems. Vinner (1997) states that the “most characteristic feature of the pseudo-analytical behavior is the lack of control procedures” (p. 114). Given the absence of obvious patterns in students’ errors, the video-taped interviews should contribute a great deal to our understanding of the source of their confusion.
Future Questions/Research

After complete analysis of the phase one data, we hope to arrive at a classification and categorization scheme that will guide the subsequent phases of research. We hope to develop non-routine problems for the interview sessions that will help us distinguish analytical from pseudo-analytical behaviors. We also want to investigate any pseudo-conceptual behaviors that students may exhibit when discussing their problem-solving processes and strategies during one-on-one interviews.
At this university, students are introduced to the concept of the definite integral and a few elementary applications, and then move on to techniques of integration. After a full chapter on techniques, they return to the study of definite integral applications, but in more complex physical situations (volume, work, centers of mass, etc.). Since students do not seem to be making the connection between the volume of a small slice of the solid and the integral set-up, we believe it would be advantageous to actively maintain the connection between Riemann sums and definite integrals, rather than interrupt it with a discussion of calculation-heavy integration techniques. Knowing techniques of integration certainly gives students more tools for solving a greater variety of application problems, but these tools are useful only if the student can set up the integral correctly in the first place.
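The pedagogical point about keeping the Riemann sum in view can be illustrated numerically. The sketch below is our illustration, not part of the study: it approximates the volume of the solid formed by rotating $y = x^2$ ($0 \le x \le 1$) about the $y$-axis by summing thin cylindrical shells, and the sum visibly approaches the exact value $\pi/2$ as the shells get thinner.

```python
import math

def shell_riemann_sum(n):
    """Approximate the volume of the solid formed by rotating
    y = x^2 (0 <= x <= 1) about the y-axis, using n cylindrical
    shells of thickness dx = 1/n.  Each shell at x_i contributes
    circumference * height * thickness = 2*pi*x_i * x_i**2 * dx."""
    dx = 1.0 / n
    total = 0.0
    for i in range(n):
        x_i = (i + 0.5) * dx  # midpoint of the i-th subinterval
        total += 2 * math.pi * x_i * x_i**2 * dx
    return total

exact = math.pi / 2  # value of the definite integral
for n in (10, 100, 1000):
    print(n, shell_riemann_sum(n), abs(shell_riemann_sum(n) - exact))
```

The point of the exercise is that the integrand is literally the volume of one thin shell divided by its thickness, which is exactly the connection the exam responses suggest students are losing.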
Questions for Audience
--What are examples of non-routine volume problems that will aid in uncovering students’ underlying misconceptions about the applications of definite integrals?
--Is there computer software that could aid students in the visualization aspect?
--What are the implications of students continuing on through calculus and not truly understanding the definite integral as a limit of the sum of smaller constituent parts of the whole?
(In other words, “So what?”)
References

Bezuidenhout, J., and Olivier, A. (2000). Students’ conceptions of the integral. In Proceedings of the 24th Annual Conference of the International Group for the Psychology of Mathematics Education (Hiroshima, Japan, July 23-27, 2000), 2, 73-80.
Dubinsky, E. (1991). Reflective abstraction in advanced mathematical thinking. In D. Tall (Ed.), Advanced Mathematical Thinking (pp. 231–250). Dordrecht, The Netherlands: Kluwer.
Gonzalez-Martin, A. and Camacho, M. (2004). What is first-year mathematics students’ actual knowledge about improper integrals? International Journal of Mathematical Education in Science and Technology, 35(1), 73-89.
Grundmeier, T., Hansen, J., and Sousa, E. (2006). An exploration of definition and procedural fluency in integral calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 16(2), 178-191.
Huang, C. (2010). Conceptual and procedural abilities of engineering students in integration. In Proceedings of the Joint International IGIP-SEFI Annual Conference (Trnava, Slovakia, Sept. 19-22, 2010).
Orton, A. (1983). Students’ understanding of integration. Educational Studies in Mathematics, 14(1), 1-18.
Piaget, J. (1970). Structuralism. New York: Basic Books, Inc.
Sealey, V. (2006). Definite integrals, Riemann sums, and area under a curve: What is necessary and sufficient? In Proceedings of the 28th Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education, (Merida, Yucatan, Mexico, Nov. 9-12, 2006), 2, 46-53.
Vinner, S. (1997). The pseudo-conceptual and pseudo-analytical thought processes in mathematics learning. Educational Studies in Mathematics, 34(2), 97-129.
Yeatts, F., and Hundhausen, J. (1992). Calculus and physics: Challenges at the interface. American Journal of Physics, 60(8), 716-721.
Abstract: Online homework systems are designed to engage students with course topics while providing immediate feedback. Few studies have indicated a significant difference in student performance between online homework systems and traditional homework. Our study seeks to add to the growing body of research examining the effectiveness of online homework systems by investigating the performance of students taking Intermediate Algebra. This study will compare differences in student exam scores based on their homework medium: WebAssign, ALEKS, or traditional homework. Each instructor participating in the study taught at least one course with each homework system. Preliminary results indicate no significant difference in student learning between students using WebAssign and those using traditional homework. ALEKS data are still being collected; preliminary results suggest students are developing a thorough understanding of specific Intermediate Algebra topics.
Keywords: online homework, classroom research, ALEKS
Online homework systems, such as WebAssign, MyMathLab, WeBWorK, and ALEKS, are designed to engage students with course topics while simultaneously providing immediate feedback. In this respect, they aim to improve on the delayed (or altogether absent) feedback of traditional paper-and-pencil homework. If students have difficulty with a problem, they usually have the option of seeing, either through text or a video clip, an explanation of a similar problem.
Few studies have indicated a significant difference in student performance between online homework systems and traditional homework. In a study using WeBWorK in a calculus class, Hirsch and Weibel (2003) found that final exam grades for students using WeBWorK were 4% higher, on average, than those of their non-WeBWorK peers. Hauk and Segalla (2005) found no significant difference in performance between online and traditional homework sections of students taking college algebra. By designing online software that provides students with detailed feedback on incorrect responses and allows several attempts at each assignment, Zerr (2007) found that student learning in an introductory calculus course improved. Allowing multiple attempts can also be detrimental: not only can scores be quite high (over 85% is common), but students, seeing that they received high marks, might develop a false sense of confidence and prepare less for their exams than their traditional homework counterparts.
ALEKS (Assessment and LEarning in Knowledge Spaces) was not designed purely as an online homework tool like WebAssign; it is based on Knowledge Space Theory (Falmagne et al., 2000). To form a knowledge space, one must first define a set of concepts. For an Intermediate Algebra class, this consists of a list of specific algebraic topics (the software contains almost 400 topics in Intermediate Algebra alone). Based on a student’s performance on an initial assessment, ALEKS determines the subset of topics the student knows and, taking into account prerequisite relations among topics, provides a list of topics the student is ready to learn.
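The “ready to learn” idea can be sketched with a toy prerequisite graph. The topic names and prerequisite links below are invented for illustration, and ALEKS’s actual assessment algorithm is proprietary and far more sophisticated; the sketch only shows the core idea that a topic becomes available once all of its prerequisites are known.

```python
# Toy sketch of the "ready to learn" computation: a topic is ready
# when it is not yet known but all of its prerequisites are known.
# Topic names and prerequisite links are hypothetical examples.
prereqs = {
    "integer arithmetic": [],
    "linear equations": ["integer arithmetic"],
    "factoring": ["integer arithmetic"],
    "quadratic equations": ["linear equations", "factoring"],
    "rational expressions": ["factoring"],
}

def ready_to_learn(known):
    """Return topics the student can attempt next, given the set
    of topics the initial assessment marked as known."""
    return sorted(
        topic for topic, reqs in prereqs.items()
        if topic not in known and all(r in known for r in reqs)
    )

print(ready_to_learn({"integer arithmetic"}))
# -> ['factoring', 'linear equations']
```

As the student masters topics, the known set grows and previously locked topics (e.g. the hypothetical "quadratic equations") become ready, which is why two students in the same classroom can be working on very different material.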
To learn a new topic, students must correctly answer three similar, randomly generated problems in a row. At any time a student can bring up a detailed explanation of the current problem; if a student answers incorrectly three times, they are given an explanation and asked to try again. Because of this design, students spend the most time on the topics they find difficult, and they are always working on topics near their current ability. It also means that a classroom of students using ALEKS may be working on a wide variety of topics at once, which does not fit well within a typical structured class covering only specific topics on specific days.
Research on the integration of ALEKS is growing, though results are still varied. Stillson and Alsup (2003) found an almost 50% increase in the drop and failure rate among Basic Algebra students using ALEKS. However, they believed ALEKS would benefit students more if they took the time to use it, and recommended the course be offered in a classroom setting. Taylor (2008) found that Intermediate Algebra students using ALEKS had a better attitude and felt less anxious toward mathematics than a control group, yet performed as well as students in a lecture-based class. Hagerty and Smith (2005) found that students using ALEKS performed significantly better (both short-term and long-term) in College Algebra than students in the control group. Oshima (2010) argued that ALEKS improves students’ mathematical knowledge and skills as well as their passing rate in College Algebra. He also reasoned that the success of ALEKS was due to how it was integrated into the classroom: ALEKS was not simply an add-on homework tool.
Our study seeks to address which of three homework systems has the greatest impact on student learning (defined as performance on a common final exam) for students taking Intermediate Algebra at a large public university in the Midwest, and to add to the growing body of research examining the effectiveness of online homework systems. Specifically, we compare WebAssign, traditional paper-and-pencil homework, and ALEKS.
Data collection began in Spring 2010, with three instructors each teaching at least one section using WebAssign and one section using paper-and-pencil homework, and will continue through Spring 2012. Each participating instructor used each homework system at least once to aid in the comparison. Before the study began, a comprehensive multiple-choice exam, designed to cover all of the learning objectives of the course, was created using questions from the test bank provided by the publisher. This exam served as both the pre-test and the post-test and was administered to all students participating in the study. For the ALEKS classes, a group of four instructors met and chose the 228 specific topics that would comprise the course.
At the current stage of our study, we have collected three semesters of pre- and post-test data from instructors using both WebAssign and traditional homework. Analysis so far has consisted of t-test comparisons by treatment, and as yet there has been no significant difference in student learning (as defined above) between WebAssign and traditional homework sections.
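For readers unfamiliar with the analysis, the comparison described amounts to a two-sample t-test on section exam scores. The sketch below uses made-up scores and Welch's unequal-variance form of the statistic; the study's exact test variant and data are not reproduced in this report.

```python
import math
from statistics import mean, variance  # variance() is the sample variance (n-1)

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic, which does not assume
    the two groups have equal variances."""
    n_a, n_b = len(sample_a), len(sample_b)
    se = math.sqrt(variance(sample_a) / n_a + variance(sample_b) / n_b)
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical exam scores for a WebAssign section and a traditional section
webassign = [72, 85, 90, 65, 78, 88]
traditional = [70, 82, 91, 60, 75, 86]
print(welch_t(webassign, traditional))
```

A t statistic near zero, as with these illustrative scores, is consistent with the study's preliminary finding of no significant difference between the two homework media.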
We are currently collecting ALEKS data from three instructors. Along with further t-tests, an item analysis is planned once the ALEKS data are included. Following the direction of Stillson, Alsup, and Oshima, we have allowed the ALEKS software to become the classroom. Classes meet in a computer lab and spend the entire class period working on ALEKS topics. Instead of “professing,” the instructor serves as a learning guide, moving about the classroom and helping students individually with the topics they are working on. No additional “homework” is assigned. Preliminary ALEKS data are promising and suggest students are developing a thorough understanding of specific Intermediate Algebra topics.
Questions for audience consideration:
1. What do instructors who have used ALEKS think of the program?