Journal of Statistics Education Volume 15, Number 2 (2007), http://jse.amstat.org/v15n2/jordan.html
Copyright © 2007 by Joy Jordan, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.
Key Words: Computer lab, Data analysis, Knowledge survey, Research application
Lovett begins her article with a summary of cognitive research (relevant to statistics) over the last 30 years. Specifically, she notes that the work in each decade (the 1970s, 1980s, and 1990s) typically relied on a single method of research (theoretical, empirical, and classroom-based, respectively). She argues for a collaborative approach to studying statistical reasoning that is strongly grounded in theory, yet involves research in both the psychology lab and the actual statistics classroom: “A successful route to improving students’ transfer of statistical reasoning skill may rely heavily on integrating instructional and cognitive theory, while maintaining a link to the realities of the classroom” (Lovett, 2001, p. 348).
Lovett’s team at Carnegie Mellon used a collaborative research approach to determine the tasks with which their statistics students struggled and exactly why they struggled. Then they used these results to create an educational intervention. Among other findings, Lovett discovered that students, even after taking a statistics course that included a computer lab component, had difficulty choosing appropriate statistical displays. Lovett found students “were relying on the statistics package as a crutch to get a reasonable analysis on screen … By using the statistics package interface cues, they were able to apply a basic guess-and-test strategy in order to generate analysis” (p. 374). A contributing factor to this student struggle is that often in statistics classes, “students do not actually have many opportunities to make [data analysis] choices on their own; these choices are either made for them explicitly or implicitly” (pp. 378-379). Choices are made for students explicitly if an assignment specifically directs the analysis (e.g., “Create a histogram of variable x and describe the distribution”). Alternatively, choices can be made for students implicitly if only one or two new analysis techniques are introduced per week, which allows students to easily find the answers within the lab instruction. Lovett discusses the importance of purposefully designed scaffolding by which students can learn the process of data analysis. Using this idea, Lovett’s team created a computerized learning program for statistics students.
Although I did not use Carnegie Mellon’s specific computer program in my classroom, these research results resonated with me and generated many questions: When creating my computer lab assignments, do I explicitly and/or implicitly make all choices for my students? Do I provide scaffolding to help the students learn the data analysis process? At the end of the term, what do my students really know about the overall process of data analysis? It was these questions that informed my application of Lovett’s research to my classroom.
As in the past, I used the first three computer lab sessions to build students’ Minitab and data analysis skills. These sessions and the ensuing assignments were definitely directed. During the labs, though, I spoke more deliberately about the process of data analysis—why we choose certain analyses. I realize this is a passive method of learning for students, but it was still a small change aimed at increasing their knowledge of the process of selecting graphical and numerical summaries.
I revised the fourth lab session to include an open-ended analysis of data from an alcohol metabolism case study (Ramsey and Schafer, 2002, pp. 305-306). In this session, students worked on their own to appropriately answer real research questions, using Minitab to aid their analyses. While the students worked, I walked around the room to observe their analyses and answer questions (this is what I typically do and is not a change). This time, though, I initiated interactions with them, asking why they chose a specific graph, particularly if it seemed an inappropriate choice (this is a small change I implemented—more actively engaging with students about their data analysis choices).
I also revised the fourth lab assignment to be more open-ended than previous assignments. I have long realized it is important for students to grapple with open-ended analysis questions, but I have equally struggled with a heavy grading load, where I sometimes spend too much time grading and not enough time with students. (I also realize the positive pedagogical effects of students collecting and analyzing their own data via projects, but within a 10-week period I have found these difficult to successfully implement.) The assignment I created (Appendix A) is a compromise—it does not make choices for the students, but it also is not onerous to grade. It is not a model assignment, but it is a small change I made to challenge my students in a new way.
It quickly became obvious that I was, in fact, challenging the students. There were more questions on this assignment than on any of the previous assignments. Furthermore, there were many more mistakes on this assignment. For example, in the first term in which I implemented this change, one-third of the students, when asked to describe the relationship between two quantitative variables (Appendix A, question 2), either did not include a scatterplot (they included only separate one-variable graphs) or included an inappropriate scatterplot.
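The skill those students missed—pairing a scatterplot with a numerical summary of association—can be illustrated outside Minitab. The sketch below is a minimal Python illustration using hypothetical sugar and rating values (not the actual cereal.MPJ data) to compute Pearson’s correlation, the numerical companion to the scatterplot the question called for.

```python
import math

# Hypothetical values for illustration (not the actual cereal.MPJ data):
# grams of sugar per serving and a consumer rating for five cereals.
sugars = [1.0, 3.0, 6.0, 9.0, 12.0]
ratings = [60.0, 55.0, 45.0, 40.0, 30.0]

def pearson_r(x, y):
    """Pearson correlation: the numerical summary that accompanies
    a scatterplot of two quantitative variables."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

r = pearson_r(sugars, ratings)
print(f"r = {r:.3f}")  # a strong negative association for these values
```

In Minitab the analogous two-step choice is a scatterplot from the Graph menu together with a correlation from the basic statistics commands; the point of the open-ended question is precisely that students must make this choice themselves.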
By using only directed lab assignments in the past, I had incorrectly assumed students could easily transfer their specific learned skills into general data analysis techniques. This experience made it clear to me that I should continue to use at least some open-ended assignments in my class. Furthermore, I need to think more carefully about how to provide appropriate scaffolding for students while they learn the process of data analysis. This scaffolding should not be too directed and specific (e.g., “Create a scatterplot and interpret”), yet neither should it be directionless (e.g., “Analyze these data”). I have not accomplished this instructional goal within my computer lab sessions, but I continue to make small changes each term in hopes of moving closer to an answer.
At a recent conference, I learned about knowledge surveys (Wirth and Perkins, 2006). Knowledge surveys are a different type of assessment tool; they do not require students to actually complete problems, but instead simply ask students about their confidence in solving problems. In a short amount of time these tests can assess student confidence in a large body of material (since students do not take the time to actually solve the problems) with a variety of questions that cover the cognitive domain. Furthermore, scores on knowledge surveys can have a strong positive correlation with actual student performance on exams (Wirth and Perkins, 2006).
I used a knowledge survey to gauge my students’ confidence in data analysis, so I could focus the last two lab sessions in a more informed way. Two examples of my knowledge survey questions are included in Appendix B. To keep things simple, I used Wirth and Perkins’ exact wording for the multiple choice answers. An easy change to make in the future would be to modify these answer choices to my specific course goals.
When analyzing the results of the knowledge survey, I focused on questions that received many “a” answers (“I do not understand the question, am not familiar with the terminology, or am not confident I can answer the question well enough for grading purposes at this time”). I found my students were still uncomfortable with regression diagnostics (e.g., creating and interpreting a residual plot). Based on this information, I took time in the next lab to review regression analysis and also allowed students time to practice on their own. Hence, I found the knowledge survey to be a quick and helpful source of feedback about students’ confidence in their data analysis skills.
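The diagnostics my students flagged center on residuals: fit a line, subtract the fitted values, and look for leftover structure. As a minimal sketch under hypothetical data (these x/y values are illustrative, not from the course), the plain-Python code below computes least-squares estimates and the residuals that a residual plot would display.

```python
# Hypothetical, roughly linear data for illustration only.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]

n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Least-squares slope and intercept.
sxx = sum((a - mx) ** 2 for a in x)
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
slope = sxy / sxx
intercept = my - slope * mx

# Residuals are observed minus fitted values. A residual plot graphs
# them against the fitted values: a patternless cloud supports the
# linear model, while a curve or funnel shape argues against it.
fitted = [intercept + slope * a for a in x]
residuals = [b - f for b, f in zip(y, fitted)]

print(f"yhat = {intercept:.2f} + {slope:.2f} x")
for f, e in zip(fitted, residuals):
    print(f"fitted {f:6.2f}  residual {e:+.3f}")
```

In Minitab the equivalent output comes from the residual-plot options of its regression commands; the code simply makes explicit what those plots are computing.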
I think an essential component of applying research in the classroom is to start small. Small changes are typically easier and less time-consuming to implement (big changes may seem appealing, but often feel so onerous that they never actually get implemented). Furthermore, it is easier to assess the impact of small changes. If research results are implemented bit by bit over time, this can amount to very large and powerful changes in the long run (along a creative path of the instructor’s choosing).
Another positive action statistics teachers can take is to look outside statistics education research to pedagogical research more broadly. As I previously mentioned, there is a rich body of research in the areas of cognitive and educational psychology (for an overview see Bransford, Brown, and Cocking, 2001). There are also journals and conferences on the general scholarship of teaching and learning (e.g., College Teaching, Journal of Excellence in College Teaching). While not specific to statistics, these resources can aid any teacher and can also serve as creative stimulus for a teacher’s own research.
A teaching circle can also be a supportive group with which to discuss application of educational research in the classroom. I recently participated in a teaching circle at Lawrence on the scholarship of teaching and learning. We read new research articles and met bi-weekly to discuss the articles and the possible implications of the research results. This was a very supportive environment where I could obtain feedback on changes I made in the classroom. Furthermore, the group consisted of faculty from different disciplines (e.g., psychology, music theory, library science), which provided me with a fresh outlook on my pedagogical practices. The teaching circles at Lawrence are organized by the director of the Center for Teaching and Learning, but I think it would also be easy for any interested faculty member to create a circle on her own.
In summary, when applying educational research in one’s own classroom, I think it’s important to stay broadly informed, make small changes, and create an effective support network. There is always more to learn about pedagogical practices, which is why teaching is such a gratifying profession.
Appendix A

Consider the data in the cereal.MPJ file (in lab you should have copied this file into your account). This file includes many variables measured on 74 different cereals. (Note that “display shelf” is the shelf on which the cereal was located; 1 indicates the bottom shelf.) If you have any questions about the variables, please ask.
Using Minitab to aid your analysis (don’t do these by hand), answer the following questions:
All these questions should be answered assuming you’ll use Minitab to aid your analysis.
Appendix B

A Minitab data file contains the following information on a sample of 100 college students: sex, student ID number, GPA, number of study hours per week, year in school, age, and high school GPA. Using this data file, answer the following questions. (Circle either a, b, or c for each question.)
Bransford, J.D., Brown, A.L., and Cocking, R.R. (eds.) (2001), How People Learn: Brain, Mind, Experience, and School, National Academy Press.
Carver, S.M. and Klahr, D. (eds.) (2001), Cognition and Instruction: Twenty-Five Years of Progress, Lawrence Erlbaum Associates, Publishers.
Lovett, M. (2001), “A Collaborative Convergence on Studying Reasoning Processes: A Case Study in Statistics,” in Carver and Klahr (eds.), Cognition and Instruction: Twenty-Five Years of Progress (pp. 347-384), Lawrence Erlbaum Associates, Publishers.
Moore, D.S. and McCabe, G.P. (2006), Introduction to the Practice of Statistics, 5th Edition, Freeman and Company.
Ramsey, F.L. and Schafer, D.W. (2002), The Statistical Sleuth: A Course in Methods of Data Analysis, 2nd edition, Duxbury Thomson Learning.
Wirth, K.R., and Perkins, D. (2006), “Knowledge Surveys: An Indispensable Course Design and Assessment Tool,” presentation at Innovations in the Scholarship of Teaching and Learning at Liberal Arts Colleges. (http://www.macalester.edu/geology/wirth/WirthPerkinsKS.pdf)
Joy Jordan
Lawrence University
Department of Mathematics
PO Box 599
Appleton, WI 54912