Michelle Sisto
International University of Monaco
Journal of Statistics Education Volume 17, Number 2 (2009), jse.amstat.org/v17n2/sisto.html
Copyright © 2009 by Michelle Sisto, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.
Key Words: Business statistics; Cooperative learning; Assessment.
Students increasingly need to learn to communicate statistical results clearly and effectively, as well as to become competent consumers of statistical information. These two learning goals are particularly important for business students. In line with reform movements in Statistics Education and the GAISE guidelines, we are working to implement teaching strategies and assessment methods that align instruction and assessment with our learning goals. One of the main instructional tools we use is group projects with elements of data collection and analysis, written and oral presentation, and self, peer and professor assessment. This paper addresses specific challenges encountered while teaching and directing group work in a highly multicultural context of 10 to 20 different nationalities in the same classroom. It also focuses on the learning benefits of having students work collaboratively to discuss, write, present, and assess statistics projects in English.
Teaching business statistics in a small, private, English-language business school in Europe gives me the opportunity to work regularly with a diverse group of students. The student body and the faculty come from all over the world. In the 2007/2008 academic year, there were over 40 nationalities and over 25 first languages among the full-time students. English is a second language for most of the student body.
Undergraduate students take a two-course sequence in Introductory Statistics/Quantitative Methods, and MBA students take one course in Data and Models. Over the last seven years I have integrated group projects into these courses, and both the style of the projects and the manner of assessing them have evolved considerably. In this paper I describe the goals and structure of the projects, the challenges and advantages of directing group work in such a multicultural atmosphere, the variety of assessment methods used and the advantages of each, and the improvement I’ve seen in student learning.
The advantages of using data-driven projects and of working collaboratively in Introductory Statistics courses are well documented (Fillebrown, 1994; MacGillivray, 1998; Chance, 2000; Roseth, Garfield, & Ben-Zvi, 2008). Group projects can deepen students’ understanding of statistics, increase their interest in and appreciation of the usefulness of statistics in making business decisions, and help to align coursework more clearly with the goals of an undergraduate business education. Five main learning goals of our statistics courses are for students to:
We use project work to help students achieve these goals, which are in line with the GAISE College Report guidelines to use real data, emphasize statistical literacy and develop statistical thinking, stress conceptual understanding, foster active learning in the classroom, and use technology for developing concepts and analyzing data (Guidelines for Assessment and Instruction in Statistics Education).
Courses meet twice a week with at least one meeting in the computer lab. All classrooms are equipped with computers and projectors, so computer demonstrations are possible any day. Undergraduate class size is usually between 10 and 25 students. We use Excel and Excel Add-Ins to create graphics, find summary measures, produce various statistical outputs and run simulations. Prior to taking these courses students have passed a course on information technology so they are already familiar with Word, Excel and PowerPoint.
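Since the add-ins also drive in-class simulations, a minimal sketch of one such exercise may help. The sketch below uses Python as a stand-in for the Excel workflow (the population, sample size, and number of replications are all invented for illustration) to show how we demonstrate that sample means cluster around the population mean:

```python
# Minimal sketch of a classroom sampling-distribution simulation.
# Python stands in for the Excel add-in workflow; all numbers are
# illustrative.
import random
import statistics

random.seed(42)  # reproducible demo

# An invented "population" of 10,000 values between 0 and 100.
population = [random.uniform(0, 100) for _ in range(10_000)]

# Draw 1,000 samples of size n = 30 and record each sample mean.
sample_means = [
    statistics.mean(random.sample(population, 30)) for _ in range(1_000)
]

print(f"Population mean:          {statistics.mean(population):6.2f}")
print(f"Mean of sample means:     {statistics.mean(sample_means):6.2f}")
print(f"SD of sample means:       {statistics.stdev(sample_means):6.2f}")
print(f"Theory (sigma / sqrt(n)): {statistics.pstdev(population) / 30 ** 0.5:6.2f}")
```

Students run the Excel equivalent themselves, then compare the simulated standard deviation of the sample means with the theoretical standard error.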
Students complete two group projects per semester. Each project consists of three related reports: a business memo to communicate results to a non-statistician, an appendix with results clearly summarized for a statistician and a PowerPoint presentation for an audience assumed to possess some technical knowledge. Some projects require students to collect data on their own, while others use published data.
The one- to two-page business memo for a non-technical reader explains the method of data collection and source of data, the goals of the study, and the main findings of the project, with references to graphs/tables in the appendix, and concludes with recommendations for action or for further study. Typically students find the memo the most challenging component, as it is the furthest from their comfort zone. They must pare the analysis down to essentials and write in clear, concise English without using technical jargon. In addition, they are unaccustomed to weighing their analysis and making suggestions for action, rather than finding a "right" or "wrong" answer.
The Appendix contains all graphs, summary measures and statistical output. These must be labeled and commented where appropriate for a technical audience. For example, I expect explanations of statistical output such as p-values and regression coefficients. Despite continual reminders, the greatest challenge in this section still tends to be labeling graphs and axes correctly and selecting appropriate graphs. Errors in selecting statistical methods are caught early as the students turn in an "analysis plan" prior to carrying out the analysis so that I may guide them in an appropriate direction if need be.
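To give a concrete sense of the quantities students must explain, here is a minimal sketch, in Python rather than the Excel regression tool the course actually uses, of how the coefficients, R-squared, and the t-statistic behind the slope’s p-value arise in simple regression; the data are invented:

```python
# Illustrative simple regression "by hand": the quantities students must
# explain in the appendix (coefficients, R-squared, and the t-statistic
# behind the slope's p-value).  The data are invented for this sketch.
import math

x = [1, 2, 3, 4, 5, 6, 7, 8]                        # e.g., ad spend
y = [3.1, 4.9, 7.2, 8.8, 11.1, 12.8, 15.2, 16.9]    # e.g., sales

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))

slope = sxy / sxx              # change in y per one-unit change in x
intercept = mean_y - slope * mean_x

residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
sse = sum(r ** 2 for r in residuals)          # unexplained variation
sst = sum((yi - mean_y) ** 2 for yi in y)     # total variation
r_squared = 1 - sse / sst                     # explained share

se_slope = math.sqrt(sse / (n - 2) / sxx)     # standard error of slope
t_stat = slope / se_slope    # compared with t(n - 2) to get the p-value

print(f"y-hat = {intercept:.3f} + {slope:.3f} x")
print(f"R-squared = {r_squared:.3f}, t = {t_stat:.2f} on {n - 2} df")
```

In class, Excel’s regression output reports these values directly; the point of the exercise is that students can say what each number means.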
Students assume a mixed audience for their presentations. They must articulate the goals of the project, the source of data, the main findings, and then make recommendations and discuss any aspect of the project/study that posed challenges or could be improved upon. Speaking time must be divided fairly evenly amongst group members. For many students, but particularly for our Asian students, speaking in front of the class is a new and intimidating experience, compounded by presenting in English, a second or third language for most. As we want all students to develop effective speaking skills, a positive and supportive classroom atmosphere is essential for this component.
On the first day of class, I explain the integral role that project work plays in the course. We go over the syllabus together and discuss the components of projects and the fact that I will ask them to do self and peer assessments. For the undergraduate courses, students work in groups of two or three. They may pick their own groups but no two students may work together more than once. Since no single nationality constitutes more than a quarter of our student body, this setup facilitates working across cultures and languages and leads to greater levels of participation by all group members. For the MBA, groups are assigned over the full term, each with 4-5 students and with no more than two common language speakers per group.
While there are many advantages to such multicultural teams, the "richness of this diversity make group dynamics much more complex" (Schneider & Barsoux, 2003, p. 219). As Hofstede (1997) explains, understanding the differences in thinking among partners is at least as essential as understanding the technical factors in questions of cooperation. Cultural values are largely ingrained during the childhood years; however, it is "common practices, not common values, that solve practical problems" (Hofstede, 1997).
On the first day of class we discuss some of the advantages and challenges of working in groups, as well as strategies to capitalize on the advantages and address the challenges. My goal is to get the groups moving toward "common practices" early on. As Schneider and Barsoux (2003, pp. 219-220) argue, "cultural differences are expressed in different expectations about the purpose of the team and how the team is supposed to operate." By recognizing and explicitly addressing some of the stereotypes and expectations they hold, students are better prepared to begin teamwork. Schneider and Barsoux created a list of questions about task strategies and process strategies that I’ve found extremely helpful in stimulating discussion among team members. The task strategies include questions about creating a sense of purpose, structuring the task, assigning roles and responsibilities, and reaching decisions. The process strategies include questions about team building, choosing how to communicate, eliciting participation, resolving conflict, and evaluating performance. The corresponding cultural determinants of each of these groups of questions are listed as well to promote discussion. While these questions are aimed at multicultural groups, they could benefit virtually any group work, since culture has other layers as well: gender, generation, social class, and so on. Another resource for getting teamwork off to a good start is "Having a Successful Team" (Williamson, Mendel, Tarr, & Yoklic, 2005).
Mutual respect and a comfort level with each other are key ingredients to successful teamwork and carrying out peer critiques. To facilitate getting to know each other and to create a positive and constructive atmosphere in the classroom, we each introduce ourselves the first day, saying something about where we are from, what languages we speak, what our passions are and our experience in working with groups. The introductions may take a half hour, but the time is well spent as students discover peers with common interests and begin to feel comfortable with each other and with speaking to the whole class.
As in most business schools, we have debated the use of statistical packages versus Excel for our courses. While we are aware of many of the shortcomings of Excel for statistical analysis (Cryer, 2001; Nash, 2006), we choose to use Excel with the Data Analysis Add-In and some other textbook-specific Add-Ins for all of our statistics courses, for several reasons. First, all of our students study business, and Excel is commonly used in business for presenting financial and other information and for adding graphics to presentations. We are also confident that our students are more likely to transfer their skills and knowledge from our courses into other courses and the workplace if they learn a tool that is commonly available. Additionally, since many of our students lack confidence in their own mathematical abilities, requiring them to learn a new software package tends to increase anxiety about taking statistics. In a comparative study of Excel 2003 against other statistical packages typically used in introductory courses, Excel 2003 performed well on most tasks (Apigian & Gambill, Winter 2004-2005). Lastly, and importantly in our multicultural context, a hidden advantage of the Data Analysis Add-In is that it is language specific: the output and the menu options appear in the language of the version of Excel the students use. This gives them the opportunity to learn statistical terms in English as well as in their native language, again increasing the likelihood that they will continue to transfer knowledge learned upon returning to their countries of origin. Table 1 below gives a taste of the differences; all results come from running Tools/Data Analysis/Descriptive Statistics.
Table 1: Labels produced by Tools/Data Analysis/Descriptive Statistics in five language versions of Excel

| English | Spanish | Japanese | Swedish | Flemish |
| Mean | Media | 平均 | Medelvärde | Gemiddelde |
| Standard Error | Error típico | 標準誤差 | Standardfel | Standaardfout |
| Median | Mediana | 中央値 (メジアン) | Medianvärde | Mediaan |
| Mode | Moda | 最頻値 (モード) | Typvärde | Modus |
| Standard Deviation | Desviación estándar | 標準偏差 | Standardavvikelse | Standaarddeviatie |
| Sample Variance | Varianza de la muestra | 分散 | Varians | Steekproefvariantie |
| Kurtosis | Curtosis | 尖度 | Toppighet | Kurtosis |
| Skewness | Coeficiente de asimetría | 歪度 | Snedhet | Scheefheid |
| Range | Rango | 範囲 | Variationsvidd | Bereik |
| Minimum | Mínimo | 最小 | Minimum | Minimum |
| Maximum | Máximo | 最大 | Maximum | Maximum |
| Sum | Suma | 合計 | Summa | Som |
| Count | Cuenta | 標本数 | Antal | Aantal |
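For readers curious about what each of these labels computes, the sketch below reproduces the Add-In’s thirteen summary measures; it is written in Python as a stand-in for the Excel tool, with invented data, though the skewness and kurtosis formulas shown are the bias-corrected sample versions that Excel’s SKEW and KURT functions use:

```python
# A Python stand-in for Excel's Tools / Data Analysis / Descriptive
# Statistics output.  The data are invented; SKEW and KURT below use
# Excel's bias-corrected sample formulas.
import statistics

data = [12.0, 15.5, 9.8, 14.2, 11.1, 13.7, 10.4, 16.0]

n = len(data)
mean = statistics.mean(data)
s = statistics.stdev(data)                # sample standard deviation

z3 = sum(((x - mean) / s) ** 3 for x in data)
z4 = sum(((x - mean) / s) ** 4 for x in data)
skew = n / ((n - 1) * (n - 2)) * z3
kurt = (n * (n + 1) / ((n - 1) * (n - 2) * (n - 3)) * z4
        - 3 * (n - 1) ** 2 / ((n - 2) * (n - 3)))

report = {
    "Mean": mean,
    "Standard Error": s / n ** 0.5,
    "Median": statistics.median(data),
    # Excel shows #N/A when no value repeats; we print None instead.
    "Mode": statistics.mode(data) if len(set(data)) < n else None,
    "Standard Deviation": s,
    "Sample Variance": statistics.variance(data),
    "Kurtosis": kurt,
    "Skewness": skew,
    "Range": max(data) - min(data),
    "Minimum": min(data),
    "Maximum": max(data),
    "Sum": sum(data),
    "Count": n,
}
for label, value in report.items():
    print(f"{label:20s} {value}")
```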
Joan Garfield describes the current vision of assessment as "that of a dynamic process that continuously yields information about student progress towards the achievement of learning goals" (Garfield, 2000). She goes on to discuss the evolution of assessment from the assignment of grades to an integral part of the teaching and learning process, and in particular to say that its primary purpose should be to improve student learning.
Garfield recommends that an appropriate assessment framework address, among other dimensions, what to assess, who does the assessing, and the method of assessment.
For the group projects, all aspects of this framework have evolved over the years.
Regarding "what to assess," when a project is assigned, all students receive a grade sheet rubric with several sets of criteria detailing what will be assessed and how value will be assigned for each criteria:
I’ve found that the more specific and clear the criteria are, the higher the quality of the projects. For courses above the introductory level, less guidance may be appropriate in order to mimic a consulting experience, but for introductory courses with students coming from diverse backgrounds and cultures, I prefer this more structured approach. Please refer to Appendices A and B for examples of grade sheets/rubrics I’ve used at the undergraduate and the graduate levels.
To make assessment a purposeful tool to enhance student learning, in addition to assigning a numerical value to each criterion, I include written comments for each, providing clear indications of how to improve on future work. Comments praising what students have done well are also important to encourage good habits in the statistical investigation cycle. Additionally, as many students come with a negative view of their own ability to succeed in a quantitative, mathematical course, a little praise goes a long way in establishing self-confidence. Students also have the option of submitting work at least three days early for review in order to benefit from feedback prior to submitting the final "graded" version.
Over the years, both the "who assesses" and the "method" of assessments have evolved. Originally I was the sole assessor of work using the grading sheet rubric and comments as described above. Currently, each project also includes elements of self and peer assessment.
Garfield states that "engaging the student in self-assessment is a critical and early part of the assessment process, and that no major piece of work should be turned in without self-criticism" (Garfield, 2000). Having received detailed assessment criteria, the students have clear guidelines to use in assessing their own work. Since most groups tend to divide up parts of the projects, these guidelines help them to critically evaluate each other’s work during project preparation.
I stress on the first day that it is more difficult to produce original work than to critique another’s work but that almost any work can be improved by further discussion with colleagues, and I emphasize the need for the group members to complete all work well ahead of time in order that they may provide each other with helpful feedback. For example, once the data analysis stage is complete, one member may write the memo while another creates the PowerPoint presentation. Since all members earn the same grade for the final project, they recognize the benefit in reviewing each other’s work and making constructive suggestions. This group self-assessment process has double benefits, leading them to create work of higher quality and improving their critical reading skills.
On the day projects are due and presentations are made, each student individually fills out a grade sheet rubric for his or her own project. After I’ve graded the projects, I compare my evaluation to the students’ self-evaluations and provide comments where our evaluations differ. I’ve found this process helps students develop more realistic expectations of their "grade." Doug Wittenberg presented a talk at JSM 2006 in which he added points to students’ final exams if their self-predicted grades were within 3 points of his evaluation. I have not tried that in this context, but find it an interesting idea.
When I first began using peer critiques, they took the form of informal discussion after each oral presentation. One year a group of particularly talented and articulate "presenters" gave a presentation with stylish slides, convincing and engaging audience contact, and horrendously flawed statistical reasoning – concluding that the price of a barrel of Brent crude oil would remain between $26 and $30 for the next five years. When I asked the class for feedback, they were overwhelmingly positive. So impressed were they with the style that not a single student had noticed the errors in content. From that day on, I have included explicit written peer assessment of content in the projects as a way to increase critical listening skills. The GAISE guidelines recommend that students should "know how to critique news stories and journal articles that include statistical information," and I try to extend this to statistical information presented in oral presentations.
For each oral presentation, students must evaluate their peers, writing at least two specific positive comments about what went well and two specific comments on how the project presentation could be improved. Two of the four comments must be content related and all must be constructive or regenerative. See Table 2 for examples.
Table 2: Examples of Peer Assessment Comments
What went well?
- Clear analysis of the correlation matrix
- Good idea to show all variables in a single chart with trend line
- Interesting analysis comparing male/female future work/study plans, but expected values less than 5 for chi-squared test
- Clear interpretation of slope for dummy variable fireplace and for lot size

What can be improved?
- Scatter plots should be on same slide in order to compare better
- Serious crimes should be in first column of correlation matrix for better visualization
- Could have done a hypothesis test to show that there is no statistical evidence that the two independent variables used in multiple regression are related
- R-sq almost 1 so it should be a strong line fit, group said it wasn’t
- P value of t-test for "male arrests" is bigger than 0.1 so they should not consider it in final model
- Clarify sampling method and population more
"While not easy to do in any culture, some cultures are more prepared to give and receive feedback than others, making the process of evaluating performance a potential cultural minefield," (Schneider & Barsoux, 2003, p. 239). At the Joint Statistical Meetings 2006, Huizhen Guo presented a moving and frank talk on Balancing Cultural Differences in Teaching Statistics, (Huizhen, 2006). She spoke of her difficulty in evaluating professors as an undergraduate as her culture taught that it was not the place of the student to evaluate professors. On the flip side, she addressed the difficulty she had upon receiving her own evaluations as a professor and the challenges she faces in accepting feedback from the students. In a study examining the relationships between bosses and subordinates (paralleled here by professor and student), Hofstede defines Power Distance as one of the dimensions of national cultures (Hofstede, 1997). The Power Distance Index indicates a relative measure of the level of "dependency" between bosses and subordinates in national cultures, with France, Spain, Asian and African nations showing a high index and the United States, Great Britain and Scandinavian nations showing a low index. The lower the index, the more comfortable subordinates feel disagreeing with bosses and the stronger the preference for a consultative style of decision making. These cultural issues therefore arise in much the same way in peer evaluations, so I must carefully approach the "minefield" of peer evaluations with each new group of students.
To break the ice and to provide practice, I often ask students to anonymously critique my class a few days before their first presentation session. "Exposing" myself to criticism in this way helps the students to accept the idea of critiquing each other. I distribute a summary of all comments in the following session and as a class we develop useful vocabulary for constructive criticism. Students work in pairs to identify comments that are useful, and to rewrite comments that are neither constructive nor regenerative. The entire discussion takes about 15-20 minutes of class time but has long-lasting implications on the quality of comments and the comfort level in critiquing.
For all projects, each group receives an anonymous summary of peer comments. These peer assessments have multiple benefits: students are strongly motivated to receive positive feedback from their peers; they tend to react more positively to critiques from peers than from professors; and by doing the critiques they improve their critical listening skills and develop the kind of evaluation skills that transfer to many other academic and business situations. Additionally, students become actively engaged in all presentations rather than passive listeners. Project and critique quality improve dramatically over the two semesters as students gain confidence and a greater comfort level in giving and receiving constructive criticism. An added bonus is the more specific qualitative feedback I receive in student evaluations at the end of the semester, which helps me to improve the course.
Each individual student also earns from -2 to +2 points for the quality of his/her written critiques. Each student’s final project grade is then calculated as the group grade adjusted by individual critique grade. The motivating effect of these extra points far exceeds their effect on the final grade.
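The arithmetic behind the adjustment is simple; a short sketch with invented students and grades:

```python
# Hypothetical illustration: a student's final project grade is the group
# grade plus his or her individual critique grade (-2 to +2 points).
group_grade = 84                                        # invented group grade
critique_points = {"Ana": +2, "Kenji": 0, "Lars": -1}   # invented students

final_grades = {name: group_grade + pts for name, pts in critique_points.items()}
print(final_grades)   # {'Ana': 86, 'Kenji': 84, 'Lars': 83}
```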
Student response to project work has been overwhelmingly positive. Typical student comments on course evaluations are that they found math interesting and applicable for the first time, improved their ability and confidence in writing and presenting technical information and working in English, and gained valuable experience in working as a team on a common project. There are often comments on the amount of work the course requires; however, over the last five years, more than 90% of numerical responses (on a scale of 1 to 5, with 5 as "strongly agree") have been 4 or 5 to the questions "Do you find the quantity of projects appropriate?" and "Do you find the quality of projects appropriate?" Many students also now use some form of statistical reasoning and analysis in their final thesis projects before graduating, a promising sign of greater interest in and retention of the material learned.
While I have only anecdotal data, since incorporating regular project work, I’ve received positive feedback from client disciplines suggesting students are better prepared for project work in other courses, are using statistics in other courses, and have strong critiquing skills.
One of the greatest challenges for the instructor, however, is that teaching with projects and especially incorporating peer feedback have increased the workload for these courses. For example, completing the peer feedback summaries adds to grading time and, to be useful, this must be done quickly after the presentation. One possible solution in larger classes is to photocopy critique sheets and then cut out the critiques, returning a stapled stack of comments to each group. Getting other colleagues on board and excited about these methods is another challenge. However, as Robert Wardrop argues in his article on using student projects (Wardrop, 2000), "on certain occasions a teacher should opt for a more difficult way to teach statistics." In my experience, the extra effort to teach this way is well rewarded by increased satisfaction from seeing students apply what they learn to new situations and develop skills they can use throughout their academic and professional lives. In addition, I have had the opportunity to learn about a variety of new topics based on my students’ choices of projects.
Appendix A: Example Project Grade Sheet

1. Data Collection
1.1 Select a topic of interest and collect data (either by using a resource already available or by collecting it yourselves). You must have at least 4 independent variables and one dependent variable. (4%)
1.2 Clearly identify the population from which the sample is taken, the method of data collection, the source of the data, and comment on any limitations of the method/data. (4%)
1.3 Comment on the dependent variable and its expected relationship with all independent variables. Show me your data BEFORE beginning the rest of the project. (2%)

2. Simple Regression
2.1 Defend your choice of "best variable" to use in the simple regression model. (3%)
2.2 Create an appropriate graph, write down the model, and interpret model coefficients. (5%)
2.3 Evaluate the model in the context of your data – think in terms of statistical significance of relationship, strength of relationship, and explained variation. (9%)
2.4 Make interval predictions using your model and comment on them. (5%)
2.5 Verify model assumptions. (3%)

3. Multiple Regression
3.1 Discussed choice of variables and showed reasoning about what to include (or not) and why? (5%)
3.2 Evidence of model building and consideration of changes to coefficients as variables were added/dropped? (4%)
3.3 Model coefficients interpreted? (4%)
3.4 Model evaluated? (5%)
3.5 Predictions made and commented on? (4%)
3.6 Model assumptions verified? (3%)

4. Business Memo
4.1 Introduction explaining research goals and source of data. (3%)
4.2 Body of memo summarizing results. (5%)
4.3 Clear conclusion about relationships found and how well the independent variables explain the dependent variable. (2%)
4.4 Discussed any shortcomings of the project, anything you would do differently next time, and possible follow-up studies. (2%)
4.5 Well written (grammar, punctuation, spelling, structure) and clear to the non-statistician. (4%)
4.6 References to appendix? Appendix well laid out and clearly labeled? (4%)

5. Oral Presentation
5.1 Voice projection, audience contact, posture, body language, engaging? (4%)
5.2 Slides: style, spelling, clarity, dynamic? (4%)
5.3 Teamwork evident? Presentation time split amongst members? (4%)
5.4 Purpose and conclusions clear? Choice of "best model" clear? (8%)
Appendix B: Example Project Scoring Rubric (each component scored on a 0-10 scale)

Data collection
- 0-6: Data collection method inappropriate or only vaguely described.
- 7-8: Data collection method explained and appropriate.
- 9-10: Data collection method well explained, appropriate, with any limitations noted.

Data presentation
- 0-6: Several graphs poorly labeled, some graphs inappropriate for goals, graphs on the whole did not add much insight into the issues discussed. Numerical measures inappropriate, or interpretation did not add to understanding of data.
- 7-8: Most graphs clearly labeled and appropriate for type of data and goals; graphs generally aided in understanding of arguments made in write-up. Numerical measures and their interpretations generally correct and add some insight into data.
- 9-10: All graphs clearly labeled and appropriate for goals, showed creativity in creating graphs to make arguments clear, and graphs fully supported write-up. Numerical measures interpreted well and contributed to understanding of data set.

Business write-up
- 0-6: Often superficial or insufficient reference to context; described aspects of graphs rather than the information they displayed; little effort to go beyond context and look at implications for business decisions. Appendix poorly organized or superficially commented.
- 7-8: Generally well written with reference to context; analysis correct; some effort to look at implications. Written at a level accessible to a non-statistician. Appendix referred to in text and clearly laid out.
- 9-10: Clear and well written, with continual reference to context; sound analysis that went beyond the current context to look at implications of information gathered for future business decisions. Written at a level accessible to a non-statistician. Appendix referred to in text, clearly laid out and commented.

Oral presentation
- 0-6: Transitions between team members not smooth, unbalanced participation by members, or poor timekeeping. Slides plain, contain some grammatical or spelling errors, or lack structure. Project goals and conclusions less than clear to the audience.
- 7-8: Some teamwork evident, transitions fairly smooth, and time limit respected. Generally positive audience contact with room for improvement on voice projection or posture. Slides correct and support presentation. Goals and conclusions of project clear to audience.
- 9-10: Teamwork evident with smooth transitions, good timing, and shared participation by all members; excellent audience contact, voice projection, posture; clear, dynamic and correct slides that enhance presentation; purpose, conclusions and limitations of project clear to audience.
Apigian, C., & Gambill, S. (Winter 2004-2005). Is Microsoft Excel 2003 Ready for the Statistics Classroom? Journal of Computer Information Systems, 27-35.
Chance, B. (2000). Experiences with Authentic Assessment Techniques in an Introductory Statistics Course. In T. e. Moore, MAA Notes #52 Teaching Statistics: Resources for Undergraduate Instructors (pp. 209-218). Washington DC: Mathematical Association of American and American Statistical Association.
Cryer, J. (2001). Problems with Using Microsoft Excel for Statistics. Proceedings of the 2001 Joint Statistical Meetings. Alexandria, VA: American Statistical Association.
Fillebrown, S. (1994). Using Projects in an Elementary Statistics Course for Non-Science Majors. Journal of Statistics Education , 2 (2), online.
Garfield, J. (2000). Beyond Testing and Grading: New Ways to Use Assessment to Improve Student Learning. In T. e. Moore, MAA Notes #52 Teaching Statistics (pp. 201-207). Mathematical Association of America and American Statistical Association.
Guidelines for Assessment and Instruction in Statistics Education.(n.d.). Retrieved August Monday, 2008, from jse.amstat.org/Education/gaise/GAISECollege.htm
Hofstede, G. (1997). Culture and Organization: Software of the Mind. New York: McGraw-Hill.
Huizhen, G. (2006). Balancing Cultural Differences in Teaching Statistics. Proceedings of the Joint Statistical Meetings. Alexandria, VA: American Statistical Association.
MacGillivray, H. (1998). Developing and Synthesizing Statistical Skills for Real Situations through Student Projects. ICOTS 5, Israel (pp. 1150-1156). International Association of Statistics Education.
Moore, T. e. (2000). Section 2: Teaching with Data. In ec., MAA Notes #52 Teaching Statistics (p. section 2). Mathematical Association of America and American Statistical Association.
Nash, J. (2006). Spreadsheets in Statistical Practice - Another Look. The American Statistician , 60 (3).
Roseth, Garfield, Ben-Zvi (2008) cited in 2. needs reference
Schneider, S., & Barsoux, J. (2003). Managing Across Cultures, 2nd ed. Pearson Education Limited.
Wardrop, R. (2000). Small Student Projects in an Introductory Statistics Course. In T. e. Moore, MAA Notes #52 Teaching Statistics (pp. 19-25). Mathematical Association of America.
Williamson, D., Mendel, M., Tarr, J., & Yoklic, D. (2005). Student Manual for Essential Mathematics for Business Decisions: Part 1: Probability and Simulation, 2nd ed. Mathematical Association of America.
Michelle Sisto
International University of Monaco
2 Avenue Prince Albert II
Monte Carlo 98000
Principality of Monaco
Phone: +377 93 150 610 (home)
Fax: +377 92 052 030 (University)
msisto@monaco.edu