A Study Comparing Traditional and Hybrid Internet-Based Instruction in Introductory Statistics Classes

Jessica Utts, Barbara Sommer, Curt Acredolo, Michael W. Maher, and Harry R. Matthews
University of California, Davis

Journal of Statistics Education Volume 11, Number 3 (2003), jse.amstat.org/v11n3/utts.html

Copyright © 2003 by Jessica Utts, Barbara Sommer, Curt Acredolo, Michael W. Maher, and Harry R. Matthews, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the authors and advance notification of the editor.


Key Words: Distance learning; Teaching elementary statistics; Web-based instruction.

Abstract

Advances in technology coupled with increasing student enrollments have led some universities to begin offering on-line classes. This paper discusses a study comparing a traditional offering of elementary statistics with a "hybrid" offering. In the hybrid offering the class met once a week, but students were required to learn the material on their own using web-based materials and a textbook. We examined differences in student performance, student satisfaction, and the investment of both student and instructor time. Performance of students in the hybrid offering equaled that of the traditional students, but the hybrid students were slightly less positive in their subjective evaluation of the course.

1. Introduction

As technology advances and student enrollments increase, many universities are exploring the use of web-based instruction. The options range from the use of web-based applications in traditional classrooms to fully online courses in which there is no face-to-face contact. A "hybrid" course is one that combines distance learning via the web with the traditional classroom format.

Few studies have compared online and traditional methods in elementary statistics courses. Hilton and Christensen (2002) evaluated the impact of incorporating multimedia presentations into the traditional lecture format and found that it did not improve student learning or attitudes. Stephenson (2001) used a distance-education format with videotaped PowerPoint presentations to present a statistics course to an industrial audience, and Zhang (2002) used WebCT at Indiana University of Pennsylvania to teach an online course, but neither of these last two authors reported comparisons between the distance and in-class offerings.

Hundreds of studies in other disciplines have compared traditional lectures with distance learning in general and web-based instruction in particular. The finding of no significant difference has been so consistent that a website named "The No Significant Difference Phenomenon" has been established with links to these studies (http://teleeducation.nb.ca/nosignificantdifference).

In Fall Quarter 2001, we compared a traditional offering of elementary statistics with a hybrid offering. This work was part of a grant to our institution from the Andrew W. Mellon Foundation to experiment with Internet-based instruction in large general education courses. The hybrid class relied primarily on Internet instruction but met once a week for a quiz and an overview of the material. We compared student performance and student satisfaction with the two methods. We also measured the initial and ongoing investment of instructor time, as well as the investment of student time in the course.

2. Course Design

The traditional offering of Elementary Statistics is a 10-week course that meets three times a week for a one-hour lecture (200 to 250 students) and once a week in smaller one-hour discussion sections (40 to 60 students). (Classes are actually 50 minutes, but are always discussed as "hour" classes.) The regular course has no computer component. The instructor assigns homework, which is collected weekly. There are three midterm exams and a comprehensive final exam.

The Internet-based course, Elementary StatisticsV (for “virtual”), met once a week for 80 minutes. The first 20 to 30 minutes were used to administer a quiz covering the material assigned for the week. (University policy requires that all exams be given in person, so on-line quizzes were not an option.) In the remaining time the instructor provided an overview of the material to be studied in the coming week, and demonstrated some of the interactive material on the web. In addition to the weekly quizzes, there was one midterm exam and a comprehensive final exam. Homework was due weekly.

To make the courses as comparable as possible, the first author taught both courses in the same quarter (Fall 2001) at the same time of day (3pm). All students used the same textbook. The Internet class additionally used CyberStats, a commercial on-line introductory statistics course. CyberStats contains basic text plus hundreds of interactive applications, interactive practice problems, and self-assessment tests that provide students with immediate feedback. Students in the Internet class were given study assignments each week in both the textbook and CyberStats, with particular attention paid to the interactive applications. Table 1 summarizes the similarities and differences in the two courses.


Table 1. Comparison of course features.

Feature             Traditional Class             Hybrid Class
Quarter offered     Fall 2001                     Fall 2001
Lecture times       MWF 3:10 - 4:00 PM            Tuesdays 3:10 - 4:30 PM
Instructor          Jessica Utts                  Jessica Utts
Final class size*   208                           77
Discussion/Lab      Weekly, ~50 students each     None
Exams               Three midterms + final exam   Weekly quizzes, one midterm, final exam
Homework            Assigned daily, due weekly    Assigned and due weekly
Textbook            Mind on Statistics            Mind on Statistics
Computer use        None                          CyberStats online course

* Students are not allowed to drop classes after the first 10 days of instruction, so final class sizes are equivalent to class sizes after the second week of the quarter.


It was not possible to randomly assign the students to the two courses; they were allowed to self-select. The Internet course was not in the general catalog, so students heard about it from their academic advisors or from attending the first day of one of the traditional Elementary Statistics classes. (There were three traditional classes taught by other instructors that quarter, in addition to the one used for this study.) As the regular classes fill quickly, some of the students who enrolled in the Internet course may have done so because they needed it to complete their Statistics course requirement.

3. Evaluation Methods

The study measured a number of variables in four functional categories: initial comparability of the two groups, student performance, student behaviors and satisfaction, and instructor investment of time. Each category is described in the subsections below.

There were also two focus groups conducted during the quarter, the first with 15 participants and the second with 10. Of the 25 participants, 14 were enrolled in the hybrid class; the other 11 were in the traditional class but had experience with, or wanted to convey their opinions of, on-line or hybrid classes.

3.1 Initial Comparability

We measured a number of potentially confounding variables at the beginning of the quarter: grade point average (GPA), class standing (freshman, sophomore, etc.), and gender. Also, students filled out two inventories. The first was a self-report Student Expectation Inventory (see Table 2). The questions were constructed by the second author based on results of an open-ended query to students in another course with a virtual option. The second was the Basic Technology Competencies for Educators Inventory (BTCEI), a 23-item self-report measure of skill in basic computer operation, word processing, spreadsheets (e.g., Excel), telecommunication, and media communication (Flowers and Algozzine 2000).


Table 2. Student expectation inventory.

Response scale [to be applied to each part of each question]:
5 = Strongly agree, 4 = Agree, 3 = Undecided, 2 = Disagree, 1 = Strongly disagree, N = Not applicable

1. I am taking this course because
   a. it is required for my major or minor
   b. it satisfies a GE requirement
   c. I need the units
   d. of interest in subject matter
   e. it was recommended by a friend
   f. I wanted an online course
   g. it fits my schedule
   h. other
2. I expect to learn a lot in this course
3. I expect this to be a difficult course
4. I expect this course to
   a. improve my critical/analytic thinking skills
   b. expand my knowledge of the topic
   c. arouse my interest in the subject area
   d. improve my computer/Internet skills
   e. help me reach my career goals


3.2 Performance Measures

The pre-test was given on the first day of class and contained 12 multiple-choice questions of the type one would expect on a final exam in introductory statistics. Many students are now learning some basic statistics in high school, so we wanted to see if the two classes had comparable initial knowledge. We also wanted to measure student improvement.

The post-test consisted of a slightly modified version of the same 12 questions embedded in a set of 30 multiple-choice questions on the final examination. (There was also a free-response or essay part of the final exam, but because different teaching assistants graded it, it was not used in the comparison.) Because the two classes took the final exam at different times, pairs of comparable questions were developed for security purposes. One item from each pair was randomly assigned to appear on the final exam for the traditional class, and the other for the hybrid class.
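
For illustration, the following Python sketch shows one way such a randomized split of paired items could be carried out. The item identifiers and the coin-flip mechanism are assumptions for illustration, not the documented procedure.

    import random

    # Hypothetical sketch: one pair of comparable questions for each of the
    # 30 multiple-choice slots; one member of each pair is randomly assigned
    # to each class's exam.
    random.seed(2001)  # fixed seed so the split is reproducible

    question_pairs = [(f"item{i}A", f"item{i}B") for i in range(1, 31)]

    traditional_exam, hybrid_exam = [], []
    for version_a, version_b in question_pairs:
        if random.random() < 0.5:
            traditional_exam.append(version_a)
            hybrid_exam.append(version_b)
        else:
            traditional_exam.append(version_b)
            hybrid_exam.append(version_a)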

3.3 Student Behaviors and Satisfaction

Immediately following the multiple-choice questions on the final exam, we asked students to indicate their primary and secondary sources of learning. For the traditional class the four choices were: the textbook, lectures, discussion sections, and doing homework. For the Internet class the choices were: CyberStats, the textbook, the lectures, and doing homework. We also asked them to indicate the average amount of time spent on this course outside of class per week: 0 to 3 hours, 3 to 6 hours, 6 to 9 hours, 9 to 12 hours, or more than 12 hours.

To measure class satisfaction, anonymous student evaluation forms were given in each class near the end of the quarter, prior to the final exam. One confounding factor for these evaluations is the response rate. They were given in class, where attendance is voluntary. In the Internet class most students were in attendance because of the weekly quiz, while in the traditional class, the evaluations were given on a day scheduled for review and only about 65% of the students were in attendance.

3.4 Instructor Investment of Time

The instructor received a daily email reminder prior to and during the quarter to fill out an online time sheet providing number of hours, to the nearest half-hour, spent on a variety of activities for either or both classes. Items included activities that are likely to remain constant from one quarter to the next, such as "Interacting with students outside of class" and activities that are likely to require less time in subsequent offerings of the course, such as "Planning the course, developing materials, developing lecture content."

4. Results

4.1 Initial Comparability

The two groups were strikingly similar on the preliminary measures, which included class level, mean GPA, gender, and computer competence (Table 3). The mean pre-test scores were also very similar. The 12 multiple-choice questions had five choices each, so random guessing would yield an expected score of 12/5 = 2.4; the mean scores of 4.7 and 4.8 questions correct therefore reflect some prior knowledge, or at least some ability to rule out choices.


Table 3. Class profiles.

Variable                                  Traditional       Hybrid
Class Standing
  Freshman                                51.3% (N = 102)   51.3% (N = 39)
  Sophomore                               33.2% (N = 66)    27.6% (N = 21)
  Junior                                  12.6% (N = 25)    17.1% (N = 13)
  Senior                                  2.5% (N = 5)      1.3% (N = 1)
  University Extension                    0.5% (N = 1)      2.6% (N = 2)
Gender
  Male                                    38.7% (N = 77)    44.8% (N = 34)
  Female                                  61.3% (N = 122)   55.3% (N = 42)
Grade Point Average (GPA) (4 = A)
  Mean                                    2.72              2.71
  S.D.                                    0.62              0.66
  N                                       198               74
Computer Competence (1 = low, 5 = high)
  Mean                                    3.95              4.08
  S.D.                                    0.63              0.60
  N                                       150               57
Pre-test (Total possible = 12)
  Mean                                    4.72              4.84
  S.D.                                    1.90              2.01
  N                                       180               58


With regard to the Student Expectation Inventory in Table 2, the principal differences between the groups were that the students in the hybrid course wanted an online course, took it to improve their computer/Internet skills, and agreed that it fit their schedule. They were less likely to be taking it because it was required for the major, and more likely to agree that it would expand their knowledge of the topic (mean of 4.00 vs. 3.75, range 1-5). Also, their agreement that the course would improve their critical/analytical skills was significantly higher (mean of 4.25 vs. 4.05, range 1-5).

There was little difference between the groups on the overall computer competency (BTCEI) measure (see Table 3). The hybrid group did rate itself significantly higher on the last item “overall media communication skill” (2.96 vs. 2.60, range 1 = never used, to 5 = very competent), but otherwise there were essentially no differences. Thus, the students in the hybrid class were not particularly computer-savvy compared to the general pool of students who take this course.
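
For illustration, item-level group comparisons of this kind can be computed with a two-sample t-test. The sketch below uses simulated Likert responses, since the individual response vectors are not published with the paper, and Welch's form of the test, which does not assume equal variances in the two groups.

    import numpy as np
    from scipy import stats

    # Simulated 1-5 Likert responses for one inventory item (illustrative
    # data only; the actual response vectors are not published).
    rng = np.random.default_rng(3)
    traditional = rng.integers(1, 6, size=199).astype(float)
    hybrid = rng.integers(1, 6, size=76).astype(float)

    # Welch's two-sample t-test (no equal-variance assumption).
    t_stat, p_value = stats.ttest_ind(hybrid, traditional, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")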

4.2 Performance

Performance was almost identical for the two classes (see Table 4). Mean scores on the pre-test, the post-test, and the multiple-choice part of the final exam (of which the post-test made up 12 of the 30 questions) were essentially the same for both classes. For the pre-post comparison, the effect size was almost identical in the two groups, and the t-statistics differed only because the sample sizes were so different.


Table 4. Performance comparison of traditional and hybrid groups.

Variable                 Descriptive Statistic     Traditional   Hybrid
Pre-test                 Mean                      4.72          4.84
(Total possible = 12)    S.D.                      1.90          2.01
                         N                         180           58
Post-test                Mean                      8.64          8.69
(Total possible = 12)    S.D.                      1.88          1.70
                         N                         180           58
Post-test - Pre-test     Mean                      3.93          3.84
                         S.D.                      2.28          2.28
                         t                         23.14         12.86
                         df                        179           57
                         p                         <.001         <.001
                         Effect size (t/sqrt(n))   1.72          1.69
Final examination        Mean                      36.80         36.16
(Total possible = 60)    S.D.                      7.70          6.96
                         N                         199           76
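
To make the computations behind Table 4 concrete, here is a minimal Python sketch of the paired pre/post comparison. The scores are simulated, since the per-student records are not published with the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pre = rng.integers(0, 13, size=180).astype(float)            # pre-test scores
    post = np.clip(pre + rng.normal(3.9, 2.3, size=180), 0, 12)  # post-test scores

    t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test on the gains
    effect_size = t_stat / np.sqrt(len(pre))      # Table 4's t/sqrt(n) effect size
    print(f"t = {t_stat:.2f}, p = {p_value:.3g}, effect size = {effect_size:.2f}")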


Performance by the two groups was also assessed using an Analysis of Covariance. The dependent variable was score on the 30-item multiple choice exam. The independent variable was course format (traditional vs. hybrid). Covariates were grade point average (GPA) and class standing, indicated by number (1 = frosh, 2 = soph, 3 = junior, 4 = senior, 5 = university extension). There was no significant difference in performance between the two groups.
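
A sketch of how such an analysis can be run follows; the column names and simulated data are assumptions for illustration only, not the study's actual dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated student records (illustrative only).
    rng = np.random.default_rng(1)
    n = 275
    df = pd.DataFrame({
        "course": rng.choice(["traditional", "hybrid"], size=n, p=[0.72, 0.28]),
        "gpa": rng.normal(2.72, 0.63, size=n).clip(0, 4),
        "standing": rng.integers(1, 6, size=n),  # 1 = frosh, ..., 5 = extension
    })
    df["exam"] = 20 + 4 * df["gpa"] + rng.normal(0, 6, size=n)

    # Exam score regressed on course format with GPA and class standing as
    # covariates; the C(course) term tests the format difference.
    model = smf.ols("exam ~ C(course) + gpa + standing", data=df).fit()
    print(model.summary())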

Curiously, we found that for the students in the hybrid class, there was a negative relationship between self-reported computer competency score and overall performance in the class as measured by the multiple-choice final (r = -.503). There was no relationship between self-reported computer competency and performance in the traditional group.
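
The relationship itself is an ordinary Pearson correlation; a minimal sketch with simulated vectors (the actual student data are not available) is:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    btcei = rng.normal(4.1, 0.6, size=58)  # self-reported computer competency
    final = rng.normal(36, 7, size=58)     # multiple-choice final-exam score

    r, p = stats.pearsonr(btcei, final)    # the paper reports r = -.503
    print(f"r = {r:.3f}, p = {p:.3f}")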

4.3 Behaviors and Satisfaction

Almost half of the traditional class cited the textbook as their primary learning source, a handful (2.7%) credited the discussion sections, and the remaining students were almost equally split between lectures (25.3%) and homework (22.6%). The most popular combinations of sources (primary/secondary) were Text/Lecture (24.2%), Text/Homework (16.7%) and Lecture/Text (14.0%).

For the hybrid class, almost 80% of the students cited the text (40.5%) or CyberStats (37.9%) as their primary learning source. This is consistent with the philosophy of independent learning for the course. An additional 16.2% cited the homework as primary, and only 4 students (5.4%) cited the lectures, as would be expected with only one lecture a week. The most popular combinations were Text/CyberStats (24.3%, almost identical to the Text/Lecture for the traditional students), CyberStats/Text (16.2%) and CyberStats/Homework (14.9%).

By course schedule, the traditional group spent 4 hours a week in class while the hybrid group spent 1.5 hours. Using the Carnegie work per unit calculation [number of units (4) times 3 = 12 hours per week], the time spent outside of class should have been 8 hours and 10.5 hours per week, respectively. The mode was 3 to 6 hours for both groups (reported by 46% of the traditional class and 29% of the hybrid class). Relative to the traditional class, fewer of the hybrid class spent “0 to 3 hours” (43% vs. 29%) and more of them spent more than 6 hours (11% vs. 18%), reflecting the extra workload they were expected to carry outside of class to compensate for the shorter amount of time they spent in formal lectures and discussion. In retrospect, we should have asked how much time students spent on the class overall, including attending class. For the students in the hybrid class we can find that figure by adding 1.5 hours to the reported time because they were required to come to class to take the quiz. But for the traditional class there was no requirement that would necessitate attendance, so there is no way to know how much time they spent on the course overall.
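
The Carnegie arithmetic can be made explicit with a short sketch that simply restates the figures above.

    # Carnegie rule: a course is expected to demand 3 hours per week per
    # unit, counting time in and out of class; this is a 4-unit course.
    UNITS = 4
    expected_total = UNITS * 3  # 12 hours per week
    for course, in_class in [("traditional", 4.0), ("hybrid", 1.5)]:
        print(f"{course}: {expected_total - in_class} expected hours/week outside class")
    # traditional: 8.0 expected hours/week outside class
    # hybrid: 10.5 expected hours/week outside class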

It is interesting to note that, while admittedly a biased sample, students in the focus groups felt that the hybrid class was much more work, and some of them thought that the workload was excessive. Students in the traditional class did not feel that way, even though the same material was covered and approximately the same number of homework problems was assigned. One even commented "I spent less time with stats than any other class. We did a chapter a week."

The anonymous student evaluations produced significant differences in ratings for the two versions of the course on only five questions - ratings were higher for the traditional course on the first four of these, and higher for the hybrid class for the last one: (1) This course was well organized, (2) I knew what was expected of me in this class; (3) The course material was presented at a good pace; (4) This was a good course; and (5) This course has improved my computer/Internet skills. It is not surprising that the students in the traditional class were more satisfied with the organization, pace and expectations, since they are familiar with taking classes with that format. The remaining questions on student satisfaction with the course and instructor, instructor availability, clarity of presentation, and so on, did not produce significantly different responses.

4.4 Instructor Investment of Time

The overall instructor time spent on each version of the course was almost identical, with a difference of only 3 hours across the entire 11 weeks of the regular quarter and final exam period (slightly higher for the hybrid course, see Table 5). The total time spent for both courses was 229.5 hours, or an average of about 21 hours a week, including the week of final exams.


Table 5. Instructor time by class type and task.

Task                                          Online   Traditional   Joint   Difference:
                                                                             Trad - Online
1. Grant-related formalities                    1.5        0.0         7.5      -1.5
2. Planning the course, developing
   materials and lecture content               19.5       19.5         2.0       0.0
3. Preparing the course for online
   delivery; reviewing materials for
   online delivery                              5.0        0.0         0.0      -5.0
4. Delivering the course in a particular
   term: preparing for and delivering
   lectures, dealing with problems with
   the online and traditional delivery of
   course content to students                  20.0       34.5         0.0      14.5
5. Interacting with students outside of
   class                                       16.0       18.5         5.0       2.5
6. Evaluating student performance:
   preparing and grading examinations,
   grading papers and projects                 34.5       24.0        12.0     -10.5
7. Training and supervising TAs and
   other assistants                             4.5        3.0         0.0      -1.5
8. Other                                        1.5        0.0         1.0      -1.5
Total Time                                    102.5       99.5        27.5      -3.0


Course format accounts for the few differences. For time spent on "delivering the course in a particular term," the traditional course met for a total of 30 hours over 10 weeks (a TA delivered the additional discussion hour each week); the hybrid course met for 15 hours over the 10-week quarter.

The other notable difference was "Evaluating student performance," which included writing exams. The extra time spent in this category for the hybrid class was due to the weekly quizzes. The traditional students were given three exams during the quarter. For both classes, the instructor was assisted in grading by teaching assistants, so the times given here do not reflect the total time spent evaluating student work. Also, an assistant graded homework for both classes.

The traditional class had about three times as many students as the hybrid class, but this difference had little impact on the instructor's time investment. Grading of exams and homework was done by teaching assistants. If this had not been the case, the instructor's investment of time for student evaluation would have been higher for the traditional class. However, this difference would not have been a function of the way the classes were taught; time for grading student work depends on the number of students no matter what teaching method is used. It was fortunate that the differing numbers of students did not cause much difference in instructor time, except for a slight difference in computing final grades at the end of the quarter.

The only time category for which the number of students in the class would make much difference in instructor investment of time is "interacting with students outside of class." In this study, the time investment was similar for the two classes but the content and number of hours per student differed. The traditional students were more likely to attend office hours, while the hybrid group was more likely to send questions by email. As there were more than twice as many students in the traditional class, the similar investment of time may reflect the fact that it is more time consuming to answer questions via email than in person. Notice that the total investment of 18.5 hours in the traditional class represents about 5 minutes per student, while the 16 hours for the hybrid class represents about 12 minutes per student. Of course in both cases there were a small number of students who requested lots of time, and many who requested no time at all outside of class.

The time spent "Planning the course" (second category in Table 5) was also allocated in different ways. Most of the time for the traditional class involved finding good examples and constructing lectures. Much of the time for the hybrid class was spent organizing and coordinating the textbook, CyberStats, and the lecture content. The hybrid course should require less planning time in subsequent offerings, because the coordination task will already have been refined.

5. Discussion and Recommendations

5.1 Lessons Learned

This was the first offering of the hybrid course. Our results show that a hybrid offering has the potential to be a useful alternative to the traditional class. The quantitative measures show that students performed equally well in both classes. The students in the virtual class were slightly less happy with certain aspects of the course, but their feedback has provided valuable insight for improvement.

Although the focus groups were not a representative sample of students from either class, based on the instructor's observations their opinions on certain issues probably did represent the views of their classmates. For example, in the focus group, a student from the virtual class stated:

"I think the book and CyberStats are a good combination because CyberStats just tells you how to do the problem and the book goes deeper into why. The book is like a deep lecture and CyberStats is like a light lecture."

Our original idea about a hybrid class was that the role of the instructor should be to motivate the students and explain the concepts, and that the students could work out the details using the written materials (textbook and online). Therefore, the single weekly lecture explained the concepts for the upcoming week's material, illustrated some of the interactive features in CyberStats (Java applets, data analysis tools, and self-assessment questions with feedback), and provided examples of the week's topics. It became clear from the focus groups and other feedback that the students in the hybrid class wanted more discussion time with the instructor. When asked if there should be a separate discussion class with a teaching assistant for the virtual class, one student responded:

"I don't think the discussion lecture in addition to the weekly lecture is necessary at all. The weekly lectures are VERY good. What we really need is more opportunity to ask questions or do extra problems -- to work on details."

In future offerings the weekly face-to-face time will be more interactive. It is still the case that the face-to-face time should be used to discuss concepts rather than details, but students may be more able to absorb concepts when they have already worked through the details. The instructor should be a motivator and explainer, but the students will have more input into what needs to be explained.

The evaluation specialist who conducted the focus groups concluded that:

"Statistics 13 is well on its way to becoming an excellent on-line offering. The students love CyberStats, and they feel that CyberStats and the book are a good complement. However, they feel that the instructor and the teaching assistants would serve a better (more constructive) role if they could make themselves available to answer questions, preferably to groups of students in a live setting so that students could interact and learn from one another. At these times, the instructor (and teaching assistants) could also help the students understand the underlying concepts and how to interpret findings. CyberStats appears to be quite sufficient in providing instruction on the mechanics of calculating statistics."

As a side note, there are other computer-based teaching tools that may be preferable to web-based tools, especially for students who do not have fast Internet access. Alldredge and Som (2002) conducted a study in a course with a traditional format, but one that used either CyberStats or ActivStats in the laboratory portion of the course. ActivStats is a self-contained CD featuring cartoon animation, applets, and other interactive features. They found that students performed better overall using ActivStats. However, as the authors note, the class sizes were different for the two groups, with the smaller classes using ActivStats, so the comparison may be confounded with class size. Also, their students used the materials in the laboratory, unlike in our study, in which students were required to use CyberStats on their own time, outside of the classroom, without direction.

5.2 Connections with Principles of Learning Statistics

Based on extensive reviews of research and observation, Garfield (1995) developed ten principles of learning statistics. Our experience with the construction and implementation of the virtual course fits well with many of them, and the course design and outcomes are reviewed and compared with the traditional offering in that context. The ten principles, with relevant comments, are:

  1. Students learn by constructing knowledge. The self-instruction required for the virtual class is an excellent tool for helping students construct their own knowledge, with continual guidance provided by the immediate feedback on self-assessment quizzes and practice material in CyberStats. Based on the responses to the question about primary and secondary learning sources, it appears that students in the traditional class also preferred knowledge construction on their own time, since only 25.3% of them cited the lectures as their primary learning source. The web-based materials could be used in conjunction with the traditional course as well, although the time burden of attending lectures, reading the textbook, and using the web-based material may be excessive. Therefore, the virtual class may have an advantage for this learning principle.

  2. Students learn by active involvement in learning activities. The traditional course offered hands-on projects in the weekly discussion sections, an opportunity not afforded to the students in the virtual class. The projects were not simply recipes to follow; they required students to work creatively in teams. (They were selected from the 36 projects offered on the Instructor's Resource Manual CD accompanying the textbook.) CyberStats offers a different, more limited type of involvement, with hundreds of on-line interactivities. A combination of both types of hands-on learning experiences would be ideal.

  3. Students learn to do well only what they practice doing. Ten to twelve homework problems were assigned each week in both classes, so all students had practice. However, CyberStats has practice material accompanying every page or two, for which students receive immediate feedback after submitting answers online, so the students in the virtual course had more opportunity for practice. There is a record of these submissions that the instructor can view, and it became apparent throughout the quarter that although these practice exercises were not required for the class, a majority of the students did use them. There is no way to know whether the students in the traditional course did extra practice exercises.

  4. Teachers should not underestimate the difficulty students have in understanding basic concepts of probability and statistics and

  5. Teachers often overestimate how well their students understand basic concepts. The unfortunate reality is that the class sizes in both offerings did not allow for the kind of interaction between instructor and students that our experience has shown is crucial for overcoming these problems. In the current offering of the hybrid class, each weekly class period begins with a question and answer session on the prior week's work. It is already apparent that this is an excellent technique for helping the instructor sort out misunderstandings and reinforce the correct interpretation of basic concepts.

  6. Learning is enhanced by having students become aware of and confront their misconceptions. One of the nice features of CyberStats is that the self-assessment quizzes display immediate feedback. The questions are multiple choice, and the feedback includes an explanation of why the incorrect choices are incorrect. This feature allows students to confront their misconceptions immediately and to correct their thinking.

  7. Calculators and computers should be used to help students visualize and explore data, not just to follow algorithms to predetermined ends. CyberStats is built on this principle. In the traditional class the students were not required to use the computer, so they had little opportunity to take advantage of this learning tool. To the extent possible, the projects conducted in the discussion sections required them to visualize and explore data with their team.

  8. Students learn better if they receive consistent and helpful feedback on their performance. Again, the practice material in CyberStats allows for this. For each practice problem, students submit their answer, then are shown a pop-up screen with the author's suggested answer for comparison. In both courses, homework, quiz and exam solutions were posted on a class website immediately after they were due, but that's a poor substitute for immediate feedback. Student papers were graded, but due to the class sizes and availability of the solutions, individual comments were rarely provided.

  9. Students learn to value what they know will be assessed. The format of the virtual class made it easier to focus students on material of importance. For each assigned CyberStats unit, students were directed to one or two interactivities or other features as particularly important. The content was reinforced by homework, the weekly lecture and quiz. The format of the traditional class, with three lectures a week and weekly homework, but with only three midterm exams, did not focus as directly on specific material.

  10. Use of the suggested methods of teaching will not ensure that all students will learn the material. Overall, both classes performed equally well on the technical and conceptual questions presented on the final exam, so the method of teaching did not seem to alter how well students learned the material. As Garfield, Hogg, Schau, and Whittinghill (2002) emphasize, there are other important outcomes of an elementary course besides performance, for example using the knowledge after leaving the course, and attitudes and beliefs about statistics. Students in both offerings believed that "This course has expanded my knowledge of the topic" (mean 4.16 traditional and 4.03 hybrid, 5 = strongly agree). This slight difference was not significant.

5.3 Recommendations

Our findings suggest that universities and colleges experiencing enrollment pressure might benefit from offering elementary or introductory statistics courses partially online, with fewer instructor contact hours. Based on our experience and data, we offer the following recommendations for incorporating web-based learning materials:

  1. Students need to interact with a knowledgeable instructor. Student feedback indicates that this interaction should focus on material they have already read and worked with on their own, rather than provide an overview of material before they have covered it.

  2. It is important to provide weekly meetings to keep students on track. Use this face-to-face time to discuss concepts and material students have already covered, and to test them on it. If weekly meetings aren’t possible, require every student to participate in online discussion at least weekly.

  3. Give short weekly quizzes to motivate students to work on the material. Many completely online courses have high attrition rates because students fall behind. We did not have that problem.

  4. Provide a textbook for the course in addition to material online. Students appreciate having something they can read offline. (Other users of CyberStats have commented that students want a print version to supplement the online version. A "print companion" consisting of a printout of the web pages is available.) Textbooks are more adept at providing details and in-depth elaboration. Extensive reading is more difficult on-line than in a book.

  5. Web-based material can be interactive and more engaging than printed text. In the current offering students are encouraged to read the text first, then try the interactivities and immediate feedback practice materials in CyberStats. This combination is working.

  6. Although the investment of instructor time for the two offerings was almost identical, we think that in subsequent offerings the virtual class will provide a savings in instructor time. In the current offering we have added an optional discussion hour on the evening before the class meetings, to help students finish their homework and prepare for the quizzes. This session is conducted by a teaching assistant and reduces the time spent answering questions by email. It takes substantially less time to prepare for one rather than three class meetings a week. This savings is somewhat, but not completely, offset by having to construct weekly quizzes. The savings would not apply to a completely online class, in which answering email may be a significant time demand for the instructor.

  7. The real value of this study is that it has shown that students seem resilient to the method of presentation. Ultimately, learning the material falls to them. They seem to adapt to whatever method is provided to them for accomplishing that goal.

5.4 Follow-up and Conclusions

The first author of this paper had a chance to teach the Elementary StatisticsV course again in Fall 2002. The format was changed slightly to reflect the information learned in this study. Rather than starting the class period with a quiz, the first twenty minutes were devoted to answering questions on the previous week’s material. The next twenty to thirty minutes were used to administer the quiz and the remaining time was used for a brief introduction to the coming week’s material. Unfortunately the question period at the beginning of class became more like a homework help session. Most of the questions were related to the homework due that week, which was to be turned in by the end of class. This structure left very little time for explaining the coming week’s material.

Based on the study and the subsequent year’s experience, it appears that the best use of class time is for review of material the students have already studied during the previous week. In future offerings we will use the class time solely for that purpose and for administering the quiz. Based on our study it appears that students are able to learn the material with the online format, and that subsequent review is an added bonus. However, the role of the instructor is very different from that in the traditional class, and it admittedly feels as if the instructor’s role has been reduced to answering questions rather than providing insight and guidance - a role that is less satisfying for a dedicated educator.

As a final note, this study did not address the use of online or interactive materials such as CyberStats or ActivStats in a traditional class format. That type of course would provide the benefits of online interaction and immediate feedback, while still allowing sufficient class time for discussion and explanation. A potential problem is that students may be overwhelmed by the available resources and would not use any of them to full advantage, so guidance by the instructor would be crucial. Finding the right balance between interactive modes of instruction and the traditional lecture-textbook format will take exploration and experience, and the optimal balance will surely differ for different institutions, instructors, and student audiences.


Acknowledgement

This work was supported by a grant from the Andrew W. Mellon Foundation's Cost Effective Uses of Technology in Teaching (CEUTT) program initiative.


References

Alldredge, J. R., and Som, N. A. (2002), "Comparison of Multimedia Educational Materials Used in an Introductory Statistical Methods Course," Proceedings of the Sixth International Conference on Teaching Statistics, ed. B. Phillips, Voorburg, The Netherlands: International Statistical Institute.

Flowers, C. P., and Algozzine, R. F. (2000), "Development and Validation of Scores on the Basic Technology Competencies for Educators Inventory," Educational and Psychological Measurement, 60, pp. 411-418.

Garfield, J. (1995), "How Students Learn Statistics," International Statistical Review, 63, pp. 25-34.

Garfield, J., Hogg, B., Schau, C., and Whittinghill, D. (2002), "First Courses in Statistical Science: The Status of Educational Reform Efforts," Journal of Statistics Education [Online], 10(2). (jse.amstat.org/v10n2/garfield.html)

Hilton, S. C., and Christensen, H. B. (2002), "Evaluating the Impact of Multimedia Lectures on Student Learning and Attitudes," Proceedings of the Sixth International Conference on Teaching Statistics, ed. B. Phillips, Voorburg, The Netherlands: International Statistical Institute.

Stephenson, W. R. (2001), "Statistics at a Distance," Journal of Statistics Education [Online], 9(3). (jse.amstat.org/v9n3/stephenson.html)

Zhang, J. (2002), "Teaching Statistics On-Line: Our Experiences and Thoughts," Proceedings of the Sixth International Conference on Teaching Statistics, ed. B. Phillips, Voorburg, The Netherlands: International Statistical Institute.


Jessica Utts
Department of Statistics
University of California, Davis
Davis, CA 95616
USA
jmutts@ucdavis.edu

Barbara Sommer
Teaching Resources Center
University of California, Davis
Davis, CA 95616
USA
basommer@ucdavis.edu

Curt Acredolo, deceased
Human and Community Development Department
University of California, Davis
Davis, CA 95616
USA

Michael W. Maher
Graduate School of Management
University of California, Davis
Davis, CA 95616
USA
mwmaher@ucdavis.edu

Harry R. Matthews
School of Medicine (Emeritus)
University of California, Davis
Davis, CA 95616
USA
hrmatthews@ucdavis.edu

