JOURNAL OF EDUCATION FOR BUSINESS, 89: 186–195, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 0883-2323 print / 1940-3356 online DOI: 10.1080/08832323.2013.836470
Business Statistics: A Comparison of Student
Performance in Three Learning Modes
Gerald R. Simmons
Texas A&M University-Central Texas, Killeen, Texas, USA
The purpose of this study was to compare the performance of business statistics sections across three teaching modes and across age groups in terms of course exam scores. The research questions were formulated to determine the performance of the students within each teaching mode, to compare the modes in terms of exam scores, and to compare exam scores by age group. The research hypotheses predicted that there would be differences between the three teaching modes and between the age groups. The results showed significant differences both in the online teaching mode for exam 3 and between age groups for the mean of the three exams.
Keywords: blended, business statistics, face-to-face, Hawkes Learning Systems, online, teaching modes
As we fast approach the close of the first quarter of this century, methods of receiving higher level education continue to change from the traditional mode of university learning in face-to-face (F2F) lectures and labs. Online learning is quickly becoming the new traditional method of learning. Radford (2011) found that between 2000 and 2008 the percentage of undergraduates enrolled in at least one distance education class expanded from 8% to 20%. Additionally, the percentage of undergraduates enrolling in distance education programs increased from 2% to 4% during the same period (Radford, 2011). Of all students, Radford also found that 24% of business students enrolled in distance education courses and 6% enrolled in distance education programs. In terms of enrollment, 2009 figures show that the University of Phoenix, Online Campus, had the highest enrollment of all U.S. postsecondary institutions, with 380,232 students (National Center for Education Statistics, 2011). Kaplan University was second, with 71,011 students, and Arizona State University was third, with 68,064 students (National Center for Education Statistics, 2011). The other two institutions in the top five were Miami Dade College, with 59,120 students, and Ohio State University, Main Campus, with 55,014 students. The University of Minnesota, Twin Cities, was rated ninth; the University of Texas at Austin was rated 10th; Texas A&M University was 12th; and the University of Washington, Seattle Campus, was rated 18th (National Center for Education Statistics, 2011). It is interesting to note that the top two institutions are fully online universities. Additionally, most of the listed traditional universities have either fully online or fully distance degree programs.
Correspondence should be addressed to Gerald R. Simmons, Texas A&M University-Central Texas, Department of Management and Marketing, 1001 Leadership Place, Founders’ Hall, Killeen, TX 76549, USA. E-mail: [email protected]
This regionally accredited university provides its students with upper level (baccalaureate) and graduate programs and offers online, blended, and traditional courses and degree programs. The objective of this study was to expand the current knowledge concerning the perceptions of online learning versus the perceptions of traditional learning; therefore, this study presents a comparison of online, blended, and traditional F2F business statistics sections at a single university. The purpose of this quantitative study was to compare the performance of F2F, blended, and online business statistics sections in terms of course exam scores, controlling for age, gender, previous experience in statistics, and course length, of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university. This study was differentiated from other online, F2F, and blended comparison studies in that the data collected were from sections that were taught using the same textbooks, as in the studies by Dotterweich and Rochelle (2012) and McLaren (2004), but that also administered the same assignments and exams in the same manner, thus reducing possible instructor bias. These details will be explained in the following discussion of the Hawkes Learning System (HLS).
This study differed from that of Dotterweich and Rochelle (2012), whose online students completed online assignments while the F2F students completed traditional assignments from the textbook. It also differed from McLaren (2004), whose F2F and online assignments were presented in different forms and whose exams were administered differently: F2F students could take their exams with a sheet of notes, whereas the online students had access to all notes and textbooks, with the request that they not receive outside assistance (McLaren, 2004). Additionally, this study sought to determine the effects of course length. The university taught the course in eight- and 16-week sessions. The following sections provide a brief background of online learning, the online and F2F pedagogies of the regionally accredited university’s business statistics courses, the methodology, the analysis, and finally the results of the study.
BACKGROUND
Online Learning
The roots of distance education and online learning stemmed from correspondence courses. These courses were taught over a geographic distance, with instructors and students corresponding through the postal system. With the introduction of the Internet and the world wide web in 1989, online and distance education began their expansive growth. It was at this time that institutions such as the University of Phoenix established their online programs (University of Phoenix, 2012). The advantage of the Internet and the world wide web was the quick transmission and correspondence between the instructors and the students. Online learning grew from email transmissions, to newsgroups, to online learning management systems (e.g., Blackboard, Moodle, Desire2Learn).
Online Versus Blended Versus Traditional Learning
Online learning consists of students receiving instruction, conducting research, completing assignments, quizzes, and exams, and interacting with the instructor in a format that uses the world wide web as the basis for a learning management system. This interaction can be either in a synchronous mode or in an asynchronous mode (Mills & Raju, 2011; Summers, Waigandt, & Whittaker, 2005; Tallent-Runnels et al., 2006). Blended learning courses consist of traditional F2F lecture periods coupled with online components such as group or team study, assignments, quizzes, and exams (Mills & Raju, 2011). Traditional learning consists of F2F interactions according to scheduled periods, where students receive information (or lectures) in a classroom.
The idea that traditional learners perform better than online learners has pervaded the thoughts of professionals and academicians since the advent of distance learning. This idea became more prevalent with the growing numbers of accredited online universities, their growing populations of students, and the online programs within traditional universities, spurring many studies comparing the effects of online learning and traditional learning (DeNeui & Dodge, 2006; Harrington, 1999; Mills & Raju, 2011; Summers et al., 2005; York, 2008). Comparison studies sought to determine whether there were true differences in student learning between online and F2F (Bernard et al., 2004; Harrington, 1999; Summers et al., 2005), blended and online (Arbaugh, Desai, Rau, & Sridhar, 2010), and all three (Ashby, Sadera, & McNary, 2011; York, 2008). The results of these studies included a poorer performance in the traditional classroom (Ashby et al., 2011) as well as no significant difference among the three modes of instruction (Harrington, 1999; Summers et al., 2005; York, 2008). All researchers recommended further study, especially with the expected increase in online learning.
Business Statistics and Hawkes Learning System
Technology has always taken a role in the education of students, beginning with the abacus and continuing on to the laptop and notebook computers of today. Without the use of technology, courses such as statistics became the dread of many students whose intent was to learn only the basics, not to actually major in the subject.
Technologies used in many introductory statistics courses, regardless of teaching mode, primarily consisted of some form of computational software. These software packages included the Texas Instruments 84 (Dallas, TX) family of pocket calculators, Microsoft Excel (Redmond, WA; and its add-in software for data analysis), the Statistical Package for the Social Sciences, Minitab (State College, PA), and Stata (StataCorp LP, College Station, TX) (El Hajjar, 2011; Gomez, 2010; Hamadu, Adeleke, & Ehie, 2011; Mills & Raju, 2011; Spinelli, 2001). A few studies used some form of computer-managed learning tool, such as The Learning Manager (WIN, Kingston, TN) (Cybinski & Selvanathan, 2005) or STATLAB (Yale University, New Haven, CT) (Mills & Raju, 2011). Other learning tools included the Hawkes Learning System, which was used at this university.
Hawkes Learning System. HLS is a mastery-based software package that is uploaded onto students’ personal computers (Hawkes Learning Systems, 2012). The software is an automatic homework and testing system and forms the basis from which each student learns, practices, and masters concepts within the topic being studied (HLS). The assignments are structured such that students must achieve an 80% mastery level before they are certified in a particular topic area. HLS provides computational problems, for which students must conduct the required calculations outside of the program (using pocket calculators, Microsoft Excel, or statistical software). The exams, which are created by the instructor by selecting question types from a database of questions, will never be the same for any two students because the problems within a question set are algorithmically generated.
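To illustrate the mastery-based, algorithmically generated design described above (this is not HLS's actual implementation), the minimal sketch below shows how a question template can draw fresh numbers on each call and how an 80% certification threshold can be applied. The template, parameter ranges, and function names are assumptions for illustration only.

```python
import random

MASTERY_THRESHOLD = 0.80  # the 80% mastery level described in the article


def generate_mean_question(rng: random.Random) -> dict:
    """Hypothetical algorithmically generated question: every call draws a new
    data set, so no two students see identical numbers."""
    data = [rng.randint(10, 99) for _ in range(rng.randint(5, 8))]
    return {"prompt": f"Compute the sample mean of {data}.",
            "answer": round(sum(data) / len(data), 2)}


def is_certified(correct: int, attempted: int) -> bool:
    """Certify a topic only once the student reaches the mastery threshold."""
    return attempted > 0 and correct / attempted >= MASTERY_THRESHOLD


rng = random.Random(42)
print(generate_mean_question(rng))            # a fresh problem instance
print(is_certified(correct=8, attempted=10))  # True at exactly 80%
```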
HLS validity. HLS has not conducted formal investigations into the validity and reliability of its exams. According to R. Hendrix (personal communication, June 5, 2013), of Hawkes Learning Systems, all questions used for either certifications or instructor-developed exams are drawn from the same database of questions and have at least content validity.
Business Statistics Pedagogies
The School of Business teaches undergraduate business statistics at the junior level. Students taking the course, regardless of teaching mode, receive instruction in descriptive statistics and in the foundations of inferential statistics (Texas A&M University-Central Texas, 2012). Each section of the business statistics course is offered in one of three modes: F2F, blended, and online. All modes use Blackboard 9.2 (Washington, DC) as the online component interface. Additionally, all modes use the HLS as the method of teaching the foundations. Course length is either eight or 16 weeks. F2F sections taught on the main campus are 16 weeks. Generally, all online sections are either eight or 16 weeks, and all blended sections and any section taught in the summer are eight weeks. The blended courses are eight weeks because they are taught at the education center on a local military base, where a contractual agreement requires that any course taught on base conform to an eight-week course length. The F2F sections receive traditional classroom lectures; the blended sections download and listen to recorded lectures and then attend question-and-answer sessions during class time; and the online sections listen to recorded lectures and participate in discussion threads.
The military base section is not a military-only section. The course is offered on the course schedule with the other sections, and all students, regardless of military affiliation, are free to register. Soldiers are not restricted to taking only the military base section and may register for any section. Therefore, all sections, regardless of delivery mode and location, have a homogeneous makeup of the university’s population. The summer F2F sections may be taught either in blended or traditional format, depending on the instructor’s preference. Regardless of the length of the section, all students receive the full course material as outlined in the master syllabi. Exams are administered to all students, regardless of teaching mode or location, in the same manner, using the HLS. Because the instructor develops the exams by selecting question sets (not individual questions), each exam tests the same concepts.
Study Limitations and Delimitation
A limitation and a delimitation of this study were identified. First, the 8-week course length is a study limitation because students are required to learn the course material in half the time of the traditional 16-week term. Learning might be affected by increased stress due to the intensive course schedule. Next, a delimitation of the study is that only exams were used to measure progress and learning performance. Students were required to complete only the HLS assignments and exams. The HLS assignments could not be used as a performance measure because the only grade recorded indicated whether the student achieved the 80% mastery level. Upon achieving this mastery, the student received full points for the assignment; partial points were granted based on the lateness of the submission of the assignment.
Research Questions
In achieving the comparison of the performance in terms of the three teaching modes, the following questions were developed:
1. What was student performance within each teaching mode in terms of course grades of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university?
2. How did the performance of F2F, blended, and online business statistics sections compare, in terms of course exam scores, controlling for age, gender, experience with statistics, and course length of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university?
3. How did the performance by age group compare in terms of the mean exam scores of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university?
The hypotheses that were tested were based on the second and third research questions:
Hypothesis 1A (H1A): There would be a difference in the comparison of the performance of F2F, blended, and online business statistics sections in terms of course exam scores, controlling for age, gender, experience with statistics, and course length, of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university.
H2A: There would be a difference in the comparison of the performance by age group, in terms of the mean exam scores, of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university.
METHOD
Multivariate analysis of covariance (MANCOVA) was selected as the method to achieve the comparison of the modes. MANCOVA was an appropriate method of analyzing the data because the dependent variable, performance, was subdivided into the three exams (Mertler & Vannatta, 2010). The remainder of this section describes the variables used, population and sampling, data collection, and analysis methodology.
Variables
The dependent variable under test in this study was performance, in terms of the first three exams of each student in each course section. The first exam score measured the performance of each student in understanding descriptive statistics and general probability principles. Descriptive statistics included frequency distributions, measures of location (central tendency), and measures of dispersion (variance) as described in Hawkes and Marsh (2005). The second exam score measured the performance of each student in understanding discrete distributions, including the binomial distribution, sampling, and sampling distributions. The third exam score measured the performance of each student in understanding the estimation of means and proportions, hypothesis testing, and comparisons of population means and population proportions. There was a fourth exam, which measured the understanding of comparisons of three or more populations and the relationships of two or more continuous or categorical variables; however, this course module was only recently instituted and only a few of the course sections taught it; therefore, it was not selected for inclusion in the study. The independent variable was mode, the teaching mode: F2F, blended, or online. The covariates were age; gender, the sex of each student (male or female); previous statistics experience (yes or no), defined as an undergraduate 100- or 200-level statistics course; and course length, either long (16 weeks) or short (8 weeks). Grades were defined as the traditional letter grades and were assigned as A ≥ 90%, 80% ≤ B ≤ 89.99%, 70% ≤ C ≤ 79.99%, 60% ≤ D ≤ 69.99%, and F < 60%. For the purpose of this study, the course grades were calculated based on the mean score of the three exams.
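Because the course grade was defined as the mean of the three exam scores mapped onto the cutoffs above, that mapping can be written directly. The sketch below assumes exam scores expressed as percentages; the function name is illustrative and not taken from the author’s materials.

```python
def course_grade(exam1: float, exam2: float, exam3: float) -> str:
    """Map the mean of the three exam scores (as percentages) to a letter
    grade using the cutoffs stated above."""
    mean_score = (exam1 + exam2 + exam3) / 3
    if mean_score >= 90:
        return "A"
    if mean_score >= 80:
        return "B"
    if mean_score >= 70:
        return "C"
    if mean_score >= 60:
        return "D"
    return "F"


print(course_grade(92.0, 85.5, 88.0))  # "B" (mean = 88.5)
```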
Population and Data Collection
The population in this study consisted of 13 total sections of business statistics offered from the fall semester of 2010 through the summer semester of 2012. All sections were taught by this researcher. Because all scores were available to the researcher and could be easily manipulated in statistical software, all population units were used in the study. Data excluded from this study were those from students who had missing exam scores. Therefore, the total population of exam scores (exams 1, 2, and 3) for all students within the selected sections was 440. The population was then divided into three subpopulations (by teaching mode) named F2F (face to face, or traditional), blended, and online. The sizes of the subpopulations were 99, 122, and 219, respectively. All exam scores were collected from the repository grade books in the instructor’s files. The students who made up the population ranged in age from 20 to 62 years old. They were military members (soldiers, airmen, or Marines), Department of Defense employees, contractors for the Department of Defense, family members of the military (either spouses or children), or nonmilitary-affiliated civilians. Additionally, some members of the military were deployed overseas in either combative or noncombative theaters of operation.
Analysis Methodology
The data collected from the grade books were cleaned and entered into Minitab statistical software. Because the data included three response variables, a predictor variable, and four covariates, a MANCOVA was selected as the statistical method (Mertler & Vannatta, 2010; Ott & Longnecker, 2010). All hypothesis tests were conducted with a significance level of α = .05. The types of analysis included descriptive statistics for each subpopulation (means, standard deviations, and variances), main effects plots, and an ANOVA for the differences found, including Bonferroni confidence intervals for comparing the significant differences, as outlined in Ott and Longnecker (2010).
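The analysis was carried out in Minitab; a roughly equivalent MANCOVA could be set up in Python with statsmodels, as sketched below. The data file and column names (Exam1, Mode, Age, Gender, PrevStats, Length) are assumptions for illustration, not the author’s actual variable names.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical data: one row per student with exam scores, teaching mode,
# and the four covariates (file and column names are assumed)
df = pd.read_csv("exam_scores.csv")

# Three exam scores form the multivariate response; mode is the factor;
# age, gender, prior statistics experience, and course length are covariates
mancova = MANOVA.from_formula(
    "Exam1 + Exam2 + Exam3 ~ C(Mode) + Age + C(Gender) + C(PrevStats) + C(Length)",
    data=df,
)
print(mancova.mv_test())  # reports Pillai's trace, Wilks' lambda, etc.
```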
ANALYSIS
The following analysis was conducted using Minitab software and is presented below grouped into descriptive statistics, MANCOVA tests, and comparison tests. All tests were conducted according to the theories and concepts presented by Mertler and Vannatta (2010) and Ott and Longnecker (2010). These theories and concepts are also the basis from which the tests could be analyzed in the Minitab software.
Descriptive Statistics
As shown in Table 1, the means between the modes for exam 1 and exam 2 seem to be comparable; however, there seems to be a difference in exam 3. The online exam 3 mean is much lower than either the blended or the F2F mean. The standard deviations show some variability between modes within each exam. In exam 1, online seemed to have the lowest variability, and F2F seemed to have the lowest variability in exams 2 and 3. Figure 1 shows the variability within each exam. The blended and online modes within exam 1 do show several extreme outliers, and there are mild outliers within the other exams. A test for univariate normality revealed that none of the exam scores could be considered to come from a normal distribution (p < .005 for each exam). Levene’s test for equal variances provided a p value of .413 for exam 1, a p value of .079 for exam 2, and a p value of .006 for exam 3; therefore, there was a failure to reject the equality of the variances in exams 1 and 2, but a rejection of the equality of variance in exam 3.
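The normality and equal-variance checks reported above can be reproduced in outline with scipy, as in the sketch below. The article does not name the specific normality test, so Anderson-Darling is shown as one common choice, and the arrays are simulated placeholders (roughly matching Table 1) rather than the study data.

```python
import numpy as np
from scipy import stats

# Simulated placeholder scores for exam 3 by mode; not the study data
rng = np.random.default_rng(0)
blended = rng.normal(22.0, 6.7, 122)
f2f = rng.normal(22.7, 5.4, 99)
online = rng.normal(18.9, 6.8, 219)

# Univariate normality check (Anderson-Darling shown as one common choice)
for name, scores in [("blended", blended), ("F2F", f2f), ("online", online)]:
    result = stats.anderson(scores, dist="norm")
    print(name, "A-D statistic:", round(result.statistic, 3))

# Levene's test for equality of variances across the three modes
stat, p_value = stats.levene(blended, f2f, online)
print("Levene p value:", round(p_value, 3))
```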
Figure 2 displays the scores by letter grade within each mode for each exam.
TABLE 1
Descriptive Statistics: Performance by Teaching Mode
Variable Mode Total count M SD Variance IQR
Exam 1 Blended 122 24.651 4.697 22.058 5.074
F2F 99 24.121 4.231 17.905 5.001
Online 219 25.157 3.803 14.461 4.341
Exam 2 Blended 122 20.698 6.719 45.151 11.939
F2F 99 21.318 5.578 31.115 9.672
Online 219 20.280 5.747 33.023 9.363
Exam 3 Blended 122 22.024 6.680 44.617 11.306
F2F 99 22.745 5.368 28.810 7.140
Online 219 18.921 6.779 45.960 10.161
Note: F2F = face to face; IQR = interquartile range.
The line plots in Figure 2 show that students performed better in exams 1 and 2, in terms of those scoring Ds and Fs. In exam 3, the students scoring Ds and Fs scored lower than in exams 1 and 2, with the students in the online mode scoring the lowest.
Table 2 shows the distribution of letter grades across the different teaching modes.
Multivariate Analysis of Covariance
In conducting the MANCOVA, performance (for exam 3) was statistically significant when controlling for age, gender, experience with statistics, and course length (Pillai’s trace = .108), F(6, 864) = 8.211, p < .000. Age (p = .015) was retained as the only covariate in the model for the conduct of the ANCOVA to determine the actual Bonferroni differences within exam 3. The other covariates, gender, previous statistics experience, and course length (p = .517, .244, and .970, respectively), were found to be not significant.
Through the ANCOVA, in which age was controlled, differences were found between the online mode and both the blended and F2F modes, but not between the blended and F2F modes. The Bonferroni confidence intervals were blended–online, t = –4.630, p < .000 (–5.190, –1.643), and F2F–online, t = –5.138, p < .000 (–5.903, –2.141). The main effects plot in Figure 3 graphically depicts the differences.
FIGURE 1 Boxplot of exam mean by teaching mode.
FIGURE 2 Line plot of means (exams 1, 2, 3 vs. grade) (color figure available online).
Age
Age was also found to be statistically significant in the MANCOVA. Figure 4 is a histogram of the distribution of the ages within the courses. As shown in the histogram, the distribution of ages is typical of a nontraditional university, with a mean age of approximately 35 years (from Table 3). The ages ranged from 20 to 62 years.
Table 3 shows the descriptive statistics for the age distribution. The age groups in Table 3 were based on the age groups used by the National Center for Higher Education Management Systems in their ongoing research of higher education (National Center for Higher Education Management Systems Information Center for Higher Education Policymaking and Analysis, 2013). The table shows the mean age of the students in these courses was 34 years, which falls within the largest age group (25–34 years old). Approximately 54% of this age group took the business statistics course online.
TABLE 2
Letter Grade Distribution by Teaching Mode
Grade Blended F2F Online
A 34% 24% 9%
B 16% 27% 16%
C 14% 17% 22%
D 10% 12% 11%
F 26% 19% 42%
Total 100% 100% 100%
Note: F2F = face to face.
The 45–64 years old age group represented approximately 15% of the population. This age group was represented in the three modes of instruction, blended (42%), F2F (27%), and online (31%). Within each mode, the age group also had the lowest numbers: blended (22%), F2F (17%), and online (9%). It is important to note the 45–64 years old age group, because this group had the lowest performance, in terms of exam mean scores as shown in Figure 5.
Figure 5 shows that the 45–64 years old group had the lowest average test score in each learning mode except the online mode, where the 25–34 years old group scored highest and the 45–64 years old group scored second highest, albeit with low scores. In all age groups except the 25–34 years old group, students scored highest, within their age groups, in the F2F learning mode.
In conducting a comparison of mean exam scores by age group, a Kruskal-Wallis test was performed because the mean exam score was determined to be nonnormal (p = .005). A difference was found between the age groups using the Kruskal-Wallis test, H(3) = 10.64, p = .014. In the pairwise comparisons, the 25–34 years old age group was significantly different from the 45–64 years old age group (p = .005) and from the 35–44 years old age group (p = .029).
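The Kruskal-Wallis comparison of mean exam scores across age groups, with pairwise follow-up tests, could be reproduced in outline as follows. The data frame and column names are assumptions, and Mann-Whitney U is used here as one possible pairwise follow-up; the article does not specify which pairwise procedure was applied.

```python
from itertools import combinations

import pandas as pd
from scipy import stats

df = pd.read_csv("exam_scores.csv")  # hypothetical file with an AgeGroup column
df["MeanExam"] = df[["Exam1", "Exam2", "Exam3"]].mean(axis=1)

# Kruskal-Wallis test of mean exam score across the four age groups
groups = {name: g["MeanExam"].to_numpy() for name, g in df.groupby("AgeGroup")}
h_stat, p_value = stats.kruskal(*groups.values())
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")

# Pairwise follow-up comparisons between age groups (Mann-Whitney U)
for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
    u_stat, p = stats.mannwhitneyu(a, b)
    print(f"{name_a} vs {name_b}: p = {p:.3f}")
```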
RESULTS
From the analysis (Table 1 and Figure 2), the blended and F2F sections scored lowest on exam 2, on average, and had an overall negative scoring trend, on average, as the sections progressed through the course.
FIGURE 3 Main effects plot for exam 3.
FIGURE 4 Age distribution of the population (M = 34.61, SD = 8.562, N = 440) (color figure available online).
TABLE 3
Age Distribution
Blended F2F Online
Age group (years) Count M age (years) SD n % n % n %
18–24 36 23.111 1.090 7 6 5 5 24 11
25–34 207 29.121 2.884 49 40 46 47 112 51
35–44 133 38.992 2.922 39 32 31 31 63 29
45–64 64 49.734 4.195 27 22 17 17 20 9
Total 440 34.611 8.562 122 99 219
Note: F2F = face to face.
The online sections had a continuous negative scoring trend, on average, as they progressed through the course, scoring the lowest of the three modes, on average, on exam 3. The first research question asked about student performance within each teaching mode in terms of course grades. From Table 2, 50% of the students in the blended sections earned As and Bs, whereas 36% earned Ds and Fs. Within the F2F sections, 51% of the students earned As and Bs, and 31% earned Ds and Fs. However, only 25% of the students in the online sections earned As and Bs, and 53% earned Ds and Fs.
The second research question asked how the performance of the F2F, blended, and online business statistics sections compared, in terms of course exam scores, controlling for age, gender, experience with statistics, and course length. As noted in the analysis section, statistical differences between the blended and F2F sections were not found. However, the linear combination between mode and exam 3, while controlling for age, was statistically significant. Additionally, significant differences were found between both the blended and F2F modes and the online mode. Based on this difference, the null hypothesis was rejected.
The third research question asked how the performance by age group compared in terms of the mean exam scores. In the analysis, the 35–44 and 45–64 years old age groups were significantly different from the 25–34 years old age group. The 18–24 years old age group was not significantly different from any other age group. Based on the differences from the 25–34 years old age group, the null hypothesis was rejected.
FIGURE 5 Student grades by mode and age group (color figure available online).
CONCLUSIONS
The purpose of this study was to compare the performance of F2F, blended, and online business statistics sections in terms of course exam scores, controlling for age, gender, previous experience in statistics, and course length, of the sections that met between the fall of 2010 and the summer of 2012 at a regionally accredited university. The results showed that in the blended and F2F teaching modes nearly half of each subpopulation scored As or Bs, averaged across the exams, but only 25% of the online students scored As or Bs. A significant difference was found in the online teaching mode for exam 3, leading to the rejection of the null hypothesis, and a significant difference was found between the age groups.
Student performance in terms of course grades, addressed in the first research question, showed that students performed better in both the blended and F2F classes than in the online classes. The differences between the modes were possibly due to the online students not viewing the recorded lectures prior to beginning the practice problems or the homework assignments. The single common factor between the F2F and blended modes was direct or physical access to the instructor during class time.
Student performance in terms of the three teaching modes and exam scores, addressed in the second research question, showed no significant difference between the F2F and blended modes, but did show a significant difference between those modes and the online mode in the linear combination of the third exam. This led to the rejection of the first hypothesis. There was a trend of lower test scores as the course progressed. The difference in the third exam for the online students seems to reflect a combination of the difficulty and unfamiliarity of the concepts and possibly the fact that the exam was taken at the end of the term, when other personal factors or stresses, not considered in this study, may have affected the students.
Student performance in terms of age groups and mean exam scores, addressed in the third research question, showed differences among age groups; the 45–64 years old age group consistently performed lower than the other age groups in the blended and F2F modes. This difference led to the rejection of the second hypothesis. The differences in the 45–64 years old age group might be the result of the span of time since these students were last in an academic learning environment. Students in this age group may need to consider taking or retaking math or other quantitative courses as a refresher prior to taking the business statistics course.
RECOMMENDATIONS FOR FUTURE STUDIES
This study has surfaced problem areas in learning business statistics at this institution, primarily for the online students. Further research is required to determine the root causes of these problems. At the time of this writing, the institution had identified courses, including business statistics, that would only be taught over 16 weeks in the online mode. This rule was to be implemented in the fall 2013 term. There may be additional factors or student stressors that were not considered but should be identified. Future studies should identify these additional factors, perhaps time management, procrastination, or test anxiety on the part of the students, and include them in the analysis. Finally, future studies should identify levels of previous mathematical or other quantitative skills, by age group, to determine the causes of the differences in exam scores.
REFERENCES
Arbaugh, J. B., Desai, A., Rau, B., & Sridhar, B. S. (2010). A review of research on online and blended learning in the management disciplines: 1994–2009. Organization Management Journal, 7, 39–55. http://dx.doi.org/10.1057/omj.2010.5
Ashby, J., Sadera, W. A., & McNary, S. W. (2011). Comparing student success between developmental math courses offered online, blended, and face-to-face. Journal of Interactive Online Learning, 10, 129–140. Retrieved from ProQuest database.
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., ... Huang, B. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74, 379–439. Retrieved from ProQuest database.
Cybinski, P., & Selvanathan, S. (2005). Learning experience and learning effectiveness in undergraduate statistics: Modeling performance in traditional and flexible learning environments. Decision Sciences Journal of Innovative Education, 3, 251–271. http://dx.doi.org/10.1111/j.1540-4609.2005.00069.x
DeNeui, D. L., & Dodge, T. L. (2006). Asynchronous learning networks and student outcomes: The utility of online learning components in hybrid courses. Journal of Instructional Psychology, 33, 256–259. Retrieved from ProQuest database.
Dotterweich, D. P., & Rochelle, C. F. (2012, Mar/Apr). Online, instructional television, and traditional delivery: Student characteristics and success factors in business statistics. American Journal of Business, 5, 129–138. Retrieved from http://ehis.ebscohost.com.zeus.tarleton.edu:81/ehost/pdfviewer/pdfviewer?vid=9&hid=3&sid=965a771a-4662-47b0-8080-8da49b6f4146%40sessionmgr112
El Hajjar, S. T. (2011). An empirical study about the use and implementation of software in statistics at higher education institutions. International Journal of Engineering & Technology, 11(6), 45–54. Retrieved from ProQuest database.
Gomez, R. (2010). Innovations in teaching undergraduate statistics courses for biology students. Review of Higher Education and Self-Learning, 3(7), 8–13. Retrieved from ProQuest database.
Hamadu, D., Adeleke, I., & Ehie, I. (2011). Using information technology in teaching of business statistics in Nigeria Business School. American Journal of Business Education, 4(10), 85–92. Retrieved from ProQuest database.
Harrington, D. (1999). Teaching statistics: A comparison of traditional classroom and programmed instruction/distance learning approaches. Journal of Social Work Education, 35, 343–352. Retrieved from ProQuest database.
Hawkes, J. S., & Marsh, W. H. (2005). Discovering statistics (2nd ed.). Charleston, SC: Hawkes Learning Systems.
Hawkes Learning Systems. (2012). How the software works. Retrieved from http://www.hawkeslearning.com/Instructors/HowTheSoftwareWorks.htm
McLaren, C. H. (2004). A comparison of student persistence and performance in online and classroom business statistics experiences. Decision Sciences Journal of Innovative Education, 2, 1–10. http://dx.doi.org/10.1111/j.0011-7315.2004.00015.x
Mertler, C. A., & Vannatta, R. A. (2010). Advanced and multivariate statistical methods (4th ed.). Glendale, CA: Pyrczak.
Mills, J. D., & Raju, D. (2011). Teaching statistics online: A decade's review of the literature about what works. Journal of Statistics Education, 19, 1–27. Retrieved from http://www.amstat.org/publications/jse/v19n2/mills.pdf
National Center for Education Statistics. (2011). Fast facts. Retrieved from http://nces.ed.gov/fastfacts/display.asp?id=74
National Center for Higher Education Management Systems Information Center for Higher Education Policymaking and Analysis. (2013). Educational levels of the population: Educational attainment by degree-level and age-group (decennial census). Retrieved from http://www.higheredinfo.org/dbrowser/index.php?submeasure=201&year=2000&level=nation&mode=data&state=0
Ott, R. L., & Longnecker, M. (2010). An introduction to statistical methods and data analysis (6th ed.). Belmont, CA: Brooks/Cole–Cengage.
Radford, A. W. (2011, October). Learning at a distance: Undergraduate enrollment in distance education courses and degree programs (NCES 2012-154). Stats in Brief. Retrieved from http://nces.ed.gov/pubs2012/2012154.pdf
Spinelli, M. A. (2001). The use of technology in teaching business statistics. Journal of Education for Business, 77, 41–43. Retrieved from ProQuest database.
Summers, J. J., Waigandt, A., & Whittaker, T. A. (2005). A comparison of student achievement and satisfaction in an online versus a traditional face-to-face statistics class. Innovative Higher Education, 29, 233–250. http://dx.doi.org/10.1007/s10755-005-1938-x
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., & Liu, X. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76, 93–135. Retrieved from ProQuest database.
Texas A&M University-Central Texas. (2012). Academic class syllabi. Retrieved from http://www.tamuct.edu/departments/syllabi/index.php
University of Phoenix. (2012). History. Retrieved from http://www.phoenix.edu/about_us/about_university_of_phoenix/history.html
York, R. O. (2008). Comparing three modes of instruction in a graduate social work program. Journal of Social Work Education, 44, 157–172. Retrieved from ProQuest database.