
Blended Learning Research: Components critical to student learning from a meta-analysis

Special Issue: Paradigm Shifts in Global Higher Education and eLearning

By Leslie Cordie, Xi Lin, William Fowler, Michael Wooten / May 2019

TYPE: HIGHER EDUCATION, INTERNATIONAL ONLINE EDUCATION

Widespread access to the Internet, coupled with the availability of efficient and cost-effective access devices, has served as the impetus for dynamic instructional change throughout higher education. Integration of technology has taken a multitude of forms, including simple electronic presentation methods, remote classrooms, and fully online degree programs. One common model of technology-enhanced instruction that falls in the middle of this spectrum is blended learning. The literature in the area of blended learning spans a multitude of disciplines, topics, and research formats [1, 3]. Any review of the educational literature will quickly reveal the wide range of instructional modalities that have been associated with the term. Approaches described in the literature under the blended learning moniker include hybrid instruction, flipped classrooms, station rotation, individualized learning, and numerous others. Despite this substantial variation, instructional approaches grouped under the blended category all share a commonality of mixing face-to-face interactions with technology-based learning [1, 3]. Most definitions of blended learning in the higher-education environment incorporate these elements, with the additional feature that a portion of the course involves asynchronous, independent work by students (or student groups) in a technology-enhanced environment [2]. Because of these features, blended courses can be highly complex and involve diverse instructional and teaching patterns, which, in turn, affect successful course implementation and student learning outcomes. The optimal technology mix for teaching blended courses necessitates a shift in pedagogy toward a more andragogical, or self-directed, approach, along with a heightened focus on communication and interactive technologies.

Research and data show that demand for blended learning has grown nearly exponentially as new forms of student-content interaction are created in this digital age [4]. Student expectations for their learning environments have changed. These factors carry significant implications for educators and instructional designers in higher education. Technology is allowing students to become more engaged in constructing their own knowledge, which studies have shown to be a key component of learning success [5]. New media formats and the increased use of emerging technologies for content delivery are creating wider access to all forms of education, including formal, informal, and non-formal contexts in the higher-education environment. As more user-friendly technologies are developed, their use will be interwoven into the fabric of learners' everyday and educational lives, thus providing greater opportunities to facilitate a central goal of higher education: the development of lifelong learning.

Completely online courses are more technologically dependent than blended courses but share similar instructional models. A common breakpoint between the two models is the 50 percent mark: in many institutions, courses are classified as “online” when more than 50 percent of the material and lectures are delivered asynchronously [1]. Online and blended courses are similar in being highly dependent on technology and in requiring that both students and instructors be computer savvy. Both methods can provide effective options for managing student course access, classroom space, and enrollment in higher education. As brick-and-mortar institutions cope with ever-increasing tuition costs and burgeoning student numbers, a premium is placed on classrooms and facilities space [6]. One approach to these problems is the development of curricula that require the first year of a degree program to be taken fully online and off campus; students then come to campus essentially only to complete higher-level coursework. This scheme is considered more cost efficient for both the student and the institution, and it has the potential to become widespread across academia [6]. Given the rapid growth in the numbers of both blended and online course and program offerings, it is essential that research continue to build a more complete understanding of how to design these courses to best serve students in higher education.

Research Questions and Concept Model

Over the past two-plus decades, the authors have designed and taught both blended and online courses in the higher-education environment and have provided peer and faculty development for both formats. We have observed, as have many others, that students respond differently to blended versus online approaches to instruction. Our belief is that students often perceive substantial differences between the two approaches, even when identical materials are presented in a similar manner. This study was initiated to evaluate this perception using a meta-analysis approach. In doing so, we hoped to enhance our understanding of the conditions and practices associated with effective blended and online teaching and learning in the higher-education environment.

This study was based on Hillman and colleagues' conceptual model of interaction in distance education and its importance to student learning outcomes [7]. This model framed what we a priori considered to be the potential relationship between student satisfaction and interactions with instruction and content in fully online and blended learning approaches. Satisfaction in any context is a multi-faceted concept. Elements underlying student satisfaction often include the motivation provided by the course content, the perceived value of the information, the contribution to student success in the curriculum, the grading structure, and multiple other tangible and intangible dimensions. Despite this complexity, assessment instruments that rely on indistinct measures of student satisfaction are widely used as a mainstay of higher-education course evaluations [8]. Research aimed at better understanding the impacts of these dimensions on learning outcomes is an active area of scholarly activity.

To guide our analysis, we formalized our perception and made predictions based on our experiences with student feedback and course evaluations. The overall hypothesis was: "Factors driving student satisfaction differ between fully online and blended learning approaches in higher education." Our predictions were: (1) content will be more influential than instructional components in fully online learning environments; and (2) both instructional and content components will be influential in blended-learning environments. These relationships are illustrated in Figure 1, with the predicted influences indicated by the arrow sizes and the dimensions of the components.

Figure 1. Interaction Influences in Online and Blended Learning.


Methods

To gather the data required to test our predictions, we conducted a broad survey of the existing literature. Because research studies on technologies and learning are diverse and number in the thousands, the main research approach for this study was meta-analysis [9]. As noted above, the literature in the area of blended learning spans a multitude of disciplines, topics, and research formats [1, 3]. Our initial literature review examined studies reporting comparisons of face-to-face (F2F), online, or blended learning. This review included more than 600 abstracts from the research literature and online academic databases (Academic Search Premier, LexisNexis, Google Scholar, ProQuest, interlibrary loan, and others). Search keywords included blended learning, distance education, hybrid, mixed, flipped, digital, higher education, post-secondary, college, university, and others. In addition, subject matter and technology/media format (LMS, videoconferencing, synchronous/asynchronous) were noted in the coding process.

From this large dataset, we generated a subset containing all studies reporting correlational comparisons between fully online and blended courses [10]. These papers and the references within them were used to further identify studies that included satisfaction as the learning outcome. This process led to an additional broad literature search using further keywords, including instruction, faculty, teaching, content, interaction, course evaluation, quality, perception, and others. This review identified an additional 250 studies in the literature.

Ultimately, only studies that reported correlations for both content and instruction and that measured satisfaction as the outcome in the higher-education environment were selected for the final analysis set. Our criteria for inclusion were research design (quantitative), definition of blended learning, higher-education environment, learning outcomes, and year of study (post 2000). Study quality was assessed by examining the reported reliability of the survey instruments and the measurement of the factors related to content, interaction, and satisfaction. Given all of these conditions, 18 studies were judged to contain adequate comparisons and reliability measures to support evaluation (10 cases for online and eight for blended).

The final dataset consisted of Pearson correlation values from comparisons among content, instruction, and satisfaction for each of the 18 studies (see Table 1 and Table 2). Given the data format and sample size, a variety of statistical methods were used. Directional comparisons between individual content-satisfaction and instruction-satisfaction correlations were made using the method of Meng et al. [11], as implemented in the R package cocor [10]. Tests of general trends between groups of correlations were made using Wilcoxon signed-rank tests (W); however, small sample sizes prevented calculation of probability estimates from these tests, so they were used only to evaluate directional trends. Probability values (p-values) for differences between correlation groups were estimated from empirical distributions produced by randomization with replacement [12], using ∑(r_content − r_instruction) as the test statistic. For all analyses, one-tailed p-values equal to or less than 0.05 were considered significant. Overall, the various statistical approaches yielded similar outcomes.
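To make these two statistical steps concrete, the sketch below (in Python) illustrates, first, the Meng et al. [11] z-test for comparing two correlated correlations within a single study and, second, one way to build an empirical null distribution by resampling with replacement. The correlation values, sample size, and resampling details are illustrative assumptions only, not the data of Tables 1 and 2, and the published analysis was run with the cocor package [10] rather than this code.

import numpy as np
from scipy.stats import norm

def meng_z_test(r_cs, r_is, r_ci, n):
    # Meng, Rosenthal, and Rubin (1992) z-test for two correlated correlations:
    # r_cs = content-satisfaction r, r_is = instruction-satisfaction r,
    # r_ci = content-instruction r (the C x I column), n = study sample size.
    # Returns (z, one-tailed p) for the directional hypothesis r_cs > r_is.
    z_cs, z_is = np.arctanh(r_cs), np.arctanh(r_is)          # Fisher z transforms
    r_sq_bar = (r_cs ** 2 + r_is ** 2) / 2.0
    f = min((1.0 - r_ci) / (2.0 * (1.0 - r_sq_bar)), 1.0)    # f is capped at 1
    h = (1.0 - f * r_sq_bar) / (1.0 - r_sq_bar)
    z = (z_cs - z_is) * np.sqrt((n - 3) / (2.0 * (1.0 - r_ci) * h))
    return z, norm.sf(z)

# Within-study comparison using purely illustrative values
z, p = meng_z_test(r_cs=0.66, r_is=0.55, r_ci=0.60, n=210)
print(f"Meng z = {z:.2f}, one-tailed p = {p:.3f}")

# Across-study comparison: empirical null distribution for the summed differences,
# built by resampling the zero-centered per-study differences with replacement.
rng = np.random.default_rng(1)
r_content = np.array([0.66, 0.71, 0.58, 0.62, 0.69, 0.60, 0.64, 0.67, 0.63, 0.61])
r_instruct = np.array([0.55, 0.60, 0.54, 0.59, 0.57, 0.56, 0.58, 0.61, 0.53, 0.52])
diffs = r_content - r_instruct
observed = diffs.sum()                      # statistic: sum of (r_content - r_instruction)
null_diffs = diffs - diffs.mean()           # center the differences at zero under the null
null_stats = np.array([rng.choice(null_diffs, size=diffs.size, replace=True).sum()
                       for _ in range(10_000)])
p_value = np.mean(null_stats >= observed)   # one-tailed p-value
print(f"observed sum of differences = {observed:.2f}, one-tailed p = {p_value:.4f}")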

Results and Conclusions

This study examined student satisfaction in online and blended learning environments, focusing on course content and instruction format as factors of course design in student learning. The systematic review of the literature for the meta-analysis started with more than 600 research abstracts. From this large dataset, a subset containing all studies reporting correlational comparisons between fully online and blended courses was generated. These studies were used to further identify research that included satisfaction as the learning outcome, which resulted in an additional broad literature review.

Ultimately, only studies that reported correlations for both content and instruction and that measured satisfaction as the outcome in the higher-education environment were selected for the final analysis set. The final selection of studies for the meta-analysis required a quantitative research design, a definition of blended learning, a higher-education environment, satisfaction as a learning outcome, and a year of study after 2000 (when online and blended learning became significant in the research literature). Examining the reliability of the survey instruments used in the research limited the meta-analysis to 18 final studies of adequate quality.

Table 1 contains the data and results from each of the 10 online studies. Data from all 10 studies showed consistent directional relationships among content, instruction, and satisfaction: in each case, content was more highly correlated with student satisfaction than was instruction method, with five of the 10 comparisons being highly significant. Randomization-based p-values indicated an overall significant difference between the content-satisfaction and instruction-satisfaction r values (p = 0.004) for the 10 online studies. The weighted (adjusted for sample sizes) means for online content (0.639) and instruction (0.567) also reflected this pattern.
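For reference, the weighted means reported above can be computed by weighting each study's correlation by its sample size. Whether a Fisher z transform was applied before averaging is not stated in the text, so this minimal sketch assumes plain sample-size weights and uses illustrative numbers rather than the Table 1 data.

import numpy as np

# Illustrative study sample sizes and content-satisfaction correlations (not Table 1 data)
n = np.array([120, 85, 240, 60, 150, 95, 180, 70, 110, 130])
r = np.array([0.64, 0.61, 0.66, 0.58, 0.63, 0.60, 0.67, 0.59, 0.65, 0.62])

weighted_mean_r = np.sum(n * r) / np.sum(n)   # mean correlation weighted by sample size
print(round(weighted_mean_r, 3))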

Contrasting results were observed for the eight blended learning studies (Table 2). Four of the studies reported stronger correlations between instruction and satisfaction, while the remaining four reported the opposite outcome. Only two of the eight comparisons resulted in significant one-tailed relationships. This result is mirrored in the weighted means, where the content average was 0.592 and the instruction average was 0.570. A non-significant randomization-based p-value (p = 0.487) for the eight blended learning studies offers further evidence that neither component dominated.

How, then, do these results relate to the anecdotal perceptions that led to this study? Overall, the meta-analysis results supported the conclusion that the key elements affecting student satisfaction do vary between the two learning approaches. Consistent with our predictions, students appeared to view content as the more important element in fully online courses, whereas both content and instruction interactions appeared critical in blended courses. These results indicate that both course designers and instructors need to understand how to use interaction and engagement with learners. Overall, this broad meta-analysis found course format to be a significant factor in student satisfaction.

Table 1. Correlations of Content and Instruction Components to Student Satisfaction in Fully Online Learning.
Columns: study sample size; direct correlation between the Content and Instruction components (C x I); probability value of Content vs. Instruction r.



Table 2. Correlations of Content and Instruction Components to Student Satisfaction in Blended Learning.
Columns: study sample size; direct correlation between the Content and Instruction components (C x I); probability value of Content vs. Instruction r.



Recommendations

This study examined two broad categories related to student satisfaction: course content and instruction format. The final number of available studies was small, but this itself speaks to the need for continued research on this topic. It is clear from our perceptions and these results that, in the higher-education environment, both the method of interaction and the content must be given focused attention during course development and the presentation of course materials. Also, while the two categories examined in this study are important, future research should aim to identify the critical subcomponents of each. Research on course design factors related to instruction in both the online and blended learning environments should be conducted using a multi-disciplinary model or framework. Our results indicate that elements of the model should include content design appropriate for both approaches, with special emphasis on content for online courses. In addition, the hybrid approach of blended learning seems to integrate the instructor more fully into positive student perceptions and higher overall learning outcomes. Lastly, this study demonstrates that meta-analysis can provide valuable insight into research areas worthy of focus in higher education as instruction and learning continue to adapt to technology in this digital age.

References

[1] Means, B., Toyama, Y., Murphy, R., and Baki, M. The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record 115, 3 (2013), 1-47.

[2] Bernard, R. M.,  Borokhovski, E., Schmid, R. F., Tamim, R. M., and Abrami, P. C. A meta-analysis of blended learning and technology use in higher education: From the general to the applied. Journal of Computing in Higher Education 26, 1 (2014), 87-122.

[3] Bernard, R. M., Abrami,  P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., and Huang, B. How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research 74, 3 (2004), 379-439.

[4] Allen, I. E. and Seaman, J. Learning on Demand: Online Education in the United States, 2009. Sloan Consortium, 2010.

[5] Trowler, V. Student engagement literature review. The Higher Education Academy 11, 1 (2010), 1-15.

[6] Johnson, L., Becker, S. A., Cummins, M., Estrada, V., Freeman, A., and Hall, C. NMC Horizon Report: 2016 Higher Education Edition. The New Media Consortium, 2016.

[7] Hillman, D. C., Willis, D. J., and Gunawardena, C. N. Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. American Journal of Distance Education 8, 2 (1994), 30-42.

[8] Marsh, H. W. Students’ evaluations of university teaching: Dimensionality, reliability, validity, potential biases and usefulness. In The Scholarship of Teaching and Learning in Higher Education: An Evidence-based perspective. Springer Netherlands, 2007, 319-383. https://doi.org/10.1007/1-4020-5742-3

[9] Glass, G. V. Primary, secondary, and meta-analysis of research. Educational Researcher 5, 10 (1976), 3-8.

[10] Diedenhofen, B. and Musch, J. cocor: A comprehensive solution for the statistical comparison of correlations. PLoS ONE 10, 6 (2015), e0121945. https://doi.org/10.1371/journal.pone.0121945

[11] Meng, X.-L., Rosenthal, R., and Rubin, D. B. Comparing correlated correlation coefficients. Psychological Bulletin 111, 1 (1992), 172-175. http://dx.doi.org/10.1037/0033-2909.111.1.172

[12] Noreen, E. W. Computer Intensive Methods for Testing Hypotheses: An Introduction. Wiley, New York, 1989.

About the Authors

Leslie Cordie is an assistant professor at Auburn University. She has more than 20 years of combined interdisciplinary experience across many organizations, including higher education, healthcare, and industry. Her background includes online learning, adult education, faculty/professional development, instructional design, and program development. Cordie holds a Ph.D. in education, with specializations in distance learning, adult learning theory, technical writing, instructional design, and curriculum development. She also holds an MBA, and her professional experience includes positions with the airline industry and the U.S. Air Force.

Xi Lin is an assistant professor in the Department of Interdisciplinary Professions at East Carolina University. Her research focuses on instructional strategies for distance learning, mobile learning, learning motivations of adult learners, and international students and international faculty in the U.S. higher education system. Lin has won multiple awards for her research, including several from the American Association of Adult and Continuing Education (AAACE).

William Fowler holds an MSW from the University of Alabama and a Ph.D. in adult education from Auburn University. He currently works as a geriatric social worker in Fort Collins, Colorado.

Michael Wooten is a professor of genetics and assistant chair in the Department of Biological Sciences at Auburn University, where he serves as the online course coordinator for the College of Sciences and Mathematics. For more than 30 years he has taught various genetics and biological sciences courses. Wooten has been an early adopter of digital technologies and developed the first fully online course in his department (Genetics). He is the author of more than 80 research articles on topics ranging from endangered species to neurological disease. He is currently formalizing a new educational model for higher education that focuses on this critical transitional period of learning.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Copyright © ACM. 1535-394X/2019/09-3331168 $15.00

https://doi.org/10.1145/3331168



Comments

  • Wed, 05 Aug 2020
    Post by David

    Do you think models that provide more inclusive definitions of 'satisfaction' can be considered, such as the Community of Inquiry model (Garrison, Anderson, and Archer, 2000; 2020), which encompasses teaching, social, cognitive, and emotional presence? In your article, you mention that a multi-disciplinary approach is encouraged. I was surprised by the relatively small number of studies that met the threshold for your purpose.

    I teach in a setting where English is a foreign language (EFL), and I am struck by the tendency of administrators to use 'satisfaction surveys' which aren't tailored to students' language proficiency. I think there is a gap in the literature for English language learner (ELL) contexts, let alone the gap in the literature for what models best describe or measure 'satisfaction' or 'learning experience'. Thanks for sharing your work for how to modify or test a model toward this goal.