When evaluating teaching effectiveness, online learning offers one distinct advantage over its face-to-face counterpart: tangible artifacts. While the principles of effective teaching don't vary as a function of instructional mode, instructors in the online classroom create teaching footprints: concrete evidence of every interaction. Evaluating online teaching simply becomes a matter of identifying, operationalizing, and assessing the trail of instructional footprints.
Traditional evaluations of online teaching rely on an observer to review the compilation of instructional artifacts and make quality judgments based on the evidence. To ensure a valid and reliable evaluation, these observations generally follow a rubric, standards checklist, or other pre-established criteria. While observer-based reviews are indeed effective, they require considerable time, effort, and resources to implement. Thus, despite their utility, practical constraints often limit their feasibility in large, scalable online programs.
But with advances in learning management technology, evaluations of online teaching have a new weapon, one that can quickly and easily examine countless teaching footprints with minimal time, effort, or resources: data analytics. The concept isn't new; since the emergence of learning management systems, analytics have let us view basic artifacts such as login time, login frequency, and number of posts. But advances in the analytics underlying learning management technology are taking these possibilities to a new level.
At present, learning management systems can track every login, keystroke, date stamp, time stamp, and related indicator of instructor behavior. But this is only a glimpse of what is possible; advances in educational technology, learning management systems, and predictive modeling may soon allow for more integrated, informative analytics.
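To make the idea concrete, the short Python sketch below aggregates a few basic footprints from a hypothetical LMS event log. The schema and records are invented for illustration; real systems expose comparable data through reporting tools or database exports.

```python
import pandas as pd

# Hypothetical LMS event log (invented schema and records).
events = pd.DataFrame({
    "instructor_id": ["A", "A", "A", "B", "B"],
    "event_type": ["login", "discussion_post", "login",
                   "login", "discussion_post"],
    "timestamp": pd.to_datetime([
        "2015-01-05 08:00", "2015-01-05 08:25", "2015-01-07 19:10",
        "2015-01-06 12:00", "2015-01-06 12:40",
    ]),
})

# Aggregate the basic footprints described above: login frequency,
# post counts, and the span of activity per instructor.
footprints = events.groupby("instructor_id").agg(
    logins=("event_type", lambda s: (s == "login").sum()),
    posts=("event_type", lambda s: (s == "discussion_post").sum()),
    first_active=("timestamp", "min"),
    last_active=("timestamp", "max"),
)
print(footprints)
```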
Technology already exists to recognize keywords and string them together to identify themes and patterns. As thematic analysis becomes more advanced, evaluations of teaching could integrate content analysis of instructional posts to determine the validity and relevance of instructor-student interactions. While automated thematic analysis is not likely to serve as a stand-alone metric, its efficiency provides an indicator that can prompt attention toward faculty members who may be struggling.
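As a rough illustration (not a substitute for genuine thematic analysis), a simple keyword screen might flag posts that lack substantive teaching cues for human review; the cue list and threshold below are hypothetical.

```python
# Minimal keyword-based screen for instructional posts. Genuine thematic
# analysis would use richer NLP (topic models, trained classifiers);
# this sketch only illustrates the "flag for human attention" idea.
SUBSTANTIVE_CUES = {"feedback", "rubric", "example", "revise", "concept"}

def needs_review(text, min_cues=1):
    """Return True when a post contains too few substantive teaching cues."""
    words = set(text.lower().split())
    return len(words & SUBSTANTIVE_CUES) < min_cues

posts = [
    "Good job!",
    "Per the rubric, revise your thesis and add one concrete example.",
]
for post in posts:
    status = "flag for review" if needs_review(post) else "ok"
    print(f"{status}: {post}")
```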
Much like the push for predictive analytics that analyze student demographic characteristics to predict student success, predictive analytics may be used to identify which faculty members are likely to need additional support and which would benefit most from additional training or mentoring. Predictive dashboards (based on factors such as teaching experience, online experience, disciplinary background, and age) allow faculty development programming to target instructional resources and support to those most likely to benefit.
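A minimal sketch of such a model appears below, assuming hypothetical historical records of which instructors ultimately needed extra support; a production system would draw on far richer features and proper validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: one row per instructor with
# [years_teaching, online_courses_taught, completed_training (0/1)],
# labeled 1 if the instructor ultimately needed extra support.
X = np.array([
    [1, 0, 0],
    [2, 1, 0],
    [10, 8, 1],
    [7, 5, 1],
    [0, 0, 0],
    [12, 10, 1],
])
y = np.array([1, 1, 0, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score an incoming cohort; high probabilities feed the dashboard's
# "likely to benefit from mentoring" list.
incoming = np.array([[0, 0, 0], [9, 6, 1]])
print(model.predict_proba(incoming)[:, 1])
```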
Complementing the information available via predictive dashboards, data analytics focused on faculty behaviors in the classroom provide an efficient means of concentrating evaluation efforts on the faculty who are most in need. Rather than expending resources to evaluate all faculty on the same schedule, data analytics on teaching behavior (time in the online classroom, number and length of discussion posts, length and timeliness of feedback, number and nature of announcements, patterns of classroom interaction, student evaluations, end-of-course evaluations, etc.) could pinpoint faculty who need additional attention. The efficiency of this type of data allows evaluation (and mentoring) resources to be quickly aligned with those who need them most. Not only does this provide additional support for the struggling faculty member, it also fosters increased educational quality during the active course.
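One simple way to operationalize this triage, sketched below with invented numbers, is to standardize each behavior metric across the cohort and rank instructors by a composite index; those furthest below the mean surface first for review.

```python
import pandas as pd

# Hypothetical per-course behavior metrics pulled from the LMS.
metrics = pd.DataFrame({
    "hours_in_classroom": [14.0, 3.5, 11.0, 2.0],
    "discussion_posts": [42, 9, 35, 6],
    "feedback_turnaround_days": [2.1, 6.8, 2.9, 7.5],
}, index=["inst_1", "inst_2", "inst_3", "inst_4"])

# Standardize each metric; invert turnaround time so a higher score
# always means more engaged behavior.
z = (metrics - metrics.mean()) / metrics.std()
z["feedback_turnaround_days"] *= -1

# A crude engagement index: instructors far below the cohort mean are
# where evaluation and mentoring resources should go first.
metrics["engagement_index"] = z.mean(axis=1)
print(metrics.sort_values("engagement_index").head(2))
```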
In most online programs, all instructors are evaluated on the same basic criteria to support best practices in online teaching. The integration of data analytics allows predictive modeling to determine whether there are disciplinary differences in the value and effectiveness of various online instructional behaviors. It is possible that the teaching activities that effectively support a math course differ from those necessary in psychology. Data analytics and predictive modeling allow for the development of more targeted evaluation criteria.
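The sketch below, again with invented data, hints at how such disciplinary differences might be detected: if the correlation between an instructional behavior and a course outcome varies by discipline, a single rubric is weighting that behavior incorrectly for someone.

```python
import pandas as pd

# Hypothetical course records: discipline, one instructional behavior,
# and an outcome such as mean end-of-course evaluation score.
courses = pd.DataFrame({
    "discipline": ["math", "math", "math", "psych", "psych", "psych"],
    "posts_per_week": [5, 12, 20, 5, 12, 20],
    "eval_score": [4.1, 4.2, 4.2, 3.4, 4.0, 4.6],
})

# Per-discipline correlation between the behavior and the outcome.
by_discipline = courses.groupby("discipline").apply(
    lambda g: g["posts_per_week"].corr(g["eval_score"])
)
print(by_discipline)
```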
If data analytics can tell us how a particular student can best learn, it is feasible that analytic data can also help pinpoint how a particular instructor can best teach. At present, because all faculty are evaluated in the same way, faculty who receive higher evaluations are more likely to be retained and given ongoing teaching assignments. But what if the "lower-performing" faculty are no longer "lower performing"? What if they are allowed to adjust their teaching to different expectations based on student needs that align with their natural teaching style, rather than being forced to conform to institutional expectations of "good teaching"? The use of data analytics to identify instructors' patterns of behavior in relation to the success (or failure) of particular types of students opens the door for targeted enrollment that maximizes "fit" to promote success for both faculty and students.
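A toy version of such a "fit" analysis, with entirely invented enrollment history, might look like the following: pivot past outcomes into an instructor-by-segment success matrix and route incoming students toward their strongest historical match.

```python
import pandas as pd

# Hypothetical enrollment history: whether students in each segment
# succeeded (1) or not (0) with each instructor.
history = pd.DataFrame({
    "instructor": ["A", "A", "A", "B", "B", "B"],
    "student_segment": ["first_gen", "transfer", "first_gen",
                        "first_gen", "transfer", "transfer"],
    "success": [1, 0, 1, 0, 1, 1],
})

# Instructor-by-segment success-rate matrix; the best "fit" for an
# incoming student is the instructor with the strongest record for
# that student's segment.
fit = history.pivot_table(index="instructor", columns="student_segment",
                          values="success", aggfunc="mean")
print(fit)
print("Best fit for first_gen students:", fit["first_gen"].idxmax())
```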
Predictive analytics is growing in popularity as a means of promoting student success; the future of analytics must also examine the role of learning management system artifacts and predictive modeling in maximizing the success of faculty in the online classroom. This is not to suggest that institutions fully embrace data-driven decision-making to guide faculty evaluations, but rather to prompt the inclusion of data analytics to allow for efficient and effective targeting of faculty development and evaluation resources.
Kelly Palese-Sanderson is the Vice President of Academic Operations at Grand Canyon University. She has 15 years of experience working in higher education in student and faculty development as well as teaching. Kelly's areas of interest lie within faculty development in both the online and campus-based modalities. She has trained, coached, and mentored hundreds of instructors and continues to research in this area. Kelly has a B.S. in secondary education, an M.A.Ed. in educational counseling and is currently pursuing a Ph.D. in psychology.
B. Jean Mandernach is Research Professor and Director of the Center for Innovation in Research and Teaching at Grand Canyon University. Her research focuses on enhancing student learning through assessment and innovative online instructional strategies. In addition, she has interests in examining the perception of online degrees, the quality of online course offerings and the development of effective faculty evaluation models. Jean received her B.S. in comprehensive psychology from the University of Nebraska at Kearney, an M.S. in experimental psychology from Western Illinois University and Ph.D. in social psychology from the University of Nebraska at Lincoln.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
2015 Copyright held by the Owner/Author. Publication rights licensed to ACM. 1535-394X/15/01-2696534 $15.00