Research-based Frameworks for Addressing and Assessing Online Learning Engagement


I begin this post with two hypotheses about online learning, based on the one hand on my experience as a community college composition and humanities teacher in both face-to-face and digital formats, and on the other on my experience as a graduate student who has taken digital courses at two public research institutions and one private university.

As a teacher, I’ve seen the fear manifested by students who are required to use new technologies, even when those technologies are supported, and the disengagement that can be amplified in an online environment when novice students adopt avoidance strategies such as procrastinating or opting out of course assignments, either entirely or by failing to meet minimum requirements. As a student, I’ve seen online instructors create barriers with difficult-to-solve, unsupported technology requirements; assign group work that offered little opportunity for online collaboration or accountability while penalizing students who engaged with the assignment for the choices of others who did not; neglect to provide clear expectations; and fail to update courses before the start of a class, so that students experienced continual revision of due dates, resource links, assessment criteria, and even unit designs throughout the term.

The first hypothesis is that the ratio of nontraditional to traditional students is greater in digital than in brick-and-mortar formats. The second hypothesis is that despite the prevalence in #edtech instructor training of “frameworks,” lists of “best practices,” and tutorials in new technologies, most college teachers and institutions implementing online learning formats could do more to align online course design and instructor behavior with cognitivist and constructivist learning theory, with empirically verified online pedagogical strategies, and with systematic piloting and review of digital innovations at the course and program levels (the reciprocal relationship between design and assessment suggested but not fully articulated by ISTE Standards for Educators 5 and 7).

At the intersection of these two hypotheses is the crux of the matter: if a higher proportion of first-generation, English language learner, adult, and other nontraditional students is enrolling in online courses (defined here as any combination of synchronous or asynchronous learning), and the shift to teaching in digital spaces and/or with new technological innovations requires teachers to develop new communication, technology, and pedagogical design skills while amplifying the negative effects of the lack of such skills, do the resulting negative impacts on student retention and motivation create a significant disparate impact for nontraditional students? On the flip side, how can implementation of learning theories such as andragogy and social constructionism, together with more evidence-based review of online teaching approaches, result in increased success for the less traditional student population that tends to take online courses?

A number of studies confirm that nontraditional students comprise the majority of online learners (Chen, Lambert & Guidry, 2010; Thompson, Miller & Pomykal Franz, 2013). Teachers and institutions that create online learning experiences thus need to consider the assumptions of adult learning theory, such as the principles that adult learners have and use more personal experience in learning, maintain responsibility for their own learning and resist situations where learning appears to be dispensed or controlled by others, and are more intrinsically than extrinsically motivated (Holton, Swanson, & Naquin, 2001), as well as issues of diversity, such as adaptation, acculturation, identity formation, and diverse practices and understandings of knowledge acquisition and demonstration (Nielsen, 2014).

I would like to posit that a question as complex as how to build capacity at the course and institutional levels for supporting such learners in online formats cannot be addressed effectively without systematic analysis, design, and evaluation of possible solutions, as well as consideration not only of which innovations may work but of why they work and whether they are scalable.

In this post, I respond to recent literature that puts principles of learning theory into conversation with systematic qualitative analytical approaches to problems of student engagement, and I suggest that educators join other stakeholders both in early implementation of these models and in systematic review of the results. I’ll discuss two research-based frameworks for online course design and one model for course and program review, and suggest how the design frameworks and the review model might be connected in implementation.

 

An Online Engagement Framework

Redmond, Abawi, Brown, Henderson and Heffernan (2018) have published a conceptual framework for student engagement intended for use as a planning and auditing tool to guide instructors, designers, institutional assessors, administrators and policymakers in improving engagement of online learners, and have invited other instructors and researchers to use the framework to assist in its validation process.

Like Lehman and Conceição (2013), whose book-length presentation of a model designed on similar research principles is discussed below, the authors of the Online Engagement Framework for Higher Education recognize that the massive shift in higher education toward online learning (Allen, Seaman, Poulin, & Straut, 2016) involves both a move to a more nontraditional student population, which has resulted in higher rates of student attrition, and a need for teachers of online courses to change their thinking and develop new skills in order to better engage and retain those learners. Citing Chen, Lambert, and Guidry (2010) and others who caution that, without analytical problematizing of this complex learning environment, the result will be a “tiered system of educational segregation” (p. 186), the authors propose a conceptually robust framework for approaching online course design and evaluation based on the analytical Framework Method (Gale, Heath, Cameron, Rashid, & Redwood, 2013). The Framework Method and the six-stage deductive process used by Redmond, Abawi, Brown, Henderson and Heffernan (2018) yield a theoretical tool for analyzing engagement, teaching, and course design based on a categorization of the factors involved in student engagement generated through a multi-disciplinary literature review. The authors’ table presenting the resulting five elements of engagement with selected indicators is reproduced here.

Redmond, Abawi, Brown, Henderson & Heffernan (2018), Online Engagement Framework for Higher Education.

Redmond, Abawi, Brown, Henderson and Heffernan (2018) describe their framework as a “multidimensional construct with interrelated elements that impact on student engagement in online settings” (p. 190). In this published presentation of the first stage of their work, the authors flesh out how each of the five elements, together with a list of indicators tied to sources in the literature, might serve as a guide for design and practice in online learning situations. For example, instructors may consider how they might create informal opportunities for social engagement to build respect and trust in a learning community, and how they might deepen cognitive engagement by designing activities and assessment tasks that call for higher-order work such as reconciling new and previous knowledge. In addition, the authors detail how this framework of elements and indicators can be used as a tool for instructors, designers, policymakers and others to “reflect upon their respective decision-making processes prior to and during the process of supporting the student online learning journey” (p. 196). For example: Are some forms of engagement privileged while others are de-emphasized? What types of strategies are being used to facilitate different types of engagement? Which types of engagement are most important for fulfilling the learning outcomes of a course?

I’m particularly interested in the authors’ suggestion that the Online Engagement Framework for Higher Education offers not only a framework for instructional design, but also one for program and even teacher evaluation. Connecting classroom assessment with program and institutional assessment seems critical if we desire to “explain and clarify the elements, strategies, and pedagogies employed by [our] institutions to support learning and teaching that enhance student engagement” (p. 198) among the diverse populations whom we know to be online learners. And as online learning materials that employ principles of andragogy and data analytics (such as adaptive software platforms) as well as other means of engagement (for instance, gaming, social media, and bite-sized chunks of reading interspersed with application or review) are marketed to educators and institutions by publishing companies in lieu of textbooks (Oremus, 2015), the Online Engagement Framework might also serve as a point of reference for selecting online learning strategies and technologies.

 

The Persistence Model for Online Student Retention

Lehman and Conceição’s (2013) text for teachers, Motivating and Retaining Online Students: Research-Based Strategies That Work, similarly recognizes that new student populations and new ways of (online) learning call for new teaching competencies, as well as new instructional and institutional supports. The authors find that reasons for lack of student persistence in online formats include “feelings of isolation, frustration, and disconnection; technology disruption; student failure to make contact with faculty; inadequate contact with students by faculty; lack of student and technology support; lack of instructor participation during class discussion; lack of clarity in instructional direction or expectation; and lack of social interaction” (p. 5), and they assert that as online courses continue to grow in number, it is important to base instructional and institutional support on a knowledge of student characteristics and behaviors.

The research base for the authors’ Persistence Model for Online Student Retention is similar to that of the Online Engagement Framework in that a comparative analytical methodology was used to categorize the data in the empirical literature on student persistence and motivation. The authors also coded the results of their own surveys of students and teachers investigating student strategies for staying motivated, in order to write “for instructors from both student and instructor perspectives” (p. 87). Their model, pictured below, has at its center a cyclical process of student strategies for success.

Lehman and Conceição (2013), Persistence Model for Online Student Retention

In the Persistence Model for Online Student Retention, instructional design remains student centered when instructional support strategies and institutional support strategies provide students with, respectively, a sense of control over their learning and an environment conducive to learning. Examples of instructional support provided by the authors include teachers employing communication and facilitation skills tailored to learning in an online environment, such as proactive contact with individual students to create and maintain engagement; assessing student readiness for the online learning environment before a course begins; evaluating digital products and equipping students with the knowledge to use those technologies; providing a syllabus with a “big picture” view of the course; and managing their own tasks and time when designing an online course as well as providing ways for students to manage theirs. Institutional support strategies include not only the availability of library, technical support, tutoring, and student services, but also proactive ways of connecting students to those resources through instructional support.

Two helpful tools provided in Lehman and Conceição’s (2013) text, though not directly based on the Persistence Model, are a blueprint for course design that provides a “big picture” of the course’s units, readings, formats, strategies, technologies, and instructor roles, and a template for managing tasks before, during, and after teaching an online course. While their model is not as elegant in design, and perhaps not as conducive to productive problematizing of the realities of online learning or as seamlessly applicable to assessing learning, teaching, and program review, as the Online Engagement Framework, the Persistence Model facilitates an understanding of learners’ needs and the creation of a corresponding learning environment incorporating the key nested layers of institutional supports and student- and literature-informed online teaching competencies.

 

A Design-Based Research Model

Research-based innovation and research-oriented program review go hand in hand. The following figures from Allen, Seaman, Poulin, and Straut (2016), who have tracked online education in the US since 2006, underscore the importance of evidence-based evaluation of online learning products by the institutions that offer them.

Allen, Seaman, Poulin, and Straut (2016) report that distance education enrollments continue to grow, that “institutions with distance offerings remain steadfast in their belief that it is critical for their long term strategy (77.2% agreeing in 2014 and 77.1% in 2015),” and that “the percent of academic leaders rating the learning outcomes in online education as the same or superior to those in face-to-face instruction was 71.4% in 2015. This represents a drop from the 2014 figure of 77.0%, but still much higher than the 57.2% rate in 2003” (pp. 4-5). In other words, perceptions of online learning outcomes have not yet caught up with demand or with the rate at which institutions are rolling out online offerings.

Ideally, those who plan online pedagogical policies, those who engage in online course design, and those who evaluate the effectiveness of those designs and policies would work in a highly collaborative manner. Unfortunately, not all institutions demonstrate collaboration between online education policymaking, design, and assessment. I conclude this post with a suggestion for how the Online Engagement Framework (Redmond, Abawi, Brown, Henderson & Heffernan, 2018) could be paired with design-based research (DBR), even on a small scale, to create such a collaborative, long-term approach to building capacity for effective online learning.

Ford, McNally, and Ford (2017) present one application of the DBR model used at University of Maryland University College’s Center for Innovation in Learning and Student Success, the university’s laboratory for “conducting applied research that focuses on continuous improvements to the university’s instruction” and for “identifying promising innovations for underserved populations in adult higher education” (p. 1). The application examined in their study is RealizeIt, an adaptive learning software platform similar to those referenced by Oremus (2015) and in my previous post.

DBR provides a model for how institutions and teachers who are willing to adopt a research-based framework for online course design might pair design with assessment of effectiveness both before and after the launch of a new innovation. Although Redmond, Abawi, Brown, Henderson & Heffernan (2018), the developers of the Online Engagement Framework, plan to use the framework itself as a tool for assessing student engagement in their courses in the second phase of their project, I suggest, in response to their invitation for academic research, that further evaluation of the framework could be accomplished productively through DBR as practiced at University of Maryland University College.

In the assessment project explained by Ford, McNally and Ford (2017), the problem of student persistence in a high-enrollment but low-success-rate online course (Principles of Accounting I, a gateway requisite course for several majors) was treated with adaptive learning software called RealizeIt, on the andragogical premise that adapting the learning process to the needs of the individual learner in real time will facilitate both learning and the re-diagnosis of learners’ needs. DBR, defined by the authors as more a collaborative approach than a precise methodology, involves controlled study of the effect of a learning treatment (innovation) over several iterations in order to understand how and why the treatment works in practice. Because the process involves several cycles of design and design testing to refine the design, DBR is distinguished from other forms of educational research by its multiple iterations. Two important aspects of the approach are that three iterations of study are planned in advance and that innovation, and thus refinement, is built into the study design; a treatment can therefore be evaluated and also refined in a practical way as the innovation is tested and developed. Two other features that make DBR attractive for institutional design testing are the careful consideration given to minimizing harm to students during innovation testing (as opposed to a “soft launch” that does not take its impact on participating students’ learning into account) and the combination of controlled aspects of the study with innovations between iterations in order to build an institution’s capacity to implement the treatment at scale.

In the study chronicled by Ford, McNally and Ford (2017), RealizeIt was first tested in a single course section and then, in the next academic quarter, across several sections with random instructor assignment (to control for instructor bias toward technology, among other variables). Based on the results of the second iteration, “instructor training became an area of greater focus” as the need for faculty to shift their mindsets regarding student engagement was identified, and a faculty mentor program was created as a result (p. 57). A third iteration involved 15 treatment and 16 control sections with 797 students, along with a survey of student demographics, outside employment, previous use of adaptive tutoring software, and other factors, and showed a significant positive impact on course success. The approach allowed for continual improvement in the vetting and development of the educational innovation, as well as opportunities for the institution to learn about “technical problems, faculty training, and design” (p. 63) and to implement a fourth iteration.
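To make the evaluation step of an iteration concrete, here is a minimal sketch (not drawn from Ford, McNally and Ford’s published analysis) of how course success rates in treatment and control sections might be compared using a simple two-proportion z-test; all enrollment and success counts in the example are hypothetical placeholders, not the study’s data.

```python
# Minimal sketch: comparing course success rates between treatment and
# control sections in one DBR iteration with a two-proportion z-test.
# All counts below are hypothetical placeholders, not figures from the study.
from math import sqrt, erf

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: the two success rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled success rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal tail probability
    return z, p_value

# Hypothetical iteration data: 410 students in treatment sections, 387 in control.
z, p = two_proportion_z_test(success_a=287, n_a=410, success_b=236, n_b=387)
print(f"treatment vs. control success rate: z = {z:.2f}, p = {p:.4f}")
```

A survey instrument built from an engagement framework could supply additional outcome measures to compare across sections in the same way.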

The DBR approach to measuring the efficacy and nature of technological solutions could be combined with the Online Engagement Framework, which could both supply a starting point for selecting potentially effective innovations and serve as a lens for developing measures of effectiveness, such as survey questions specifically targeting a research-based understanding of student engagement. Although both DBR and the Online Engagement Framework require a time investment on the part of policymakers, instructional designers, institutional researchers, assessment specialists, and faculty, this investment is necessary for these stakeholders to master research-based models that effectively problematize the real nature of online learning. The models themselves are simple enough that even a small institution could productively implement them to conceptualize, plan, and evaluate its online learning initiatives.

 

References:

Allen, I., Seaman, J., Poulin, R., & Straut, T. (2016). Online report card: Tracking online education in the United States. Retrieved from http://onlinelearningsurvey.com/reports/onlinereportcard.pdf

Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education, 54, 1222-1232. doi:10.1016/j.compedu.2009.11.008

Ford, C., McNally, D., & Ford, K. (2017). Using design-based research in higher education innovation. Online Learning, 21(3), 50-67.

Gale, N., Heath, G., Cameron, E., Rashid, S., & Redwood, S. (2013). Using the framework method for analysis of qualitative data in multi-disciplinary health research. BMC Medical Research Methodology, 13(117). doi:10.1186/1471-2288-13-117

Holton, E. F., Swanson, R. A., & Naquin, S. S. (2001). Andragogy in practice: Clarifying the andragogical model of adult learning. Performance Improvement Quarterly, 14(1), 118-143. Retrieved from https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1937-8327.2001.tb00204.x

Lehman, R. M., & Conceição, S. (2013). Motivating and retaining online students: Research-based strategies that work. Jossey-Bass/Wiley. Retrieved from http://ebookcentral.proquest.com/lib/spu/detail.action?docID=1376946

Nielsen, K. (2014). On class, race, and dynamics of privilege: Supporting generation 1.5 writers across the curriculum. In Zawacki, T. M., & Cox, M. (Eds.), WAC and second-language writers: Research towards linguistically and culturally inclusive programs and practices (pp. 129-150). Retrieved from https://wac.colostate.edu/books/perspectives/l2/

Oremus, W. (2015, October 25). No more pencils, no more books: Artificially intelligent software is replacing the textbook—and reshaping American education. Slate. Retrieved from http://www.slate.com/articles/technology/technology/2015/10/adaptive_learning_software_is_replacing_textbooks_and_upending_american.html

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning, 22(1), 183-204. doi:10.24059/olj.v22i1.1175

Thompson, N., Miller, N., & Pomykal Franz, D. (2013). Comparing online and face-to-face learning experiences for non-traditional students: A case study of three online teacher education candidates. The Quarterly Review of Distance Education, 14(4), 233-251.

2 Replies to “Research-based Frameworks for Addressing and Assessing Online Learning Engagement”

  1. Stephanie, great work! I always learn so much from reading your posts. I really liked how you approached this issue both from the perspective of a teacher and a student. It added a lot of value to your analysis and reflection on the research.

  2. Stephanie, this post on the frameworks for online learning environments is wonderful. I appreciate your approach to Redmond, Abawi, Brown, Henderson and Heffernan’s multidimensional model using key engagement elements to address students’ needs and participation motivation. The DBR framework is interesting. I like that it is a research-based approach which includes formative evaluation of the design both before and during the implementation phase, particularly since it gathers data to improve learning experiences and outcomes. I’m curious: if you could implement either DBR or the Online Engagement model, which would you choose? Or would you recommend combining them for more effective evaluation?
