What is engagement in a learning experience?

BY: MARIANA AGUILAR, KAYLA SHELDON

DESIGNED AND BUILT BY:
CANDICE DAVIS, PAOLA MENDOZA, KIRK HODGMAN

GoGuardian believes the digital learning environment must, at its core, be an engaging learning experience, so we conducted a field research study with over 350 educators and students across the country to better understand and define student engagement. GoGuardian’s focus on understanding engagement reflects our commitment to student outcomes. In fact, research demonstrates that high levels of student engagement are related to “higher grades, achievement test scores, and school completion rates” [5]. Additionally, engaged students typically have “lower rates of delinquency, substance use, and depression” [5]. Given these benefits and protective factors, GoGuardian is committed to understanding engaging digital learning experiences.

As a first step, GoGuardian’s Research & Insights team sought to understand the behaviors and emotions that define engaged and disengaged learning and to identify the factors that influence student engagement. While there is extensive literature on engagement going back to the 1980s, there is no single agreed-upon model or definition of engagement [23]. Given the varying engagement models in existence, GoGuardian designed a field research study to elicit insight from users (teachers, school leaders, students, and IT Admins) about how they define and describe engagement.

We initiated this research study to answer the following questions:

1. What are the emotions, behaviors, and cognitive habits that occur when students are engaged, or disengaged, in a learning experience?
2. What are the factors that contribute to an engaging learning experience?

These questions are based on the assumption that there are multiple, observable dimensions of engagement, each with indicators that signal engagement in a learning experience.

Study Design

To conduct this study on engagement, we used a qualitative study design to capture the experiences, opinions, and perspectives of the participants. Given the lack of a consistent definition of engagement in the research [17] and the widely held view that engagement is instead a multidimensional concept, we felt it was critical to document the nuanced experiences of the target population: students, teachers, school leaders, and IT Admins in K-12 schools. Furthermore, this qualitative approach allows us to capture a more holistic understanding of the human experience in the selected environment [14].

Field Research Methods

The qualitative study leveraged multiple field research methods, including direct observation, interviews, and focus groups. We used a variety of methods so we could capture both easily observable behaviors and the more nuanced insights that come from prolonged experience with an environment. The direct observations enabled us to document specific behaviors, while the interviews and focus groups facilitated the collection of reflections and perspectives directly from the population being studied. We conducted both interviews and focus groups to accommodate participants’ availability and our access to them. Focus groups were typically used with groups that were already convening, since the scheduled time could easily be repurposed; interviews were typically used with individuals who could not fit a focus group into their schedule. This variety of methods allowed us to capture both observable behaviors and nuanced participant perspectives within the logistical constraints.

[Figure: Participant Distribution by Stakeholder]

We conducted nine direct observations in elementary, middle, and high school classrooms during already scheduled lessons. Observation periods ranged from 20 to 45 minutes and included individual work time, student presentations, teacher lectures, group work, rotational workshops, and testing periods during which students completed an assessment. We conducted five interviews, both in person and virtually. All interviews followed a semi-structured approach to enable comparison across responses while still allowing for relevant follow-up questions [6]. We also conducted nine focus groups, both in person and virtually. All of the focus groups were semi-structured to allow for consistency as well as flexibility in follow-up questions. Focus groups were conducted with each stakeholder group: teachers, students, IT Admins, and school leaders.

[Figure: Field Research Methodologies]

Participants

Participants were recruited through purposive sampling: they were eligible if they 1) were a teacher, student, school leader, or IT Admin, and 2) were part of a K-12 school community in a region served by GoGuardian. Participants were recruited from existing customers and the professional network of GoGuardian’s Research & Insights team. The initial target list of research participants was evenly distributed across grade level, school type, customer satisfaction level, and geography. However, many of the initially selected participants could not be included in the final study due to logistical and operational constraints. In the end, participants represented 19 different schools, spanning elementary, middle, and high school, from seven states: California, Ohio, New York, Florida, Wisconsin, Iowa, and Washington.

[Figure: Student Representation by Grade Level]

Data Analysis

To synthesize the data from the 23 instances of field research and surface key themes, we began with an open coding approach in which we identified and labeled the concepts present in the field research notes [10]. Once all concepts were noted, each concept was transcribed onto a sticky note to facilitate axial coding: the combination of the discrete concepts identified during open coding into broader phenomena that can be further organized into a conceptual model [10]. Once the observations were grouped into phenomena, we wrote thematic statements to reflect the broader narrative. After three cycles of coding concepts and recategorizing them into phenomena, a total of 43 thematic statements had been identified. These 43 thematic statements were then reviewed for similarities and categorized into a four-part framework.

The Framework

After completing the coding analysis of the field research notes, we identified 43 themes, which we grouped into a four-part framework. This holistic framework provides a lens through which to understand the factors that contribute to, and emerge in, an engaging learning environment.

Things to Consider

Contextual Variables Impacting Engagement

This part of the framework captures the eight thematic statements that relate to the variables shaping how a student experiences a learning environment. These types of contextual factors have been found to significantly impact engagement [2]; as such, it is critical to explore these variables as part of the research.

Things to Do

Qualities of an Engaging Learning Experience

This part of the framework encompasses the thematic statements that describe the qualities of an engaging learning experience. Many of these statements were observed and expressed as true in both digital and non-digital learning environments.

Personalized and Differentiated
Blended: Digital & Analog
Interactive & Gamified
Models, Explanations & Stories
Positive Emotional Experience
Student-Centric
Social Learning
Recognition & Validation

Things to Know

Industry Trends

While the following themes do not relate directly to the concept of engagement, they reflect the landscape and context in which schools are operating. As we aim to understand what makes an engaging digital learning experience, it is important for us to consider the contextual factors affecting education practices and the deployment of instructional technology. The themes below represent industry trends present in the field research.

Current State

Education software and applications vary significantly within a district, and oftentimes even within a school, but there is a desire for consistency.

One IT Admin underscored the inconsistency that characterizes the current state:

“It has been like the wild west at times. They [teachers] are buying different products. One might buy this program and the other buys that one, and there’s been some slipping through some cracks.”

Admins seek to implement scalable and interoperable EdTech solutions that deliver insights on academic outcomes and teacher usage.

“We have some teachers that are using technology and others not that much. But when it comes to tracking that piece of information—that becomes part of the problem. Some may be using the technology more than others.” Given the variability in usage, even within a school, there is an emerging desire to understand how technology is being used and the impact it is having on students.

Teacher digital literacy levels span a wide spectrum, and additional professional development is a priority.

When the school leader of a large high school was asked, “What mechanisms and strategies exist at the school level to nurture student engagement?”, the school leader confidently asserted, “Teacher Professional Development on how to implement blended learning.” Without hesitation, this school leader identified the importance of professional development for using instructional technology effectively. While some schools have been able to give teachers the support to improve their digital literacy, others identify this as an essential challenge still to address.

21st Century Learning

Intrapersonal skills (e.g., growth mindset, self-efficacy, resilience) and interpersonal skills (e.g., communication, collaboration, conflict resolution) are critical for 21st-century success.

“I think a large part of engagement includes that collaborative discussion piece, not just passively listening, but the ability to converse about a project—to listen to your partner and ask those clarifying questions.” Other focus group participants built on this contribution and shared that the ability to engage in this way is becoming more critical as lessons increasingly provide opportunities for collaboration and group work. While the primary focus of the field research was to understand engagement in the digital learning environment, these intrapersonal and interpersonal skills continually emerged as essential for success in the classroom today.

To be successful today, students need to develop the critical-thinking and problem-solving skills required to exercise agency and self-direct their learning.

Whether describing the need for students to evaluate the quality of a cited source, know where to go to learn a new math skill, or set relevant learning goals, participants in the field research reiterated that students must develop the critical-thinking and problem-solving skills necessary for self-directed learning. One of the ways schools are doing this is by making grades more transparent and readily available online. Explaining the goal of increasing the availability of assessment scores, one school leader said, “It puts the ownership back on them. There is not this mystery as to why you’re failing. This way, we can promote ownership of learning by teaching them to be more responsible.”

Teaching digital literacy to students includes a broad spectrum of skills.

As teachers considered the range of skills necessary to succeed in the future, digital literacy emerged as a common trend; however, the definitions and examples of digital literacy spanned a wide spectrum. At one end of the spectrum were educators focused on the physical skills of using a mouse; at the other were administrators focused on developing an awareness of the nuanced social norms of different platforms. One school leader explained, “I mean, just like literacy—a life literacy, like digital literacy. You know? Like, are you able to engage and understand the impact that this information is having on you and the impact you are having on it?” In this instance, the school leader was no longer framing digital literacy as a distinct literacy, but rather as an integral part of interacting with information and navigating the digital world today.

Core Beliefs

As great as EdTech is, there is no one silver bullet for improving education.

Participants often highlighted the myriad factors outside the classroom that shape a student’s learning experience, underscored the impact of resource constraints on public education, and emphasized the complexity and variability of student needs. One IT Admin captured this theme succinctly. When asked, “Is there anything else you want to make sure we capture?”, the IT Admin replied, “There is no silver bullet, and there is no one tool or suite of tools that are going to accomplish this work.”

Technology will never be able to replace a teacher.

This sentiment came up again and again from both school leaders and teachers, and it reflects a level of apprehension about the role of education technology. One middle school leader shared, “The teacher still plays a crucial role. We’ve seen those extremes. Neither are good. The successful classrooms are just the right balance. The digital platform should be a tool rather than [trying to be] the teacher.”

Defining, observing, and measuring engagement is ambiguous and difficult.

“I actually think we are prone to misperceive when a student is off-task,” one teacher shared. The teacher elaborated with examples of how even the most experienced teachers can misinterpret student behavior when they lack context, such as a student having had a traumatic experience the day before. In such a case, a student may look disengaged when, in reality, they are engaged in their learning overall but simply distracted by the prior day’s experience. Further complicating the matter, some students intentionally lead teachers to perceive a certain level of engagement. One school leader provided the following example: “Just like your kids know how to fake read with a book, they know how to do it online too.” In these cases, determining whether a student is engaged becomes increasingly difficult because students deliberately display the behaviors indicative of engagement.

Things to Recognize

Indicators of Engagement

The themes in this section focus specifically on the indicators of engagement. These indicators typically emerged in response to questions like, “What does engagement look like?”, “How do you know when your peers are engaged?”, or “What does engagement sound like in your highest performing classes?” Despite the diversity of participants in the field research, the responses to these questions were remarkably consistent. In fact, after reviewing the field research findings, we found that the indicators of engagement fall into four categories: behavioral, cognitive, social-emotional, and classroom collective.

Behavioral

Behavioral indicators are composed of the active and passive actions students take when engaging with or disengaging from a learning activity. According to the literature, behavioral engagement has been defined as “participation, effort, attention, persistence, positive conduct, and the absence of disruptive behavior” [5].

In the field research, active behavioral engagement looked like students choosing to remain on task amidst distractions, while passive engagement looked like students facing the teacher while paying attention to a presentation. In the first example, the student intentionally takes an action that keeps them engaged in the learning by ignoring potential distractions; in the second, looking in the direction of the teacher is simply a habitual side effect of paying attention to what the teacher is saying. Both remaining on task and looking toward the teacher exemplify the types of indicators in this category.

Just as there are active and passive actions that indicate engagement, there are also active and passive indicators of disengagement. Passive disengagement might look like fidgeting with a pencil, while active disengagement might look like a student working on an assignment for another class. As with the behavioral indicators of engagement, the distinction between active and passive disengagement lies in the level of automaticity and intention. In sum, the behavioral indicators of engagement can be separated into passive engagement, active engagement, passive disengagement, and active disengagement.

Cognitive

The cognitive indicators capture the intellectual habits of engaged students. The indicators in this section typify the actions of students who intentionally undertake the thought processes required to understand complex ideas and progress beyond a basic understanding of the information (Finn & Zimmer, 2012; Mahatmya, Lohman, & Farb, 2012).

Examples of cognitive engagement shared by the field research participants include asking questions, making connections to prior knowledge, and going above and beyond what is required.

Like the behavioral indicators, the cognitive indicators are also observable actions, but they are differentiated by the deliberate investment made to understand the material [3, 12].

Social-Emotional

The indicators in this section are composed of both the social behaviors and the emotions associated with feelings of involvement in the learning process and the school community. In the literature, social indicators of engagement can refer to the extent to which students adhere to the official and unofficial norms of the school and classroom [3]; meanwhile, emotional engagement, sometimes called affective engagement, is characterized by feelings toward the learning process and the belief that education is a worthwhile pursuit [3, 12]. In the field research, these two concepts were often discussed in tandem, so we have grouped them together in our findings.

Examples from the field research include students demonstrating a sense of joy in the learning process or having a robust network of adults they engage with in the school community.

Classroom Collective

This final category reflects indicators of engagement that exist at the classroom level. While much of the literature on engagement examines indicators that appear in individual students, one of the key findings from the field research was the consistent description of indicators that show up in the collective behavior of the class.

For example, a number of educators from schools in different states who taught different subjects and grade levels used the same terminology to describe the palpable sensation in an engaged classroom: “a buzz.”

Additionally, a number of teachers across distinct field research sessions talked about how an engaged classroom simply sounds different. Based on the field research, it appears that some indicators of engagement exist at the classroom level.
