When we talk about engagement, we mean all the actions that students must take in order to learn. Engagement includes paying attention in class, doing homework, studying, asking questions, persisting when work is challenging, and many other behaviors that are critical for learning.
With rare exceptions, students only learn when they engage. This means that academically engaged students are far more likely to learn and succeed in school. Of course, it’s not just about grades or test scores—engaged students are better positioned for a productive, healthy, and intellectually stimulating life. Fortunately, there’s a great deal of research showing what learning conditions cultivate students’ engagement, and, thereby, set them on the path to a more productive, enriching, and secure life.
Simply put, learning conditions are the internal and external experiences students have as they learn. Research shows that many conditions affect students’ desire and ability to engage with learning. When deciding which learning conditions to focus on, PERTS considered several factors:
If we know why students are disengaged, it is much easier to take targeted actions to re-engage them. PERTS helps educators assess students’ perceptions of learning conditions that influence engagement so that educators can zero in on the causes of disengagement and more effectively target their efforts to foster engagement.
We ask students carefully designed questions that have been tested by Stanford University researchers with students and teachers. The questions help educators identify how students’ classroom experiences could be getting in the way of learning. This enables educators to take better targeted actions to influence these experiences and, thereby, increase students’ engagement over time.
These questions enable educators to understand whether or not their students are experiencing a specific learning condition. For example, we ask students to agree or disagree with statements like, “I feel like my teacher is glad that I am in their class.” All of the learning condition questions are listed alongside their learning condition at perts.net/conditions.
We also ask students to explain their answer to one randomly selected question. This provides teachers with more insight into what’s driving students’ answers. And it prompts students to slow down and be thoughtful about their answers.
We designed open-ended questions for each of the learning conditions, but educators can choose which ones to ask their students.
PERTS surveys periodically ask students several questions in addition to the learning condition questions described above. Some of these questions enable us to disaggregate data in different ways (e.g., to show teachers how students from different groups might experience their class differently). Other questions enable us to continuously improve the quality of the questions we use—or to prepare more comprehensive reports for communities that request them. To keep the surveys short, most of these questions are only asked periodically or only of random subsets of students. These questions inquire about students’ race, gender, prior grades, learning mindsets, comfort answering questions honestly, level of effort in a given class, perceptions of their teachers’ mindsets, and other similar factors. You can always preview the survey.
You can preview the survey that your team has configured by signing into Copilot, selecting your team, clicking “Settings,” then “Survey Settings,” and then “Preview Survey.”
Fostering student engagement is less like turning on a light switch, and more like growing a garden. It’s a developmental process that takes time and is influenced by multiple factors:
Participation is free for teacher teams (of up to 8 teachers). We also offer additional services to teams who would like more support and training. Teams can ask a professional facilitator to train staff at their school or district. Additionally, teams can set up a multi-team community license (more than 2 teacher teams participating at the same organization). For more information, please contact firstname.lastname@example.org and put the program name in the subject line.
Our surveys are appropriate for students in grades six and up. Reading comprehension of surveys is the main barrier to younger students participating.
Our surveys are primarily designed for use by academic teachers: those teaching math, ELA, science, social studies, and foreign languages. However, certain learning conditions are highly relevant to non-academic subjects as well. For example, many advisory period teachers rightfully focus on developing strong relationships with their students, and they may find it helpful to assess students’ perceptions of “teacher caring.” Regardless of subject area, we strongly recommend against surveying students in more than 4 of their classes at the same time because of survey fatigue.
Any teacher who supervises one or more classes of appropriately aged students can participate. Teams can have no more than 8 teachers. Typically, a team is made up of 2-8 teachers who teach the same subject or grade-level, but that is not required. In some cases, a teacher will work one-on-one on a team with their instructional coach.
Our materials are only available in English at this time.
PERTS Copilot is the online platform that hosts PERTS Programs. Teachers use it to collect student surveys, schedule meetings, and generate reports. You can access Copilot at copilot.perts.net.
PERTS Copilot programs are organized in cycles of inquiry and action. Cycles typically last 2-6 weeks, depending on the cadence preferred by the team. In order for reports to display properly, teams must enter start and end dates for each cycle.
In Copilot “Team Settings,” the captain will see a section called “Survey.” In this section, teams can control which learning conditions their survey measures. If your team decides not to ask about a particular learning condition, you can switch it off.
Students (and adults too) get fatigued by surveys that are long and repetitive. They also grow tired of answering questions if no one acknowledges their responses or addresses their concerns. Unfortunately, many school surveys are designed without regard to students’ or teachers’ experiences, and we (at PERTS) find that troubling.
We would be frustrated if someone asked our opinion over and over five different ways and then gave no indication that anything meaningful would change as a result of our answers. We empathize with the students and with the teachers who are sometimes required to take, or to administer, less-than-useful surveys.
In sharp contrast, we designed our surveys to be part of an empowering process that gives students voice and provides teachers with rapid insights that they can act on immediately. The surveys do two things:
Set up in advance. Make sure each of your students can access an internet-enabled device like a laptop, tablet, or smartphone. Students do not all need to take the survey at the same time, so you do not need one device per student, as long as each student can access a device at some point. Finally, make sure your students have their Roster IDs handy so that they can log on to take the survey.
Administering the survey. It’s important to appropriately introduce students to the survey so that they understand and buy into its purpose. You can find detailed recommendations for administering the survey (including unique sign-in codes for each of your classes) by following these steps:
If your team has notifications enabled, you will also receive a direct link to your survey instructions each week that surveys are scheduled.
Handling makeups. Students who were absent can take the survey at a different time, as long as it is within the same cycle.
Receiving reports. You should receive new reports on the Monday afternoon following your student survey.
Please see our sample PERTS Copilot report.
In our school system—and in society more generally—systematic disparities exist between the opportunities afforded to members of different groups. Disparities in opportunity are especially pronounced by race, gender, and socioeconomic status. For example, Black, Latinx, and Native American students are more likely to be suspended than their White peers for the same offenses. They are also less likely to receive the support they need to reach high academic standards. PERTS reports help educators compare the experiences of members of different groups so that, if significant disparities exist, educators can take additional steps to understand and mitigate those disparities.
When students self-identify their race and ethnicity in PERTS surveys, they can choose from 17 different categories and identify with multiple races. However, results in reports are not broken out by those fine-grained categories because that level of disaggregation would make it impossible to maintain the confidentiality of students’ responses. (To maintain students’ privacy, PERTS Copilot reports only show results for a given group when that group has at least five members. That means, for example, that reports could not show results from Native American students if there were fewer than five Native American students in a given class.)
Even though we cannot break out data by each distinct race and ethnicity for privacy reasons, we believe it is important to give educators insight into whether students’ experiences in their classrooms are reinforcing or interrupting the nationally observed disparities in educational opportunity.
As a compromise between the need to maintain student confidentiality and the desire to provide teachers with disaggregated data about opportunity gaps, PERTS Copilot reports group together certain races and ethnicities. Using national statistics on disparities in academic and disciplinary outcomes as a guide, reports group together students who self-identify as Black, Latinx, and/or Native American. Reports contrast their experiences to those of White and Asian students (who are comparatively advantaged according to the same national statistics).
This compromise enables educators to get some insight about opportunity gaps in their classrooms while simultaneously protecting student confidentiality. However, to state the obvious, this approach is limited because the experiences of White students are likely to be different from those of Asian students, and the experiences of Black, Latinx, and/or Native American students are likely to differ as well. Furthermore, opportunity gaps at a particular school might not mirror the national trends. Therefore, we encourage teams to set a custom target group that more accurately represents the groups of students in their own classes who are situated farthest from opportunity.
In many schools and classrooms, specific groups of students are situated further from opportunity. Often, these students are members of specific racial, ethnic, or gender groups, English language learners, or students who are members of multiple demographics (for example, boys of color). If teachers are aware that a specific group of students is situated further from opportunity, we encourage them to use the “Target Group” feature in Copilot to define a custom target group that includes those students.
When a target group is set, reports will display results separately for students in the target group and for those who are not. This enables teachers to see whether members of the target group experience their classroom differently. In this way, teachers can recognize whether and how students’ experiences in their own classes may be reinforcing—or mitigating—the opportunity gaps observed in other data. For example, research suggests that teachers who successfully convey a sense of care to African American students are significantly less likely to elicit the defiant behaviors for which African American students are disproportionately disciplined. This implies that teachers can mitigate disproportionate rates of suspensions and referrals by being systematic and intentional about making sure that African American students feel cared for.
The Target Groups feature is intended to support such efforts—to help teachers implement a Targeted Universalist approach towards creating a more equitable classroom climate that fosters engagement in all students.
To set up a target group, the team captain should:
We take privacy extremely seriously for both students and teachers.
To protect teacher privacy, each classroom entered into Copilot has a main contact who receives survey instructions and reports for that classroom. The only person on a Copilot team who can see a classroom’s reports is the main contact of that classroom. PERTS encourages team members to share and discuss their reports in Cycle Meetings to the extent they feel comfortable doing so.
One side note: it is possible for the team captain and team sponsor to change the settings for any classroom on Copilot, including the main contact. This means that team captain Alice could, in theory, change the main contact of Bob’s classroom to be Alice herself, which would allow Alice to see Bob’s reports. However, Bob will automatically get an email informing him that Alice changed his classroom settings. In other words, there is no way to see another teammate’s class-level reports without that teammate knowing that you did so.
To protect student privacy and to remove incentives for student dishonesty, Copilot takes several measures to prevent the disclosure of individual students’ responses. With the exception of open response data, we never report data from individual students or subdivisions of students, e.g., by race, if any resulting subgroup would be smaller than 5 students. In some cases, we may also "blur" or "perturb" data (see below) to prevent unintended disclosure. For example, we prevent any metric from showing up as zero percent because that would "give away" all students' responses. Of course, we are careful not to distort data in any way that would meaningfully alter its interpretation.
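The small-group suppression rule and the zero-percent floor described above can be sketched in a few lines. This is a simplified illustration, not PERTS’s actual implementation; only the threshold of 5 and the no-zero-percent rule come from the text, and the function name and clamping range are assumptions:

```python
# Simplified sketch of small-cell suppression, assuming a minimum
# reportable group size of 5 (per the text above). Not actual
# PERTS Copilot code.

MIN_GROUP_SIZE = 5

def summarize_group(responses):
    """Return the percent of positive responses for a group,
    or None if the group is too small to report safely."""
    if len(responses) < MIN_GROUP_SIZE:
        return None  # suppress: reporting could identify individuals
    pct = 100 * sum(responses) / len(responses)
    # Avoid reporting exactly 0% or 100%, either of which would
    # "give away" every individual's answer; clamp to 1-99 instead.
    return min(max(round(pct), 1), 99)

print(summarize_group([1, 0, 1, 1, 0, 1]))  # 67 (large enough to report)
print(summarize_group([0, 0, 0]))           # None (too small, suppressed)
```

The clamping step is one crude form of the "perturbation" idea discussed below: the reported number is nudged just enough to protect individuals without meaningfully changing its interpretation.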
Blurring is a disclosure limitation method which is used to reduce the precision of the disclosed data to minimize the certainty of individual identification. There are many possible ways to implement blurring, such as by converting continuous data elements into categorical data elements (e.g., creating categories that subsume unique cases), aggregating data across small groups of respondents, and reporting rounded values and ranges instead of exact counts to reduce the certainty of identification. Another approach involves replacing an individual’s actual reported value with the average group value; it may be performed on more than one variable with different groupings for each variable.
Perturbation is a disclosure limitation method which involves making small changes to the data to prevent identification of individuals from unique or rare population groups. Data perturbation is a data masking technique in that it is used to "mask" the original values in a data set to avoid disclosure. Examples of this statistical technique include swapping data among individual cells to introduce uncertainty, so that the data user does not know whether the real data values correspond to certain records, and introducing "noise," or errors (e.g., by randomly misclassifying values of a categorical variable).
— From Privacy Technical Assistance Center (2012). Data De-identification: An Overview of Basic Terms.
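The blurring and perturbation methods quoted above can be illustrated with a toy sketch. These are generic illustrations of the quoted techniques, not code from PERTS or the PTAC document, and all names and parameters are assumptions:

```python
import random

# Toy illustrations of two disclosure-limitation methods:
# blurring (report a rounded range, not an exact value) and
# perturbation (add small random noise so exact individual
# responses cannot be recovered).

def blur_to_range(value, width=10):
    """Blurring: replace an exact count with a rounded range."""
    low = (value // width) * width
    return f"{low}-{low + width}"

def perturb(values, noise=1, seed=0):
    """Perturbation: shift each value by a small random amount."""
    rng = random.Random(seed)
    return [v + rng.randint(-noise, noise) for v in values]

print(blur_to_range(73))      # "70-80"
print(perturb([4, 5, 3, 4]))  # each value shifted by at most ±1
```

In practice, the amount of blurring or noise is chosen so that individuals cannot be identified but the data’s interpretation is not meaningfully altered, as the section above notes.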
We do not permit anyone to see individual students’ responses (with the exception of open-ended responses) because doing so could incentivize students to be dishonest and destroy the integrity of the process. Of course, we encourage you to have frank face-to-face conversations with your students, but one of the goals of the surveys is to give you an easy, confidential way to get honest feedback.
In order to track your students’ data over time and accurately show you how scores change, Copilot needs a way to recognize and distinguish between the students in your class — and across multiple classes on a team. To facilitate this, teachers set up rosters of students for their classes.
PERTS strongly recommends that you use students’ school email addresses as their Roster IDs. This will make it easier for students to remember their Roster ID, and it will prevent ID conflicts (where two teachers on the same team inadvertently use the same ID for different students, resulting in garbled results).
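The ID-conflict problem described above can be caught with a quick uniqueness check across a team’s rosters. This is a hypothetical sketch of the idea, not how Copilot works internally; the function, data structures, and sample data are all illustrative:

```python
# Hypothetical sketch: detect Roster IDs that two teachers on the
# same team have assigned to *different* students. Using school
# email addresses as Roster IDs makes such collisions unlikely.

def find_conflicts(rosters):
    """rosters maps teacher name -> {roster_id: student_name}.
    Returns the set of IDs used for different students."""
    seen = {}  # roster_id -> student name first seen with that ID
    conflicts = set()
    for teacher, roster in rosters.items():
        for rid, student in roster.items():
            if rid in seen and seen[rid] != student:
                conflicts.add(rid)
            seen.setdefault(rid, student)
    return conflicts

rosters = {
    "Ms. A": {"jdoe@school.org": "Jordan Doe", "id042": "Sam Lee"},
    "Mr. B": {"id042": "Alex Kim"},  # same ID, different student
}
print(find_conflicts(rosters))  # {'id042'}
```

The short numeric ID collides while the email-based ID does not, which is exactly why email addresses are the recommended choice.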
Educators using the PERTS Copilot platform typically organize themselves onto teams. However, sometimes an instructional leader or coach may want to group several teams together into a community so that they can monitor and support progress across all of them. For example, say that 24 teachers at the same middle school are all using Copilot. They form three separate teams of 8 teachers — one corresponding to each grade level. If an instructional leader or coach would like to track progress across all three teams, the best way to do that would be to group those teams into a community.
When a team joins a community, it gives the community’s administrator(s) the same permissions over its team settings and report data as are granted to the team’s captain.
Community administrators are able to:
Community administrators are not able to:
Yes, you can associate a team with up to 5 communities. For example, maybe you want your team to be accessible to an instructional coach who works with three teams and also to a district leader who works with twenty teams.
Survey reports prominently flag problems with student buy-in. For example, they show if students are not comfortable being honest or if they do not believe their teacher will use the data to improve their learning experiences. If a teacher is struggling to get high student buy-in, we recommend their teammates help them by trying some of these strategies:
If you have issues accessing Copilot, your district may have a firewall that is blocking access to the site. Check with your IT department to see if they can resolve the issue. You can test your network access here: perts.net/network-test.
Copilot and the learning condition questions and resources were developed by the Project for Education Research that Scales (PERTS), an applied research center at Stanford University. PERTS helps educators apply insights from the learning and developmental sciences so that every learner gets the support and stimulation they need to thrive.
PERTS works with developmental scientists and educators across the nation to build tools and services that help schools foster student engagement and success. Through our work with schools, teachers, and other educational organizations, millions of students around the world have benefited from evidence-based practices.
Susan Colby, Founder & CEO, Imagine Worldwide
Carol Dweck, Professor, Stanford University
Camille Farrington, Managing Director, UChicago Consortium for School Research
Becky Margiotta, Principal, Billions Institute
Jason Okonofua, Assistant Professor, UC Berkeley
Greg Walton, Associate Professor, Stanford University
The Raikes Foundation
The Overdeck Family Foundation