Research shows that many conditions affect students’ desire and ability to learn. When deciding which learning conditions to focus on, PERTS considers several factors:
Even when educators recognize that specific learning conditions are important for student success—and even when educators try their best to create these conditions—doing so can be challenging. One of the main barriers is that, often, the messages educators intend to send are not the messages students receive. In other words, there’s often a big gap between educators’ intentions and students’ experiences. Absent feedback from students, it can be very hard to recognize or close that gap.
Copilot makes it easy for educators to collect student feedback using validated measures. It also protects students’ confidentiality so that they feel comfortable being honest.
Students are asked carefully designed questions that have been developed and tested by leading researchers. The questions help educators identify how students’ experiences in the classroom could be supporting or getting in the way of learning. This enables educators to take targeted actions to influence these experiences and, thereby, improve learning conditions over time.
Copilot surveys ask students several questions in addition to the learning condition questions described above. Some of these questions enable us to disaggregate data in different ways (e.g., to show educators how students from different groups might experience a learning environment differently). To keep the surveys short, most of these questions are asked only periodically or only of random subsets of students. These questions inquire about students’ race, gender, prior grades, learning mindsets, comfort answering questions honestly, level of effort in a given class, perceptions of their teachers’ mindsets, and other similar factors. You can always preview the survey.
To preview the student survey, sign into Copilot, click “Settings” on the navigation bar on the left, then click “Survey Settings,” and then click “Preview Survey.”
Fostering student success is less like turning on a light switch, and more like growing a garden. It’s a developmental process that takes time and is influenced by multiple factors:
Copilot is free to U.S. schools and colleges thanks to support from the Raikes Foundation, Overdeck Foundation, and Gates Foundation. For more information, please contact email@example.com.
Most Copilot surveys are appropriate for students in grades six and up. Reading comprehension is the main barrier to survey participation for younger students. Copilot-Elevate is optimized for students in grades 6-12, while the College Student Experience Tracker (C-SET) is designed for college students of any age.
Our surveys are primarily designed for use by academic instructors of math, humanities, foreign languages, and the physical and social sciences. Regardless of subject area, we strongly recommend against surveying students in more than four of their classes at the same time in order to avoid survey fatigue.
Our materials are only available in English at this time.
Copilot is an advanced professional learning platform that helps educators create supportive and equitable conditions for learning. Copilot enables educators to get rapid feedback from their students about how they are experiencing key learning conditions. Pilot data suggests that Copilot helps educators systematically improve learning conditions and that better learning conditions promote higher and more equitable academic achievement. You can access Copilot at copilot.perts.net.
It takes significant skill to create a supportive and equitable learning environment that enables all students to reach their potential. To master that complex skill, educators need ongoing feedback and practice. That’s why Copilot is set up as an iterative process, organized around multiple cycles of inquiry and action.
In each cycle, educators start by collecting feedback from their students through a student survey. Then educators can investigate and test ways to improve those experiences. In the next cycle, educators use the survey to see if their test was successful and to continue to make adjustments to their practice. Cycles typically last 2-6 weeks, depending on the preferences of the educators using Copilot.
In the Copilot “Settings” screen for your project, the project lead or host will see a section called “Survey.” In this section, they can control which learning conditions are measured in the survey by turning specific learning conditions on or off. In the C-SET program, measures cannot yet be turned off.
Students get fatigued by surveys that are long and repetitive. They also grow tired of answering questions if no one acknowledges their responses or addresses their concerns. Unfortunately, many school surveys are designed without regard to students’ or teachers’ experiences, and we (at PERTS) find that troubling.
We would be frustrated if someone asked our opinion over and over in five different ways and then gave no indication that anything meaningful would change as a result of our answers. We empathize with the students and with the educators who are sometimes required to take, or to administer, less-than-useful surveys.
In sharp contrast, we designed our surveys to be part of an empowering process that gives students voice and provides educators with rapid insights that they can act on immediately. Unlike many other surveys, Copilot surveys:
Set up in advance. Make sure each of your students has access to an internet-enabled device like a laptop, tablet, or smartphone. Students do not necessarily need to take the survey at the same time, so it is not necessary for each student to have a device, as long as they can each access a device at some point. Finally, make sure your students have their Roster ID handy so that they can log on to take the survey.
Administering the survey. It’s important to appropriately introduce students to the survey so that they understand and buy into its purpose. You can find detailed recommendations for administering the survey (including unique sign-in codes) by following these steps:
Handling makeups. Students who were absent can take the survey at a different time, as long as it is within the same cycle.
Receiving reports. You should receive new reports on the Monday following your student survey.
Debriefing. In order to get authentic feedback from your students consistently, students need to trust that their responses are being taken to heart. Intentionally debriefing your report reflections with students can contribute to cultivating that trust—it demonstrates to students that you value their voice and care about their experiences in your class. Here are a few suggestions to engage students in this type of dialogue.
In our school system—and in society more generally—systematic disparities exist between the opportunities afforded to members of different groups. Disparities in opportunity are especially pronounced by race, gender, and socioeconomic status. For example, Black, Latinx, and Native American students are less likely to receive the support they need to reach high academic standards. Copilot reports help educators compare the experiences of members of different groups so that, if significant disparities exist, educators can take additional steps to understand and mitigate those disparities.
By default, Copilot reports disaggregate student experience data by gender and by race-ethnicity group membership. Certain Copilot programs also disaggregate results by financial stress, while others enable educators to create a custom target group.
The first time that students complete a Copilot survey, they are asked what racial and ethnic groups they identify with. Our priorities in asking students to identify their racial and ethnic background are (1) to help educators understand how students from structurally disadvantaged groups are experiencing the educational environment; and (2) to provide students from all racial, ethnic, and nationality backgrounds the chance to select at least one option that closely matches their self-identification and lived experience. In doing this, we recognize that the experiences of sub-populations within large race and ethnicity categories may be meaningfully different from one another and that it is important to reflect to students an understanding of this diversity, rather than forcing them to identify with broad categories that may not resonate well with their lived experiences. For these reasons, the Copilot survey asks about race and ethnic group membership using the following question.
With which group(s) do you identify? Please select the box(es) that apply.
When students self-identify their race and ethnicity in PERTS surveys, they can choose from 17 different categories and identify with multiple races (see How are race-ethnicity data collected?). However, results in reports are not broken out by those fine-grained categories because that level of disaggregation would make it impossible to maintain the confidentiality of students’ responses. (To maintain students’ privacy, Copilot reports only show results for a given group when that group has at least five members. That means, for example, that reports could not show results from Native American students if there were fewer than five Native American students on a given roster.)
Even though we cannot break out data by each distinct race and ethnicity for privacy reasons, we believe it is important to give educators insight into whether students’ experiences in their classrooms are reinforcing or interrupting the nationally observed disparities in educational opportunity.
As a compromise between the need to maintain student confidentiality and the desire to provide teachers with disaggregated data about opportunity gaps, Copilot reports group together certain races and ethnicities. Using national statistics on disparities in academic and disciplinary outcomes as a guide, reports group together students who self-identify as Black, Latinx, and/or Native American. Reports contrast their experiences to those of White and Asian students (who are comparatively advantaged according to the same national statistics).
This compromise enables educators to get some insight about opportunity gaps in their classrooms while simultaneously protecting student confidentiality. In order to support more fine-grained analysis, certain Copilot programs enable educators to create a custom target group that more accurately represents the groups of students in their local context who are situated farthest from opportunity.
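The grouping and small-group suppression rules described above can be sketched as follows. The five-student threshold and the two report groups come from this document; the function names, the exact category labels, and the handling of students outside both groups are illustrative assumptions, not Copilot’s actual implementation.

```python
from collections import Counter

# Threshold from the confidentiality rule: results for a group are
# shown only when that group has at least five members on a roster.
MIN_GROUP_SIZE = 5

# Assumption: simplified labels for the report grouping described above.
STRUCTURALLY_DISADVANTAGED = {"Black", "Latinx", "Native American"}

def report_group(identities):
    """Assign a student to a report group based on self-identified groups.

    Assumption: students who identify with none of the listed categories
    are placed in the comparison group here; real handling may differ.
    """
    if any(i in STRUCTURALLY_DISADVANTAGED for i in identities):
        return "Black, Latinx, and/or Native American"
    return "White and Asian"

def disaggregate(roster):
    """Count report-group membership, suppressing groups under the threshold."""
    counts = Counter(report_group(ids) for ids in roster)
    # None marks a suppressed (too-small) group rather than reporting a count.
    return {group: (n if n >= MIN_GROUP_SIZE else None)
            for group, n in counts.items()}
```

For example, a roster with five students in the first group and two in the second would report the count of 5 but suppress the group of 2, since it falls below the five-student threshold.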
In many schools and colleges, specific groups of students are situated further from opportunity. Often, these students are members of specific racial, ethnic, or gender groups; English language learners; or students who are members of multiple intersecting demographics (for example, men of color). If educators are aware that a specific group of students is situated further from opportunity in their context and want to disaggregate their results separately to better attend to their experiences, we encourage them to use the “Target Group” feature in Copilot to define a custom target group that specifically includes those students. The Target Groups feature is intended to help educators adopt a Targeted Universalist approach towards creating a more supportive and equitable learning environment.
When a target group is set, reports will display results separately for students in the target group and for those who are not. This enables educators to see whether members of the target group experience their classroom differently. In this way, educators can recognize whether and how students’ experiences in their own classes may be reinforcing—or mitigating—the opportunity gaps observed in other data.
To set up a target group, the project host should:
Note that the target group feature is not yet available in C-SET.
We take privacy extremely seriously for both students and educators.
To protect educator privacy, each roster entered into Copilot has a main contact who receives survey instructions and reports for that roster. The only person in a Copilot project who can see a roster’s reports is the main contact of that roster. PERTS encourages project members to share and discuss their reports with colleagues in order to get advice and share insights to the extent they feel comfortable doing so.
Note: it is possible for the project host or project lead to change the main contact setting for a roster. This means that project host Alice could, in theory, change the main contact of Bob’s roster to be Alice herself, which would allow Alice to see Bob’s reports. However, in that event, Bob would get an automated email informing him that Alice changed his roster settings.
We generally do not share individual students’ responses (with the exception of open ended responses) because doing so could incentivize students to be dishonest and destroy the integrity of the process. Of course, we encourage you to have frank face-to-face conversations with your students, but one of the goals of the surveys is to give you an easy, confidential way to get honest feedback.
PERTS strongly recommends that you use students’ school email addresses as their Roster IDs. This will make it easier for students to remember their Roster ID, and it will prevent ID collisions (where two students are accidentally given the same ID, resulting in their responses overwriting each other).
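As a quick sanity check before surveying, a roster can be scanned for duplicate IDs. The helper below is an illustrative sketch, not part of Copilot; the case-insensitive comparison is an assumption made for the example.

```python
from collections import Counter

def find_id_collisions(roster_ids):
    """Return any Roster IDs that appear more than once.

    Colliding IDs would cause students' survey responses to overwrite
    each other, so they should be resolved before students take the survey.
    """
    # Assumption: IDs are compared case-insensitively and ignoring whitespace.
    counts = Counter(i.strip().lower() for i in roster_ids)
    return sorted(i for i, n in counts.items() if n > 1)
```

Using school email addresses as Roster IDs makes such collisions far less likely, since each student’s email address is already unique.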
Schools and colleges have different rules about when an Institutional Review Board (IRB) needs to review and approve an activity. Typically, IRBs only need to review formal research activities that are intended to result in generalizable knowledge. In contrast, IRBs typically do not need to review quality improvement activities—activities that are intended to support an organization in improving its own practices and processes. PERTS Copilot programs are intended to help educators collect and use information for quality improvement purposes; therefore, IRB permission should not be required under most circumstances. However, if you intend to use the information you collect for formal research purposes, then you probably do need IRB approval. Also, some schools and colleges ask IRBs to review even quality improvement activities.
A Copilot “community” is a group of projects that can be administered together by a community administrator. For example, say that three different instructors in the same math department each have their own Copilot project so that they can decide on their own measures and cycle cadence. The chair of the math department may set up a new community and ask those instructors to attach their projects to that math department community. Doing so would enable the department chair (and any other community administrators) to review and compare results across each of those projects and to adjust project settings, as needed.
When a project joins a community, it gives the community’s administrator(s) the same permissions over its settings and report data as are granted to the project lead or host.
Community administrators are able to:
Yes, you can associate a project with up to 5 communities. For example, maybe you want your project to be accessible to an instructional coach who works with three projects and also to a district leader who supervises twenty projects.
Survey reports prominently flag problems with student buy-in. For example, they show if students are not comfortable being honest or if they do not believe their educator will use the data to improve their learning experiences. If an educator is struggling to get high student buy-in, we recommend trying some of these strategies:
If you have issues accessing Copilot, a firewall may be blocking access to the site. Check with your IT department to see if they can resolve the issue. You can test your network access here: perts.net/network-test.
Copilot was developed by the Project for Education Research That Scales (PERTS) in partnership with leading researchers and educational advocacy organizations. PERTS is a non-profit research and development institute that translates insights from psychological science into cutting-edge tools, measures, and recommendations that educators anywhere can use to foster student success effectively and equitably.
Susan Colby, Founder & CEO, Imagine Worldwide
The College Transition Collaborative
Carol Dweck, Professor, Stanford University
Camille Farrington, Managing Director, UChicago Consortium for School Research
Becky Margiotta, Principal, Billions Institute
The National Equity Project
Jason Okonofua, Assistant Professor, UC Berkeley
University of Chicago Consortium for School Research
Greg Walton, Associate Professor, Stanford University
The Raikes Foundation
The Overdeck Family Foundation
The Bill & Melinda Gates Foundation