TECHNICAL PAPER

Participation Measure

Participation Measure as a Practical Measure

Below, each characteristic of a practical measure is paired with a description of how the Participation Measure reflects it.
Is closely tied to a theory of improvement

Following the increase in educational exclusion during the pandemic, the Un Buen Comienzo (UBC) Improvement Network shifted its focus from attendance to participation and set an aim of 80% of preschool and kindergarten children participating in at least three learning activities each week. The Participation Spreadsheet/Tracker tracks this leading outcome measure: it monitors each student's engagement in learning experiences and helps teachers, school administrators, and network leaders identify individual students who are falling behind, or at risk of not finishing successfully, so they can develop personalized interventions. Because tracking engagement in learning activities directly supports the network's efforts to prevent further educational exclusion, the measure is closely tied to its theory of improvement.
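To make the tracker's weekly check concrete, here is a minimal sketch of the goal logic, assuming a simple record of activities per child per week. The record fields and helper names are illustrative assumptions, not the UBC tracker's actual spreadsheet schema.

    from dataclasses import dataclass

    # Hypothetical shape of one child's logged participation for one week;
    # field names are illustrative, not the network's actual schema.
    @dataclass
    class WeeklyRecord:
        student_id: str
        classroom: str
        school: str
        week: int
        activities: int  # learning activities the child joined that week

    WEEKLY_GOAL = 3  # the network's aim: at least three activities per week

    def meets_goal(record: WeeklyRecord) -> bool:
        """True if the child reached the weekly participation aim."""
        return record.activities >= WEEKLY_GOAL

    def is_excluded(record: WeeklyRecord) -> bool:
        """True if the child logged no participation at all that week."""
        return record.activities == 0
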
Provides actionable information to drive positive changes in practice

This measure relies on teachers to collect weekly data on each student's participation. Using this information, the hub team generates weekly classroom- and school-level reports to share with teachers and school leaders. School teams can then review the disaggregated data to understand patterns in participation and discern which classroom practices enable student engagement and which hinder it. The participation data are also shared in monthly district meetings, led by trained participation/attendance managers, to engage school teams in collective sense-making, surface bright spots and blind spots, and identify timely strategies for improving participation.
Captures variability in performance

Once teachers record the student participation data, the Participation Spreadsheet/Tracker automatically feeds weekly Data Studio reports that record student participation rates across classrooms, schools, districts, and the entire network over time. The data populate a run chart, included in the weekly reports shared with teachers, that monitors the percentage of students who achieved the participation goal each week and the percentage with a 0% participation rate. Because this visualization captures the different participation rates by school and classroom, teachers and school leaders can use it to attend to the variability in student performance.
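Reusing the WeeklyRecord and helpers from the sketch above, the numbers behind the run chart and the per-classroom or per-school breakdowns reduce to simple grouped percentages. This is a sketch of that arithmetic under the same assumptions, not the network's actual Data Studio pipeline.

    from collections import defaultdict

    def run_chart_series(records):
        """Per-week percentages plotted on the run chart: children meeting
        the three-activity goal and children with 0% participation."""
        by_week = defaultdict(list)
        for r in records:
            by_week[r.week].append(r)
        series = {}
        for week, group in sorted(by_week.items()):
            n = len(group)
            series[week] = {
                "pct_meeting_goal": 100 * sum(meets_goal(r) for r in group) / n,
                "pct_zero_participation": 100 * sum(is_excluded(r) for r in group) / n,
            }
        return series

    def rates_by(records, key):
        """Share of children meeting the goal, grouped by a chosen level,
        e.g. rates_by(week_records, key=lambda r: r.classroom)."""
        met, total = defaultdict(int), defaultdict(int)
        for r in records:
            total[key(r)] += 1
            met[key(r)] += meets_goal(r)
        return {k: met[k] / total[k] for k in total}
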
Demonstrates predictive validity

According to the UBC Network's theory of improvement, tracking student participation through a more holistic approach, one focused on engagement in different types of learning activities (synchronous, asynchronous, or onsite), can help school teams prevent educational exclusion (children with 0% weekly participation) and attain the goal of 80% of children participating in three learning activities a week. The student participation data could also function as an on-track indicator: while related outcome data are not yet available, we would expect participation to predict variables such as course completion and academic achievement.
Is minimally burdensome to users

For most teachers, recording participation in the prepared Google Sheet was easy and not very time consuming. However, some teachers were less familiar with Google Apps and needed initial support as they transitioned from recording participation/attendance in the SIGE database to this new tool. Once teachers complete the Tracker/Spreadsheet, weekly data reports are automatically generated and sent to them for review. The process of collecting and documenting data was therefore systematized for teachers and integrated into their daily routines.
Functions within social processes that support improvement culture

The UBC Network had strong structures and processes in place to support peer-to-peer learning around its improvement effort. Monthly meetings were held in each district to share the best pedagogical and leadership practices, facilitated by the district's assigned attendance manager, who was trained by the network in strategies for improving attendance and participation. During these meetings, the attendance managers shared the district participation results, analyzed the data, and gave teams an opportunity to engage in collective sense-making in a safe, low-stakes environment, studying and learning from the participation data so they could improve their outcomes. In addition, representatives from each district met with the hub team to hear from one another and identify lessons learned.
Is reported on in a timely manner

Teachers were expected to log student participation data into the Spreadsheet/Tracker each week. On the following Monday, weekly data reports were made available to school teams, which used the data to evaluate how they were doing and to identify students at risk of educational exclusion. The monthly district meetings, in particular, provided a structure for teams to make timely, data-based decisions that responded directly to the needs of their local contexts.

Question on Practical Measures Inspired by the Participation Measure

How do you maintain standardization across the network, while ensuring local needs are sufficiently addressed?
A core improvement principle holds that "variation in performance is the core problem to address." Addressing it requires centering data, elevating different voices, and being disciplined about standard work processes so that a change idea can be carried out reliably by different people working under different conditions. Practice-based evidence should inform those standard work processes.

Across the UBC Network, 14 districts shared this participation measure and data collection tool. What was unique about this process was that the network did not assign learning activities to participating districts. Instead, it empowered schools to select the activities most suitable for their students given the available resources and the unique circumstances of the communities they served, and provided guidance as needed to ensure quality learning experiences. To support data collection and use of the participation measure, the network did define key terms related to participation (synchronous, asynchronous, onsite) so that data were recorded consistently across sites. This consistency also allowed network leaders to analyze participation data across districts and build processes to support the overall improvement effort.
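As an illustration, that shared vocabulary could be encoded as a fixed set of values so every site records the same categories. The enum below is a hypothetical sketch based on the activity types named above, not the network's actual data dictionary.

    from enum import Enum

    class ActivityType(Enum):
        """Network-defined participation categories: sites choose their own
        activities but record them under the same standardized terms."""
        SYNCHRONOUS = "synchronous"    # live, real-time session
        ASYNCHRONOUS = "asynchronous"  # self-paced materials or tasks
        ONSITE = "onsite"              # in-person activity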

