Developing a Data Program on a Limited Budget

Brilla.PNG

Brilla College Prep, located in New York City, opened in August 2013 with 200 kindergartners and first graders; today, it serves more than 1,300 children through 8th grade across five campuses. As the network grew, the Brilla leadership team faced a challenge familiar to many expanding schools: building a data program on a limited budget. While data is critical to a school’s growth and performance, a growing school can find it difficult to build data capacity without diverting other critical resources. Setting up data systems, synthesizing multiple data sources, and getting actionable insights to all stakeholders require a significant investment of time and money.


The Challenge

Brilla’s leadership team understood the value of a strong data program. To make decisions, school leadership wanted to draw on multiple assessments and blended learning platforms while simultaneously tracking non-academic metrics like attendance and discipline. Unfortunately, the built-in visualization tools in their existing systems were often inadequate for customizing information to the standards of a high-performing academic team.

As a result of these limitations, one of the school’s most important goals was to create a single-page dashboard that could be used to track and discuss performance with multiple stakeholders. The Chief Academic Officer, Data and Assessment Leads, and Operations Managers all needed a common data source to discuss the performance of different student groups. This was challenging for a few reasons:

  • Content and Design: Stakeholders across the school had different data needs. The academic, operations, and senior leadership teams needed to see performance indicators at different levels of detail, and it is difficult for a single page of data to meet these divergent requirements. 

  • Data systems: Assessment platforms, student information systems, and internal school tracking tools all have different export formats. To develop a dashboard that draws on all these data sources, their output must be transformed into a common, analysis-friendly format. 

  • Adaptability: The reporting needs of various stakeholders can change quickly, and different types of data matter more at different times of the school year. Any dashboard needed to be flexibly structured so it could surface the most important data for decision-making at a given point. 

To address these challenges, Brilla partnered with EdOps in late 2017. EdOps began building a flexible dashboard and data system that could not only provide actionable insights to the organization but also adapt to the school’s needs as it grew. Together, we created a weekly dashboard for each school and grade level combination, which served as a common touchpoint for the academic and operations teams. (See an example of those reports in Exhibit 1 and Exhibit 2 below.) The system we created was flexible enough to respond to the school’s evolving needs. For example, as Brilla transitioned to remote instruction due to COVID-19, we were quickly able to incorporate new data sources and collection methods to create a dashboard for senior leadership that tracked critical student engagement metrics. (See Exhibit 3 below.) Our system also made it easy to roll up these data points into high-level summary dashboards for the school’s board. All of this was accomplished through a collaborative process that combined Brilla’s knowledge of their students and EdOps’s expertise in data analysis and best practices. 


Our Solution

I. Content and Design

Brilla had developed robust internal performance goals focused on outperforming its peers, the city and the state. Through our regular check-in meetings with the school, we determined which metrics were the most important to track on a weekly and monthly basis. From there, we identified appropriate targets on each of the multiple assessments and blended learning platforms that were used to assess student performance and progress towards those larger goals.

The three key performance areas, and the components tracked within each, were:

  • Attendance: Weekly attendance, YTD attendance trends, tardies, chronic absentees and a list of student-based action steps for staff (For example, if a student is absent for more than 15 days in a year, the action step for front office staff is to set up a parent meeting.)

  • Behavior: Major and minor behavioral offenses, with location and type of offense, and lists of students with repeat offenses

  • Academics: Skills/standard mastery, highest/lowest performing standards, activities completed/time spent on blended learning platforms

The data philosophy that we bring to our partnerships is that the best chart is one people can understand and explain to others. We iterated on several drafts with the school to see which data views resonated with Brilla staff. We also created a “watch list” of individual students for the school to focus on. This connected abstract data points to specific students and helped to drive teacher action. Knowing that 50% of their students struggled on a certain exam wasn’t as valuable as seeing the names of the students they taught that day. We iterated on what criteria made the most sense for the watch list, ultimately settling on attendance, discipline, and assessments.

Based on our discussions, we finalized a design that provided school stakeholders with an overview of major trends across attendance, behavior and academics, as well as which classrooms, subgroups and students required additional focus.

Exhibit 1.PNG

Exhibit 1. Weekly dashboard used by different school teams to track Student Attendance, Academic Performance and Behavior trends by grades and school.

Finally, we worked with the school to align subsequent action steps with established school policies when performance dipped below critical thresholds. For example, the attendance watch list thresholds correspond to thresholds in Brilla’s truancy policy: if a student is absent for 5-7 days, a teacher must call the parent; if the student is absent for more than 15 days, the Assistant Principal must schedule a meeting with the family. The dashboard allowed the school team to quickly see which actions were needed for which students. 
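The threshold-to-action logic described above can be sketched in a few lines. This is an illustrative Python sketch, not the school's actual implementation (which lived in Google Sheets); the function name, the handling of absence counts between the stated thresholds, and the sample roster are all assumptions.

```python
# Illustrative sketch: mapping year-to-date absences to the action steps
# described in Brilla's truancy policy. Counts outside the stated 5-7 and
# >15 ranges are treated as requiring no action, which is a simplification.

def attendance_action(absences):
    """Return the staff action step for a student's YTD absence count."""
    if absences > 15:
        return "Assistant Principal schedules a family meeting"
    if 5 <= absences <= 7:
        return "Teacher calls the parent"
    return "No action required"

# Build a simple watch list from (student, absences) pairs (sample data).
roster = [("Student A", 4), ("Student B", 6), ("Student C", 16)]
watch_list = [(name, n, attendance_action(n))
              for name, n in roster
              if attendance_action(n) != "No action required"]
```

Encoding the policy as a function like this is what lets a dashboard color-code each student by the action owed, rather than leaving staff to re-derive it from raw absence counts.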

Exhibit 2.PNG

Exhibit 2. Weekly Watchlists to track individual students on three key indicators (Attendance, Academic Performance, Behavior) color-coded by action items for staff. This dashboard allowed school teams to identify students and take corrective actions.

The dashboard has served as a regular pulse check on critical administrative and academic performance indicators. For the academic teams, it helped the school identify class performance down to the standard level, channeled information to the instructional teams for data-driven lesson planning, and helped teachers identify low-performing areas and skills. It helped teachers decide whether they needed to adjust their instructional patterns for a given class and whether they should reteach a standard. This information also helped content leads design effective assessments that test students on low-performing skills and standards and check whether their instruction is working. It also helped front office teams communicate with families about student truancies and behavioral issues.

EdOps Insights: 

Platform limitations: In most platforms, it is difficult to easily locate and use longitudinal data. Without data manipulation skills, teachers and leadership often find themselves comparing multiple PDF exports to understand how students performed on the same content compared to previous years or quarters. Our goal in developing our own reports is to build on what an assessment platform does well and point school leadership to the most critical places to spend their time.

Percents vs. numbers: Data people love percentages for making comparisons between groups of different sizes! However, we often find that numbers of students can be much more relatable to school staff in certain contexts. For example, in Exhibit 1 our first iteration reported the percentage of students who were tardy in each homeroom. While reporting that 23% of students were tardy to class allows a fair comparison between homerooms, ultimately it didn’t help the teachers of those homerooms understand the impact of tardies on their class. By working with teachers, we found that sharing that 8 students were tardy one or more days that week was much more relatable. 
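The same aggregation can produce both views side by side, which is how the trade-off above becomes concrete. A minimal pandas sketch, with assumed column names and sample data (the actual reports were built in Google Sheets):

```python
# Illustrative sketch: per-homeroom tardies reported both as a percentage
# (fair across rooms of different sizes) and as a raw student count
# (more actionable for the room's teacher). Schema is an assumption.
import pandas as pd

tardies = pd.DataFrame({
    "homeroom":   ["K-A", "K-A", "K-B", "K-B", "K-B"],
    "student":    ["s1", "s2", "s3", "s4", "s5"],
    "days_tardy": [2, 0, 1, 1, 0],
})

by_room = tardies.groupby("homeroom").agg(
    students=("student", "count"),
    tardy_students=("days_tardy", lambda d: (d > 0).sum()),
)
by_room["pct_tardy"] = 100 * by_room["tardy_students"] / by_room["students"]
```

Keeping both columns lets leadership compare homerooms on `pct_tardy` while the teacher-facing view surfaces `tardy_students`.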

Longitudinal performance tracking: Tracking performance relative to key benchmarks is important, but understanding how students performed relative to their previous performance can be critical. Brilla conducted a Friday assessment each week that tested specific standards, providing insight into how well students tested on content taught that week. Through conversations with the school, we realized it was important to know how well students had performed on a standard previously in order to put the new results in context. If 60% of students mastered a standard, that could represent a decline in understanding, or a massive improvement if few students had understood the content previously. We were able to use our technical abilities to aggregate prior standard performance across all previous assessments and add this insight for the school.  
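The aggregation described here amounts to comparing the latest assessment against the average of all prior results for the same standard. A hedged pandas sketch with an assumed schema and invented sample numbers:

```python
# Illustrative sketch: put this week's standard mastery in context by
# averaging that standard's results across all previous assessments.
# Dates are ISO strings, so lexicographic comparison orders them correctly.
import pandas as pd

history = pd.DataFrame({
    "assessment_date": ["2019-09-06", "2019-09-13", "2019-09-20"],
    "standard":        ["3.NBT.2", "3.NBT.2", "3.NBT.2"],
    "pct_mastery":     [35, 48, 60],
})

this_week = history["assessment_date"].max()
prior = history[history["assessment_date"] < this_week]
prior_avg = prior.groupby("standard")["pct_mastery"].mean()

current = history[history["assessment_date"] == this_week].set_index("standard")
change = current["pct_mastery"] - prior_avg
# 60% mastery reads very differently next to a 41.5% prior average.
```

The point of the `change` column is exactly the insight above: the same 60% is a decline against a strong history and a breakthrough against a weak one.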

Metric sensitivity: Another discussion we had was around how closely metrics move with student performance. The middle school wanted to track aggregate GPA to prepare students for seeing this metric in high school. However, grades change slowly, and an aggregate like GPA changes even more slowly. As an early warning metric of student performance, it did not provide sufficiently actionable information. We switched to tracking whether students achieved mastery on course content, which gave teachers immediate feedback on whether students understood the material. 

II. Data Systems & Technical Development

EdOps’s technical expertise in working with databases and writing complex code allowed us to create flexible systems for producing dashboards that we can readily change when necessary. We were able to create these dashboards every week in just a couple of hours. This shortens the time between when data is collected and when it can be analyzed. It also makes it feasible to pilot new analysis methods or conduct one-time investigations without them being too time intensive. 

The primary challenge facing most schools is that data lives in numerous platforms, and access to data within those platforms varies widely. The gold standard is a data warehouse that seamlessly integrates all of these data sources into an easy-to-use database. Unfortunately, this technology is cost-prohibitive for small and growing schools. Furthermore, it is difficult to justify the investment in data infrastructure without being able to pilot data-driven analysis and see how it works for your school. Using available database access and bulk export features, EdOps gathered raw data from all of Brilla’s data and assessment platforms. We used advanced functionality within Google Sheets to combine and reformat these data sources into a usable format. The report regenerates automatically when the exports are updated, making our process replicable, accurate, and efficient. 
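The combine-and-reformat step can be pictured as follows. This is a deliberately simplified pandas sketch of the general pattern (the real pipeline ran on Google Sheets formulas); the platform names, column names, and sample rows are all invented for illustration.

```python
# Illustrative sketch: each platform exports with its own column names, so
# the first job is standardizing the student identifier, then merging the
# exports into one analysis-friendly table.
import pandas as pd

# Hypothetical student information system export.
sis = pd.DataFrame({"student_id": [101, 102],
                    "grade": ["K", "1"],
                    "days_absent": [3, 16]})

# Hypothetical assessment platform export with a differently named ID column.
assessment = pd.DataFrame({"StudentID": [101, 102],
                           "pct_mastery": [72, 55]})

# Normalize the identifier before merging.
assessment = assessment.rename(columns={"StudentID": "student_id"})

# Left-join so every enrolled student appears even without assessment data.
combined = sis.merge(assessment, on="student_id", how="left")
```

The left join is the design choice that matters: the roster from the student information system stays authoritative, and missing platform data shows up as gaps to investigate rather than silently dropped students.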

EdOps Insights:

Siloed information and analysis: Most school teams have some staff proficient in spreadsheets through Excel or Google Sheets. When we start working with a school, we often find a plethora of these workbooks set up to track data. The many tabs and pivot tables contain great insights into the school, but staff face a few challenges. First, the reports are not set up to be replicated, so refreshing the analysis when new data comes out is time-consuming. Second, it is difficult to share the data with stakeholders in a relevant way. In addition to data security concerns, processing student-level data into both a high-level trend analysis for leadership and a detailed student-level analysis for teachers is very difficult with classic spreadsheet methods. Finally, merging in demographic data from other sources to examine student subgroups or compare performance across multiple metrics requires additional skill. It is prohibitively difficult for a school with intermediate spreadsheet skills to overcome these challenges every week and efficiently produce an insightful report.   

III. Adaptability

Any dashboard we developed would need to be adaptable to the changing needs of the school. This need was readily apparent once the COVID-19 pandemic hit. The shift to remote instruction was a huge shock to the system. For schools like Brilla that previously used blended learning platforms as a supplement, this technology became the core of their instructional program. In addition, new tracking methods for reading logs and parent contacts were needed to monitor the academic and emotional health of students and families. 

Academically, we were quickly able to use our existing exports from blended learning platforms to provide additional analysis in weekly dashboards (see Exhibit 3 below). However, one new vendor had great resources for teachers but provided administrators neither an overview of platform usage and performance nor a bulk export feature for high-level analysis. EdOps worked closely with this vendor to develop a custom export for the school. We were then able to incorporate these progress metrics into our weekly reporting, saving teachers from manually reporting data and administrators from combing through hundreds of PDF exports to get the level of insight they were used to. 

Exhibit 3.PNG

Exhibit 3. Network dashboard designed during the pandemic to track student engagement, emotional health, program usage, and performance. School leadership teams used this report to gauge the effectiveness of the programs newly introduced during online instruction.

Without students in the building, it was critical to develop systems for engaging with students in the new remote environment. Brilla started logging interactions with families and collecting reading logs to monitor student progress at home. Any data entered by hand by parents or teachers tends to be messy. For example, if a survey asks for a student’s name, the name will rarely be entered in a consistent format, and spreadsheet formulas cannot recognize these entries as the same person, causing all sorts of havoc during analysis. Yet asking families to scroll through hundreds of names in a dropdown to find themselves creates a barrier for families already under stress, and asking teachers to enter the same data repeatedly wastes their time. EdOps worked with Brilla to use the entry methods that worked best for teachers and families and fold this new unstructured data source into the weekly analysis. 
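One common way to reconcile free-text names against an official roster is approximate string matching. A minimal sketch using Python's standard-library `difflib`; the cleaning rules, the 0.8 cutoff, and the sample roster are assumptions, and a real pipeline would route ambiguous matches to manual review rather than trusting the algorithm outright.

```python
# Illustrative sketch: match messy survey name entries (stray spaces,
# inconsistent capitalization, small typos) to the official roster.
from difflib import get_close_matches

roster = ["Maria Gonzalez", "Jayden Smith", "Aaliyah Johnson"]

def match_student(raw_name, roster):
    """Return the best roster match for a messy survey entry, or None."""
    # Collapse whitespace and normalize capitalization first.
    cleaned = " ".join(raw_name.strip().split()).title()
    # Then accept only reasonably close fuzzy matches (cutoff is a guess).
    matches = get_close_matches(cleaned, roster, n=1, cutoff=0.8)
    return matches[0] if matches else None
```

Normalizing before fuzzy matching matters: simple case and whitespace fixes resolve most entries exactly, so the fuzzy step only has to absorb genuine typos.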


The Results

The effectiveness of data-driven instruction shows not only in Brilla’s student performance, which has outperformed local district, charter, and city averages (see SY18-19 results below), but also in the school’s overall data culture.

Exhibit 4.PNG

Data dashboards have helped create a culture of accountability in which data is put at the forefront of decisions impacting student performance. The work has also helped develop data structures and standardized data formats across all five campuses, making comparisons more effective. None of this would have been possible without committed school leadership and a willingness to invest in and implement a strong data-driven culture.


Key Outcomes

  • Compact, Actionable Performance Reporting: EdOps used advanced coding skills to merge different datasets and create a holistic data dashboard for multiple stakeholders, providing an overview of class and school performance while highlighting action steps. 

  • Standardized Data Storage Format: EdOps shared best practices around data storage formats across all five campuses to improve data flow. As a result, the majority of student data collection tools now follow a standard reporting and storage structure, making the data easier to use for compliance and analysis.

  • Customized Data Views: EdOps’s performance tracking solution included dynamic data dashboards that met the varying data needs of school leadership while incorporating new performance indicators for staff and students. Customizable dashboards include teacher performance and family satisfaction surveys. 

  • Effective Assessments for Learning: Looking closely at academic performance at the standards level led to discussions around developing effective assessments aligned with student needs and testing the right skills at the right time of year.

  • Improved Data Ecosystem: Increased visibility into performance metrics led to greater staff engagement in understanding the reasons behind the numbers. This has created a culture of accountability and of critically analyzing program effectiveness. 

Dan Theisen