For Horizons at Greens Farms Academy (HGFA), New Beginnings Family Academy (NBFA), and Bridgeport Public Schools (BPS), the three investees that have partnered with TFF since 2013, the quantitative analyses found some positive trends over the investment period in student outcomes (for example, attendance, exclusionary discipline, academic achievement, and SEL skills) and organizational outcomes (for example, organizational capacity and SEL implementation). However, the TFF investment was only one of many factors affecting investees over this period, so it is impossible to determine whether these positive trends resulted from the TFF investment.

For BPS, the case study team examined outcomes at both the district and school levels. BPS schools vary widely on many metrics, including school size, staffing, leadership, academic and behavioral outcomes, and SEL implementation, but this variation is obscured when data are presented at the district level. The team expected that examining the association between school-level outcome data and school-level data on SEL implementation would reveal a positive link between these two factors. However, school-level analyses did not confirm this hypothesis. Given some limitations in the data, it is difficult to determine whether these findings indicate that no association exists or that the measures used were too imprecise to detect effects.

Quantitative analyses highlighted several considerations that might be relevant to other foundations and investees that are embarking on a similar journey:

Be clear from the outset about the strengths and limitations of your evaluation design: The design of an evaluation affects the conclusions stakeholders can draw from it. If the goal of an evaluation is to draw causal conclusions about the effectiveness of introduced practices, it is important to incorporate methods that can determine causality, such as randomly assigning participants to “intervention” and control conditions. On the other hand, if the goal is to use evaluation to continuously improve practice, it is more important to collect data across multiple timepoints in ways that can be quickly fed back to practitioners so they can act on the data. Lay the groundwork for evaluation based on what the foundation and its investees want to learn and what resources are available for evaluation.

Carefully consider the goal(s) for data use before collecting data: Collecting new data can be costly, especially when new measures must be developed, and existing data may provide meaningful insight at low cost. When preparing to collect new data, consider the feasibility of the measures (how cumbersome they are for investees to use), the timeliness with which data will be accessible (how much time and energy will be required to make raw data interpretable), and the usability of the data for driving decision-making (how clearly the data inform action).

Maintain consistent measures over time where possible: Regardless of the goal of data use, assessing change in a measure requires that the measure be comparable across timepoints. If concerns arise about a measure in use, carefully weigh the pros and cons before making a change. Will the new measure provide substantially more useful data? Does the improvement warrant the loss of comparability? Over time, investees may build the capacity to assess the usefulness of the measures they have adopted, and they may decide to make changes to those measures. For example, after the first year of the BPS SEL implementation survey, the district and its partners worked together to reduce the survey’s length by eliminating items that did not produce actionable data, thus reducing the burden on educators who completed the survey. Changes in measures, while not ideal from an evaluation perspective, sometimes demonstrate increasing sophistication among investees in understanding and using data.

Be realistic about investees’ capacity to collect, process, and use data: Investees can differ widely in their capacity to acquire and use data. Investees with little data infrastructure will need more support to build their capacity before they can collect or use meaningful data; this capacity can be built within an investee’s organization or through partnerships. Investees with substantial data capacity can move more quickly toward making the most of their existing data and/or collecting new data. To establish and sustain data use practices, it is critical to meet investees where they are.

The quantitative and qualitative approaches used for this report have different and complementary strengths and limitations. The broader reach of the quantitative approaches allows for greater generalization of the evaluation findings, but those findings are less detailed. The more focused and detailed qualitative approach limits how broadly one can apply the findings but provides a richness of perspective that is difficult to achieve with quantitative approaches. Together, these two approaches provide a more complete picture of TFF’s sustained investment in Bridgeport.

Much progress and many meaningful outcomes are, by necessity, omitted from this study. The authors encourage readers to explore the Investee Journeys and Outcomes section of this report and visit the websites of each investee to learn more about their important work with the children of Bridgeport.