Quantitative analyses highlighted several considerations that might be relevant to other foundations and investees embarking on a similar journey:
Be clear from the outset about the strengths and limitations of your evaluation design: The design of an evaluation affects the conclusions stakeholders can draw from it. If the goal of an evaluation is to draw causal conclusions about the effectiveness of introduced practices, it is important to incorporate methods that can determine causality, such as randomly assigning participants to "intervention" and control conditions. If, on the other hand, the goal is to use evaluation to continuously improve practice, it is more important to collect data across multiple timepoints in ways that can be quickly fed back to practitioners so they can act on the data. Lay the groundwork based on what the foundation and its investees want to learn and what resources are available for evaluation.
Carefully consider the goal(s) for data use before collecting data: Collecting new data can be costly, especially when new measures must be developed, and existing data may provide meaningful insight at low cost. When preparing to collect new data, consider the feasibility of using the measures (how cumbersome they are for investees to use), the timeliness with which data will be accessible (how much time and energy will be required to make raw data interpretable), and the usability of the data for driving decision-making (how clearly the data inform action).
Maintain consistent measures over time where possible: Regardless of the goal of data use, to assess change in a measure, the measure must be comparable across timepoints. If concerns arise about a measure that is being used, carefully weigh the pros and cons before making a change. Will the new measure provide substantially more useful data? Does the improvement warrant the loss of comparability? Over time, investees may build the capacity to assess the usefulness of the measures they have adopted, and they may decide to make changes to those measures. For example, after the first year of the BPS SEL implementation survey, the district and its partners worked together to reduce the survey's length by eliminating items that did not produce actionable data, thus reducing the burden on educators who completed the survey. Changes in measures, while not ideal from an evaluation perspective, sometimes demonstrate investees' increasing sophistication in understanding and using data.
Be realistic about investees' capacity to collect, process, and use data: Investees can differ widely in their capacity to acquire and use data. Investees with little data infrastructure will need more support in building their capacity before they can collect or use meaningful data. This capacity can be built within an investee's organization or through establishing partnerships. Investees with substantial data capacity can move more quickly toward making the most of their existing data and/or collecting new data. To establish and sustain data use practices, it is critical to meet investees where they are.