Horizons is a national nonprofit that partners with public, independent, and charter schools, colleges, and universities (host institutions) to create out-of-school academic, enrichment, and social and emotional learning programs for students from pre-K through high school. Horizons’ hallmark is an intensive six-week summer learning and enrichment program, and affiliates also offer programming throughout the school year. In 2019, Horizons Bridgeport, a regional network of Bridgeport-based Horizons programs, was formed in partnership with its affiliated sites: Horizons at Greens Farms Academy, Horizons at Sacred Heart University, and Horizons at Notre Dame High School. Horizons Bridgeport aims to expand capacity by opening new programs and expanding current programs. Over the next several years, Horizons Bridgeport plans to serve over 1,000 Bridgeport students; increase and diversify funding to support program growth; identify and support best practices to strengthen program quality; strengthen community relationships; and establish regional infrastructure and governance based on advancing equity.

Progress Snapshot

TFF Investment Total To Date (2013-2021): $1.3M
(HGFA received $643,000 in support from 2013-2018)

TFF’s support to Horizons programs began when Horizons at Greens Farms Academy (HGFA) was selected for its first investee cohort. After convening community stakeholders to assess need in 2018, TFF helped to establish Horizons Bridgeport (HB) in an effort to consolidate governance, growth, administrative, and fundraising infrastructure for the Bridgeport affiliates and ultimately achieve greater impact for students, families, and the community. TFF extended support to two additional HB sites: Horizons at Sacred Heart University (HSHU) and Horizons at Notre Dame High School (HND). HB has begun its work to fortify each affiliate site; for example, efforts are underway to more firmly establish a feeder school pattern with TFF investees Bridgeport Public Schools (BPS), Catholic Academy of Bridgeport (CAB), and New Beginnings Family Academy (NBFA), as well as other Bridgeport schools.

One of the major benefits of Horizons Bridgeport is the opportunity for more intentional cross-site learning around SEL Kernels, a group of targeted strategies focused on the development of social and emotional skills, developed by Harvard Graduate School of Education’s (HGSE) EASEL Lab. The Lab began working with HGFA in 2015, and the partnership expanded over time, continuing through the shift to pandemic virtual learning. Laura Stickle, EASEL Lab’s project coordinator, works with HGFA, HSHU, and HND to support implementation and assessment of the Kernels. Using a train-the-trainer model, Stickle has helped each site integrate SEL Kernels into academic instruction in context-specific ways.


SEL Kernels aim to tackle the challenge of implementing prescriptive SEL programs and curricula that are, in some cases, time-intensive and difficult to adapt to different cultures and contexts. To develop the Kernels, the EASEL Lab analyzed more than 50 evidence-based SEL programs spanning early childhood, elementary, middle, and high school. They then coded the specific skills, instructional strategies, and practices (e.g., storytelling, music, discussion, and games) found in the programs. The common practices identified across the programs became the foundation of the SEL Kernels. Kernels target knowledge, skills, and competencies within five broad domains of SEL: cognitive skills, emotion processes, social skills, character/values, and mindsets. Comprehensively profiled by the education-sector innovation clearinghouse HundrED, SEL Kernels can be accessed for free on Greater Good in Education’s site.

Joe Aleardi, Executive Director of Horizons Bridgeport and former Executive Director of HGFA, comes from a family of educators but earned his JD and an MBA before beginning his career in law, banking, and technology; he then transitioned to classroom teaching with Teach for America and eventually became a Horizons educator. Bringing this constellation of experiences to the role, Aleardi applies a mission-driven, business-like mindset to the work of mitigating summer learning loss and boosting social and emotional skills for the children Horizons Bridgeport sites serve. Aleardi described some of the ways TFF’s investment helped to change outcomes for students at Horizons at Greens Farms Academy: “The focus on SEL has changed the culture… the classroom climate has improved immeasurably.” Aleardi shared that as the SEL Kernels were implemented, teachers experienced the students as more engaged, and classroom behavioral challenges decreased. HGFA tracks student progress on a variety of outcomes, including SEL skills at the beginning and end of its summer program, as measured by the SEL Benchmarks assessment (see Appendix), also developed by the EASEL Lab.

Aleardi credits Horizons’ focus on SEL with multiple positive short- and long-term outcomes:

“In addition to giving them more social and emotional skills to hopefully do better down the road in life… it makes [Horizons] a more fun place to be, I think. It is more welcoming, friendly, and familial, and because of that, more academic work gets done. And it shows in STAR results [academic assessment], attendance, and retention. [Improvements in] our SEL outcomes were statistically significant every year.”

Joe Aleardi, Horizons Bridgeport Executive Director

Starting in 2016, Aleardi and the HGFA team, including Ashley Nechaev, who served as a classroom educator, literacy specialist, and middle school academic director over her six-year tenure at HGFA, worked with the EASEL Lab team to pilot the SEL Kernels and train the program’s summer educators in the strategies. Using detailed implementation and feedback data, they worked to revise both the training and a resources binder provided to each educator. The continuous improvement approach, in which educators could see their insights reflected in revised materials, generated buy-in among the staff, resulting in a strong commitment to implementation over the years.

Nechaev subsequently moved to HSHU, where she served as academic director and now serves as executive director. There she has spearheaded the program’s adoption of a trauma-informed, restorative approach to implementing SEL. In addition to hiring a psychotherapist to provide mental health training and services to educators and students, the program initiated a small-group approach to the Kernels, allowing children who need more intensive social and emotional support to receive additional, differentiated instruction in the skills and competencies that Kernels promote. Small-group Kernels data is collected and reviewed to assess the effectiveness of instruction. Nechaev also has a passion for using data to drive decision-making, which, she says, syncs with TFF’s performance management focus. As at HGFA, weekly implementation surveys and pre- and post-summer SEL Benchmarks assessments measuring individual student SEL growth are built into HSHU’s program. While academic data is important to Horizons and its programs have been shown to effectively prevent summer learning loss,[1] student self-reported improvement in affect toward reading (feelings of competence and enjoyment) is an important success indicator used by the program.

Ashley Nechaev, Horizons at Sacred Heart University Executive Director

Along with other program leaders, Nechaev has championed Diversity, Equity, and Inclusion (DEI) at HSHU, including building capacity to use culturally responsive, anti-racist practices and honoring the validity of vernacular used by many students from diverse backgrounds. Nechaev asserts that SEL competencies are essential to engaging in respectful and productive conversations about DEI and trauma. Describing the program’s ability and responsibility to promote justice and equity, Nechaev asks: “If it doesn’t happen here [at Horizons programs], where will it happen?”

In summer 2021, HSHU collected feedback from families about the impact of the program overall and of the SEL curriculum specifically; the feedback was positive. HSHU is also concerned about staff well-being, and to monitor the effectiveness of the trauma-informed lens it applies to its programming, the program trialed a professional quality of life measure, the ProQOL survey, with its educators. ProQOL assesses compassion fatigue, burnout, and secondary trauma among those in helping professions, such as teachers and medical professionals. Unfortunately, rising COVID-19 cases forced HSHU to transition to a virtual program after just four weeks of its six-week summer 2021 program, so data collection was stymied. Overall, Horizons Bridgeport sites continue to build capacity and sophistication in implementing assessments, collecting data, and using that data effectively to manage program performance and improve outcomes for students and the teaching experience for educators.

“Culturally responsive teaching is using the cultural knowledge, prior experiences, and performance styles of diverse students to make learning more appropriate and effective for them; it teaches to and through the strengths of students.”

- Geneva Gay,

Preparing for Culturally Responsive Teaching

Outcome Data Analysis

The HGFA program offers Saturday academies in the fall and spring semesters, along with an intensive summer program. Students apply to the program as preschoolers, and HGFA aims to work with these students from kindergarten through grade 12 and, in more recent years, through college. Fig. 17 presents data provided by HGFA on the percentage of students enrolled in its summer program. Note that the program operated in a virtual format in the summer of 2020. HGFA did not report attendance rates for the summer program or the Saturday academies.

* HGFA’s summer program operated in a virtual format in 2020.
Source of data: HGFA annual reports/data dashboards

HGFA uses Renaissance Star assessments[2] to report on the reading and math proficiency of rising K-8 students at the beginning and end of its summer program. Specifically, HGFA assesses reading proficiency by administering the Star Early Literacy assessment to students entering Kindergarten through grade 1 and the Star Reading assessment to students entering grades 2 through 9. To assess math performance at the beginning and end of the summer program, HGFA administers the Star Math assessment to students entering grades K through 8. Students consistently achieve substantial growth over the summer in both reading (between 2 and 3.5 months of growth) and math (between 3 and 6.4 months of growth). (Fig. 18)

* The Renaissance Star assessments were not administered in summer 2020, when HGFA’s summer program operated in a virtual format.
Source of data: HGFA annual reports / data dashboards

HGFA also uses the Star Reading assessment to track the percentage of students who enter grade 4 reading proficiently; values range from 60% to 70%. In addition to using Star assessments to track reading and math proficiency in grades K through 8, HGFA tracks the percentage of students who enter grade 9 on time; this percentage has been 100% every year between 2013-14 and 2020-21. (Fig. 19)

* The Renaissance Star assessments were not administered in summer 2020, when the program operated in a virtual format.
Source of data: HGFA annual reports / data dashboards

HGFA uses SAT scores to track the reading and math proficiency of its high school students. Specifically, HGFA tracks the percentage of high school students meeting the College Board’s College and Career Readiness Benchmarks, which are defined as scoring at or above 480 on the Reading section of the SAT and at or above 530 on the Math section of the SAT. The percentage of students meeting these standards grew from 2018-19 to 2019-20, but it dropped in 2020-21, the first full school year of the pandemic. (Fig. 20)

*HGFA reported that 2020-2021 SAT participation was lower than in previous years, in part because of reduced ability to prepare for the test and in part due to the decision of many U.S. colleges not to require SAT or ACT scores for fall 2021 admission.
Source of data: HGFA annual reports / data dashboards
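As a simple illustration of how a benchmark-attainment figure like the one in Fig. 20 can be tallied, the sketch below counts the share of tested students scoring at or above both cut points. The student scores are hypothetical, and this is not HGFA’s actual reporting pipeline.

```python
# Minimal sketch (not HGFA's reporting pipeline): share of tested students
# meeting both College and Career Readiness benchmark cut points.

READING_BENCHMARK = 480  # SAT Reading section cut point cited in this report
MATH_BENCHMARK = 530     # SAT Math section cut point

# Hypothetical (reading_score, math_score) pairs for each tested student
scores = [(500, 560), (470, 540), (490, 520), (530, 600)]

meets_both = [r >= READING_BENCHMARK and m >= MATH_BENCHMARK for r, m in scores]
pct_meeting = 100 * sum(meets_both) / len(scores)
print(f"{pct_meeting:.0f}% of tested students met both benchmarks")  # 50%
```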

HGFA also tracks the percentage of students who graduate from high school on time; this percentage has been 100% for 7 of 8 years between 2013-14 and 2020-21. (Fig. 21)

Source of data: HGFA annual reports / data dashboards

After administering the Child Trends social and emotional skills assessment periodically throughout the year in 2014-15, 2015-16, and 2016-17, HGFA transitioned in 2017-2018 to administering the SEL Benchmarks assessment during the summer. With the exception of 2020, when its summer program operated virtually, HGFA has administered the SEL Benchmarks assessment to rising K-9 students at the beginning and end of the summer program for the past four years. The EASEL lab at Harvard Graduate School of Education developed this teacher-report assessment and partners with HGFA to administer the assessment and analyze the results. The current version of the SEL Benchmarks assessment includes 13 items for grades K-5 and 21 items for grades 6-8; each item focuses on a specific skill, such as expressing feelings effectively, using appropriate conflict resolution strategies, or attending to the task at hand. When completing the assessment, teachers record how often they see each child demonstrate each of the SEL skills on a 4-point scale of 1 = Never; 2 = Rarely; 3 = Sometimes; and 4 = Always. Two teachers complete the assessment for each child, and the two scores are averaged.
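To make the scoring procedure concrete, the sketch below shows how a single student’s composite score could be computed from two teachers’ item ratings. The data layout and ratings are hypothetical; this illustrates the averaging described above, not the EASEL Lab’s actual scoring code.

```python
# Minimal sketch, assuming a simple data layout (not the EASEL Lab's scoring
# code): each teacher rates every SEL Benchmarks item on a 1-4 scale
# (1 = Never ... 4 = Always); the two teachers' ratings are averaged per item,
# and the student's composite score is the mean of those item-level averages.

def item_scores(teacher_a: list[int], teacher_b: list[int]) -> list[float]:
    """Average the two teachers' ratings for each item."""
    return [(a + b) / 2 for a, b in zip(teacher_a, teacher_b)]

def composite(teacher_a: list[int], teacher_b: list[int]) -> float:
    """Composite score = mean of the item-level averaged ratings."""
    items = item_scores(teacher_a, teacher_b)
    return sum(items) / len(items)

# Hypothetical ratings for one K-5 student on the 13-item form
ratings_a = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 4, 3, 3]
ratings_b = [4, 4, 3, 3, 3, 3, 4, 4, 3, 3, 4, 2, 3]
print(round(composite(ratings_a, ratings_b), 2))  # 3.23
```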

Each year, the EASEL lab reports SEL Benchmarks assessment data to HGFA in several ways. For example, EASEL reports average item scores by grade level and overall (across all students), which allows for comparison among grade-level cohorts and across skills. EASEL also reports average composite scores by grade level and for the entire group; EASEL calculates a student’s composite score by averaging their scores across all items. Fig. 22 shows the average beginning-of-summer and end-of-summer composite score for 2017 through 2021; average composite scores for HGFA students increased over the course of each summer. When the EASEL lab statistically compared the beginning-of-summer and end-of-summer average composite scores from the SEL Benchmarks, they saw a statistical increase[3] in scores each summer. The graph also shows that after rising over the course of each summer, average SEL Benchmarks composite scores are lower at the start of the next summer. Caution must be used when considering these results for four reasons:

Source of data: EASEL Lab at Harvard Graduate School of Education
  1. Familiarity: Teachers know their students better at the end of the summer, compared to the beginning of the summer, which means that teachers may underestimate their students’ skills at the start of the summer. In addition, end-of-summer ratings may be biased by teachers’ desire to see growth in the students with whom they work closely during the summer program.
  2. Comparability: As noted above, the current version of the SEL Benchmarks assessment asks teachers to rate 8 additional skills for students in grades 6-8, compared to students in grades K-5. For this reason, composite scores for these two groups measure a different combination of skills, and it is challenging to calculate comparable average composite scores across all grades. In addition, it is important to remember that while the beginning- and end-of-summer composite scores for a specific year represent the same group of children at both points in time, scores from different years represent different groups of children, as some children move on to the high school program and others enter at the younger grades. This shift in the population assessed may also explain the decrease in average SEL Benchmarks assessment scores between the end of one summer and the start of the next.
  3. Significance: Although EASEL determined that each summer’s increase in average SEL Benchmarks composite scores was statistically significant, it is difficult to determine whether these increases are practically significant. For example, the average composite score increased from 3.34 in June 2021 to 3.58 in August 2021, but it’s challenging to know whether that increase is meaningful in terms of its tangible benefits for students; one common way to gauge this is a standardized effect size (see the sketch after this list).
  4. Variability: Composite scores describe student performance across a group of skills, without providing insight on which skills are stronger and weaker. For that information, stakeholders must consider scores from individual items, whether by grade or overall—information that EASEL also provides to HGFA in the form of matrices, which are not included in this report. Further, average composite scores describe the performance of a group of students across the targeted skills, without describing the variability among students, some of whom may have stronger SEL skills than others.
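Picking up on point 3 above, one common (though not definitive) way to supplement a significance test with a sense of practical magnitude is a standardized effect size. The sketch below computes Cohen’s d for paired pre/post composite scores, i.e., the mean beginning-to-end change divided by the standard deviation of those changes, using hypothetical data; it is not an analysis the EASEL Lab reports.

```python
# Illustrative only: Cohen's d for paired pre/post scores as one way to
# express the practical magnitude of a change (hypothetical data).
import statistics

def paired_cohens_d(pre: list[float], post: list[float]) -> float:
    """Mean pre-to-post change divided by the standard deviation of changes."""
    diffs = [b - a for a, b in zip(pre, post)]
    return statistics.mean(diffs) / statistics.stdev(diffs)

# Hypothetical composite scores for the same five students
pre = [3.1, 3.4, 3.3, 3.5, 3.2]
post = [3.5, 3.6, 3.4, 3.7, 3.6]
print(round(paired_cohens_d(pre, post), 2))  # roughly 1.9 for this toy data
```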

Based on research indicating that students with different levels of social and emotional competence respond differently to SEL programming and supports, EASEL also presents SEL Benchmarks assessment data separately for groups of students with different initial levels of SEL skills. Specifically, EASEL groups students into four quartiles based on their initial SEL Benchmarks composite scores. Quartile 1 includes the 25% of students with the lowest initial SEL Benchmarks composite scores, whereas Quartile 4 includes the 25% of students with the highest initial SEL Benchmarks composite scores. When comparing the beginning- and end-of-summer composite scores for each quartile, EASEL determined that each quartile experienced a statistical increase each summer, with the exception of quartile 4 in 2021 (Fig. 23).[4] In addition, EASEL observed an interesting trend in the beginning- to end-of-summer changes experienced by each quartile: every year, the quartile with the lowest initial composite scores (quartile 1) experienced the largest increase over the course of the summer, quartile 2 experienced the second-largest increase, quartile 3 experienced the third-largest increase, and quartile 4 (students with the highest initial composite scores) experienced the smallest increase. This trend implies that each year, the students with weaker SEL skills at the beginning of the summer experienced more growth between the beginning and end of the summer program.

Source of data: EASEL Lab at Harvard Graduate School of Education
* The SEL Benchmarks assessment was not administered during the summer of 2020, when HGFA’s summer program was virtual.
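The quartile comparison described above can be sketched as follows: sort students by their beginning-of-summer composite score, split them into four roughly equal groups, and compute each group’s average beginning-to-end change. The data and scores below are hypothetical, and this is an illustration rather than EASEL’s actual analysis code.

```python
# Minimal sketch (hypothetical data, not EASEL's analysis code): mean
# beginning-to-end change for quartiles defined by beginning-of-summer scores.
import statistics

def quartile_changes(pre: list[float], post: list[float]) -> list[float]:
    """Mean pre-to-post change for quartiles 1 (lowest pre scores)
    through 4 (highest pre scores)."""
    order = sorted(range(len(pre)), key=lambda i: pre[i])  # sort by pre score
    size = len(order) / 4
    changes = []
    for q in range(4):
        members = order[round(q * size):round((q + 1) * size)]
        changes.append(statistics.mean(post[i] - pre[i] for i in members))
    return changes

# Hypothetical composite scores for eight students
pre = [2.1, 2.5, 2.8, 3.0, 3.2, 3.4, 3.6, 3.8]
post = [2.9, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7]
print([round(c, 2) for c in quartile_changes(pre, post)])  # [0.7, 0.35, 0.15, -0.05]
```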

In addition, EASEL examined changes in SEL Benchmarks composite scores between the earliest administration in June 2017 and the most recent administration in August 2021. To ensure that they were considering the same group of students over time, these analyses included only data from students who were assessed consistently across the four-year period, with quartiles assigned based on students’ composite scores in June 2017. This analysis also revealed a pronounced difference among quartiles: the average composite score increased most for the students with the lowest composite scores in June 2017 (quartile 1), with a smaller increase for quartile 2, a smaller increase still for quartile 3, and the smallest increase for students with the highest composite scores in June 2017 (quartile 4) (Fig. 24).[5]

Source of data: EASEL Lab at Harvard Graduate School of Education
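A minimal sketch of the cohort restriction described above: only students with scores at both the June 2017 and August 2021 administrations are kept before change scores are computed. The student IDs and scores are hypothetical.

```python
# Hypothetical composite scores keyed by student ID (not EASEL's data).
june_2017 = {"s01": 2.4, "s02": 3.1, "s03": 3.6, "s04": 2.9}
aug_2021 = {"s01": 3.3, "s02": 3.4, "s04": 3.5, "s05": 3.0}  # s03 left; s05 is new

# Keep only students assessed at both time points, then compute change scores.
consistent = sorted(june_2017.keys() & aug_2021.keys())
changes = {sid: round(aug_2021[sid] - june_2017[sid], 2) for sid in consistent}
print(changes)  # {'s01': 0.9, 's02': 0.3, 's04': 0.6}
```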

To assess HGFA’s organizational capacity in 2013-14 and 2014-15, members of the TFF staff completed the Organizational Management Capacity Assessment Tool (OMET), a tool developed by David Hunter/Hunter Consulting that external evaluators can use to assess the organizational capacity of an institution over time. HGFA’s overall rating for outcomes-focused management increased from low in 2013-14 to medium in 2014-15, while the other three domains were rated medium in both years (Fig. 25).

Source of data: Tauck Family Foundation

When the OMET became unavailable, TFF shifted in 2015-2016 to the Impact Capacity Assessment Tool (iCAT), a self-assessment developed by Peter York/Algorhythm that collects data from multiple stakeholders within each organization. (Note that sample iCAT items are available in the appendix.) Between 2016 and 2021, HGFA’s iCAT scores generally increased in all four domains, with the exception of a decrease in 2018 (Fig. 26).

Source of data: Tauck Family Foundation

1 Through the Horizons model, students achieve an average of 4 months’ improvement in reading and math over the six-week summer session.

2 The Renaissance Star assessments are computer-adaptive assessments developed by Renaissance Learning, Inc. to allow educators to assess students for screening, progress monitoring, and targeted instruction. For more information on the Renaissance Star assessments, including their development and relevant research studies, visit https://www.renaissance.com/products/star-assessments/evidence/. Horizons National, the parent organization of Horizons at Greens Farms Academy and other Horizons programs in Connecticut and beyond, requires that all Horizons programs use Renaissance Star assessments to track student progress.

3 In other words, when the EASEL Lab at Harvard Graduate School of Education analyzed SEL Benchmarks assessment data, they detected a statistically significant increase in composite scores between the beginning and end of the summer program. Because the term statistically significant is often equated erroneously with the word significant in the everyday sense of the word (meaningful), this case study describes changes in scores as statistical differences, rather than statistically significant differences. More information on statistical significance is available in the Quantitative Analysis Technical Details appendix.

4 In other words, every quartile experienced a statistically significant increase each summer, with the exception of quartile 4 in 2021, which saw a small decrease that was not statistically significant.

5 The change between June 2017 and August 2021 was statistically significant for each quartile.