Community Schools: Models and Approaches
Al Passarella, Research and Policy Analyst
Alanna Bjorklund-Young, Senior Research and Policy Analyst
It is a sad truism that students from low-income families face daunting environmental barriers, both in and out of school, to academic success. Such factors may include housing insecurity, household unemployment, transportation difficulties, untreated health conditions, and unaddressed emotional and psychological issues (Rothstein, 2010). The premise behind community schools is that schools themselves should help to provide a safety net in these areas, either by offering additional services or by connecting students to larger networks of social support. All the various models of community schools seek to mediate intervening factors, prevent student disengagement, and support student success.
Barriers to Success In and Out of School
The schooling experience is a key determinant not only of academic outcomes but of lifelong attitudes about learning (M. Walsh, Gish, Foley, Theodorakakis, & Rene, 2016). Creating a positive experience often means supporting student development in ways that go beyond a focus on academic issues. This is particularly necessary for minority and low-income urban students, many of whom experience more out-of-school stressors than students who are neither minority nor living in urban, high-poverty areas (Balfanz, Herzog, & MacIver, 2007; Bowen & Bowen, 1999).
The first and perhaps most essential need is to keep underprivileged students in school. Young people at highest risk of disengaging need “a seamless system of wraparound supports for the child, the family, and schools, to target student’s academic and non-academic barriers to learning” (Anderson-Moore et al., 2014, p. 5). Skill development, job training, and extended learning opportunities are what keep students engaged and achieving (Castrechini & London, 2012).
Addressing barriers depends upon matching students to the right supports (Manekin, 2016). As we detail below, many school districts have responded with models and interventions that show some promise. Not all models are equally effective, however, and implementing any of them with fidelity is not easy. This leads us to the central questions of this review: What does research say about preventing student disengagement, re-engaging disconnected students, and improving student outcomes across the board? Do comprehensive community-school models have an empirical advantage over isolated components of support? What do we know about specific community-school models and their effects?
Components of Student Supports That Matter
Research has identified school attendance, behavior, and course performance (the ABCs) as strong predictors of high school completion (Allensworth & Easton, 2005; Balfanz et al., 2007). Given the negative outcomes associated with non-completion of high school (Allensworth, 2015), student supports should be identified early and focused intently on these areas (Balfanz, 2009; Cauley & Jovanovich, 2006).
Integrated Student Support (ISS) is a broad reform model that addresses these concerns and upon which community school models draw. ISS strategies seek to influence traditional academic measures (e.g., grades, attendance, and test scores) and to reduce specific barriers to achievement (e.g., access to health services, family economic sustainability) (Anderson-Moore et al., 2014).
A Child Trends (2014) meta-analysis of ISS models found significant, positive impacts in student- or school-level academic measures of student progress, attendance/absenteeism, and academic achievement. No discernible impacts were found on non-academic outcomes such as school attachment, parent-school involvement, and conduct problems.
Two general lessons emerge from the meta-analysis on ISS. First, the most comprehensive ISS models employed five core components that drove academic improvements:
- Comprehensive (social, emotional, and academic) needs assessments;
- Coordination and management of social, emotional, and academic supports for students;
- Integration of supports within schools;
- Extensive community partnerships;
- Use of student data to track progress.
These five components anchor the community school model. Second, the models that yield the strongest academic outcomes (i.e., City Connects and Communities in Schools) use a two-tiered approach of prevention and response. Tier I support aims at prevention (Anderson-Moore et al., 2014). Examples of Tier I activities include (but are not limited to): early-warning systems that identify academically at-risk students; universal application of social-emotional learning programs; extended learning opportunities; and connection to community-based services.
The tools of prevention also enable diagnosis and response. This is where Tier II supports come into play. Tier II supports are more personalized; they are intended for those students who face significant achievement barriers such as youth returning from incarceration or those with intense mental health needs.
Bearing in mind the five core components of community schools (needs assessments, coordination of support, integration of support within schools, extensive community partnerships, and the use of student data to track progress) and the two tiers (prevention and response), we now turn to specific models.
Introduction to Specific Models and Interventions
Researchers have also investigated specific community-school models, each of which deploys several of the five components above and aims to give students the support they need to be academically successful. City Connects and Diplomas Now are whole-school interventions and, as such, come closest to an overall community schools model. Big Picture Learning and Career Academies, by contrast, attempt to achieve the same ends through the creation of small, intimate learning communities. The models below are stratified by type of intervention (whole-school/small learning communities) and by the strength of the evidence supporting efficacy.
City Connects is a whole-school model that supports low-income students’ needs through individualized services (see below). Since launching in a single K-5 school, City Connects has expanded to 84 locations across eight states, with many locations serving secondary as well as elementary students.
City Connects hires and assigns a School Site Coordinator (SSC) to work within the school. The SSC, trained as a licensed school counselor or social worker, works to “connect students to a customized set of services through collaboration with families, teachers, school staff, and community agencies” (M. E. Walsh et al., 2014). SSCs work with school administrators and teachers to conduct a Whole School Review. This is where the customization of services begins. Every student in each class is evaluated to determine strengths and needs across four domains: academic performance, social and emotional skills, physical health, and family dynamics (Bowden et al., 2015).
Students identified as requiring more intensive services receive an Individual Student Review (ISR). Led by the SSC, the ISR team—“an existing structure in schools that can include school psychologists, teachers, principals, nurses, and, when appropriate, community agency staff”—develops specific strategies for at-risk students (M. E. Walsh, Madaus, et al., 2014). The SSCs host periodic reviews of students’ progress.
What drives the model is the SSC’s knowledge of, and relationships with, community service providers (Bowden et al., 2015). Being able to quickly identify and leverage supports allows for earlier and deeper interventions. Services are available to all students based on need, with participating students receiving an average of five supports. Supports fall into three levels of intensity:
- Preventive (e.g. tutoring, academic enrichment, summer arts programming);
- Early-intervention (e.g. mentoring and leadership skills programs, routine health care); and
- Intensive (e.g. psychological services, individual family counseling, emergency medical care) (Mary E. Walsh, Kenny, Wieneke, & Harrington, 2008).
Walsh and her colleagues conducted a quasi-experimental design (QED) evaluation to test the program’s effect on student achievement. The study used data from 7,500 students over nine school years (1999-2000 through 2008-2009). Students in participating schools received the full program treatment. Propensity-score methods were used to match them with similar students who did not receive City Connects services, and these matched students served as a control group. Researchers then compared the outcomes of the two groups to estimate the effect of City Connects.
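The matching logic behind such QED evaluations can be sketched in a few lines. This is a generic illustration of nearest-neighbor propensity-score matching, not the study’s actual code; the covariates, sample sizes, and outcome below are all synthetic.

```python
# Sketch of propensity-score matching as used in quasi-experimental
# program evaluations. All data here are synthetic; the covariates are
# hypothetical stand-ins, not the study's variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 2))            # e.g., prior test score, attendance
treated = rng.random(n) < 0.3          # True = received program services

# 1. Estimate each student's propensity to be treated from covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Nearest-neighbor match: pair each treated student with the
#    untreated student whose propensity score is closest.
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3. Compare outcomes across the matched groups (synthetic outcome with a
#    built-in treatment effect of 0.25 for illustration).
outcome = X.sum(axis=1) + 0.25 * treated + rng.normal(scale=0.5, size=n)
effect = outcome[t_idx].mean() - outcome[matches].mean()
print(f"estimated treatment effect: {effect:.2f}")
```

The estimate recovers something near the built-in effect only under the usual assumption that the measured covariates capture the relevant differences between treated and untreated students.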
Walsh et al. found that City Connects had a positive impact on academic achievement, although the impact was not statistically significant across all grades (3rd-8th) and outcome measures (GPA and MCAS scores in both math and ELA). For example, elementary students participating in City Connects showed statistically significant positive effects on ELA GPAs in 3rd-5th grades (with effect sizes ranging from 0.22-0.28 SD), a significant positive effect on math GPAs only in 5th grade (effect size of 0.16 SD), and no statistically significant effect on MCAS scores in elementary school.
The results for middle school students were more consistent. Students who received City Connects supports showed statistically significant increases in their MCAS scores in both ELA and math in 6th-8th grades, with effect sizes ranging from 0.15-0.33 SD in ELA and 0.18-0.45 SD in math. Students also had improved ELA GPAs in 6th and 7th grades (with effect sizes ranging from 0.19-0.35 SD), although the positive effect was not statistically significant in 8th grade. The researchers concluded that “the longer a student was enrolled in a City Connects elementary school, the better his or her middle school outcomes” (M. E. Walsh, Madaus, et al., 2014).
Additional research supports evidence of a lasting effect: a large-scale, quasi-experimental study of the program found that students who attend City Connects schools starting in the elementary grades are less likely to drop out than non-City Connects students (City Connects, 2014; M. E. Walsh, Lee-St. John, Raczek, Foley, & Madaus, 2014). This was true at each high-school grade level, most notably in ninth grade, when the likelihood of dropout is greatest (Manekin, 2016). Both studies support the notion that early identification of, and intervention against, out-of-school barriers can positively impact student achievement both initially and over time (Anderson-Moore et al., 2014).
What can districts expect to pay for City Connects? A cost-benefit analysis of the program found an average cost of $4,570 per student, of which schools bore roughly 10% (Bowden et al., 2015, p. 27, Table 3). An additional analysis found the program produces a benefit of $3 for every $1 spent, translating to a net societal benefit of $9,280 per student. One driver of the low cost is the work of the School Site Coordinator: organizing already-available services and streamlining referrals produces savings simply by eliminating duplication of effort (M. E. Walsh et al., 2014).
Diplomas Now is a comprehensive, whole-school model focused on middle and secondary school. The program’s implementation comes through a partnership between Talent Development Secondary (based at Johns Hopkins University), City Year, and Communities in Schools. The division of labor is as follows:
- Talent Development Secondary provides organizational, instructional, and curricular support to schools; organizes students and teachers into small learning communities; provides professional development; and coaches teachers to strengthen pedagogy.
- City Year provides AmeriCorps volunteers who support before- and after-school programs, provide tutoring and mentoring, and assist teachers in classrooms.
- Communities in Schools places site coordinators in schools to focus on students most at risk of disengaging by organizing non-academic services specific to their needs.
MDRC and ICF International conducted a rigorous random assignment evaluation of the impact of Diplomas Now and provided recommendations for implementation. The study used student data from 62 schools (33 middle, 29 high schools) across 11 urban school systems in school years 2011-12 and 2012-13. Thirty-two participating secondary schools were randomly assigned to the Diplomas Now model. The remaining schools either continued their existing practices and programs or pursued other reform models (non-DN schools). Five of the participating districts were among the nation’s largest, with only one falling outside the 100 largest. All participating schools were Title I eligible and served large populations of low-income and minority students (Corrin et al., 2014).
This study indicates that, in its first year of implementation, Diplomas Now had a positive preventative impact, reducing the percentage of students at risk of disengagement, but no significant impact on other key measures. Specifically, schools that participated in Diplomas Now showed a statistically significant increase in the percentage of students who exhibited no early-warning indicators. Diplomas Now also showed positive but not statistically significant impacts on attendance, behavior, and course performance.
Diplomas Now showed stronger results among middle school students than high school students. For example, the study found statistically significant increases in the percentage of sixth-graders with higher than 90% attendance rates and no early-warning indicators. Schools receiving Diplomas Now saw a 4.1 percentage point increase in students with better than 90% attendance rates and a 5.5 percentage point increase in students with no early-warning indicators after two years of implementation. While there were no other statistically significant effects at the middle-school level, the estimated effect sizes across multiple specifications were larger for middle school students than for high school students (Corrin, Sepanik, Rosen, & Shane, 2016).
When researchers surveyed teachers and students in middle and high school on their perceptions of school climate and parental involvement, they found:
“The Diplomas Now model had a positive and statistically significant impact on how strongly teachers agreed that the climate at their school was conducive to teaching and learning, suggesting that by the second year of implementation, the Diplomas Now model was positively affecting teachers’ perceptions of their school environment.”
Students surveyed from the Diplomas Now schools reported greater rates of participation in school-sponsored after-school programs (with effect sizes of 0.11 SD) and better adult-student relationships than the comparison group (with effect sizes of 0.11 SD) (Corrin et al., 2016).
Career Academies is another model that uses small learning communities to achieve student support and achievement. Pioneered in Philadelphia in 1969, the model now operates in districts throughout the country, including more than 1,200 Career Academies in nearly 500 California high schools. The most recent cost estimates (2004) for Career Academies are roughly “$600 per pupil more than a district’s average per-pupil expenditure (cost data refer to the California Partnership Academies)” (U.S. DOE, 2015).
The National Career Academy Coalition lists three features of a career academy (U.S. DOE, 2015):
- Small learning communities in which student cohorts share several classes each year and teachers collaborate around student [emotional and social] needs;
- An intensive [academic and vocational] curriculum with a career theme relevant to local industry and economic needs; and
- Partnerships with [local] employers, higher education institutions, and the non-profit community.
Career Academies operate as a “school-within-a-school” structure. Each academy has a career theme, such as health care, finance, technology, communications, or public service. Academy-based courses are often scheduled in blocks in the morning, leaving the remainder of the day for traditional academic coursework. Career-themed courses are taught by Academy teachers, who come from a variety of academic and vocational disciplines. In most cases, student cohorts work with the same teachers, in both traditional and Academy courses, through the entirety of their high school career (Kemple & Snipes, 2000).
Researchers at MDRC conducted an RCT evaluation of the Career Academies model, measuring student outcomes at the end of the student’s projected 12th grade year (Kemple & Snipes, 2000). Some 1,400 ninth-grade students across six states and the District of Columbia were randomly assigned to one of three conditions: enrollment in a Career Academy; waitlisting for a Career Academy with enrollment elsewhere; or a comparison group that was neither enrolled nor waitlisted. Two additional studies measured the original cohort’s outcomes at four and eight years after graduation (Kemple, 2004; Kemple & Willner, 2008).
The analyses reveal surprising results: Career Academies had much stronger effects on labor market outcomes than academic outcomes. For example, Career Academies did not show a statistically significant increase in the percentage of students who graduated from high school or obtained their GED; there were no discernible effects found for staying in or progressing through school; and Career Academies students did not show increased standardized math or reading test scores (Kemple, 2004; Kemple & Snipes, 2000). However, the program produced statistically significant positive and sustained impacts for men on a variety of labor market outcomes. For example, men who participated in Career Academies earned 11%, or $2,088, a year more than their non-Career Academies peers. In addition, through a combination of increased wages, increased hours worked, and increased job stability, men in the Career Academies treatment group increased their real earnings by 17%, or $3,731, per year (Kemple & Willner, 2008).
These reports also show nuanced effects of the program. Career Academies impacted certain groups more than others, especially men and students at highest risk of dropping out. For example, the earnings gains above applied only to men; women who participated in Career Academies saw no impact on their labor market outcomes (Kemple, 2004). In addition, for students at the highest risk of dropping out, Career Academies produced substantial improvements in high school outcomes (Kemple & Snipes, 2000): these students were less likely to drop out of school, completed more credits toward graduation, completed more academic core courses, were more likely to have taken three or more career or vocational courses, and were more likely to have applied to college. These findings suggest positive effects for those at highest risk of disengaging from school (Kemple & Snipes, 2000).
Big Picture Learning
Big Picture Learning (BPL) does not market itself as a community schools model, but it includes many core components of one. BPL is described as a “blended, student-centered approach to learning.” BPL’s goal is to educate one student at a time through real-world work experiences, both in and out of school. The BPL model stresses the importance of social-emotional learning, workplace and college readiness, and academic mastery (Priest, Rudenstine, Weisstien, & Gerwin, 2012), matching many of the key components listed earlier. The first BPL school, the Metropolitan Regional Career and Technical Center (the “Met”), opened in 1995 in Providence, RI, serving 50 mostly academically at-risk African American and Latino students. Since then, the BPL model has expanded to 65 high schools across the United States and overseas.
Students entering BPL schools come from various cultural and socioeconomic backgrounds (Littky et al., 2004). The common thread for many BPL students is disillusionment with traditional schooling’s failure to engage their passions (Castleman & Littky, 2007). The premise is that students’ taking control of their education makes a critical difference to their high school completion (Castleman & Littky, 2007). Students entering BPL schools are placed into a cohort of 10-15 students called an advisory. Each advisory works with one teacher (called a lead advisor), who guides the cohort through high school, and an internship mentor, who guides the out-of-school learning experience.
With guidance from the lead advisor, each student completes a portfolio-based survey to identify his or her personal interests, competencies, and academic and out-of-school needs. This becomes the foundation for the student’s individualized learning plan (ILP). The plan consists of the survey’s findings and must adhere to the school’s learning goals (Littky et al., 2004): empirical and quantitative reasoning; communication; social reasoning; and personal qualities. Next, students develop personal curricula that support the ILP. “The curriculum is project-based, aligned with state and national standards, and centered on contemporary issues that relate to students’ interests and activities.” Internships are also a large component of the curriculum. Here, students work closely with their mentors in their field of interest (e.g., a research lab, hospital, or design studio), learning in a real-world setting. Students prove their knowledge and capabilities through multiple demonstrations, including exhibitions and portfolios of their work (Washor, 2003).
The BPL model emphasizes the cultivation of students’ responsibility and resourcefulness. Students create their schedules and manage their learning plans. Staker and her colleagues (2011, p.29) explain:
“In some cases, students might use an online simulation technology, such as a virtual reality welding simulator to learn how to weld at a fraction of the cost (and burns) of real-life welding. When students need help with a topic, they might use an online tutorial or connect over email with their mentors. Students own their performance on state assessment tests, and in some cases, they turn to an online course to prepare them.”
To date, the BPL model has not been subject to robust research evaluation. However, an MPR Associates study of BPL alumni found positive outcomes for graduates of three BPL high schools (Rotermund, 2012). The descriptive analysis comprised two parts: a web-based survey sent to BPL alumni and an aggregate-level analysis of National Student Clearinghouse (NSC) data for the three BPL schools. The study found the following:
- 72% of survey respondents reported enrollment in a postsecondary institution. Slightly more than half (53%) of those enrollees reported working while in school.
- 74% of respondents who enrolled in college did so within one year of graduation, with 44% enrolling at four-year institutions.
- Post-secondary persistence rates for BPL students ranged from 88% to 91% from year one to year two.
- Two-thirds of college-going BPL grads reported full-time enrollment throughout their post-secondary education.
- Internships and the opportunity to build self-confidence through work-based learning were identified as the most important contributors to success in life after high school.
This report therefore provides some descriptive statistics on how the BPL experience shapes post-secondary outcomes. By empirical standards, however, this evidence is weak: the report establishes no causal link between BPL and outcomes, nor does it compare BPL students’ outcomes with those of similar students who did not attend BPL schools. More rigorous evaluation is needed to understand the effects of BPL.
The Role of Early Warning Systems
In 2007, the Consortium on Chicago School Research (CCSR) released data showing that freshmen in Chicago Public Schools (CPS) were doing poorly on the ABCs: more than half of all freshmen failed at least one course, the average GPA was below a C, and 40% of freshmen had missed more than one month of school (Allensworth & Easton, 2007). In response, CPS began to focus on course performance and commissioned the CCSR to create the On-Track tool for use in all CPS high schools.
On-Track is a tool that determines whether ninth grade students are making adequate progress to graduation based on credit completion and core-course failures. On-Track enables administrators and educators to monitor student performance via real-time data reports that identify students who are falling “off track” (Rosenkranz, de la Torre, Stevens, & Allensworth, 2014). It is then incumbent upon schools to respond with appropriate interventions. As such, the On-Track initiative is a hybrid model: it provides actionable analytics from which schools select Tier I or Tier II interventions to support student needs.
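The indicator itself is simple to compute. The sketch below assumes the published CCSR thresholds (at least five full-year credits earned and no more than one semester F in a core course); the function name and student records are illustrative, and the thresholds are left as parameters so a district could adapt them.

```python
# Minimal sketch of an On-Track-style ninth-grade indicator.
# Thresholds follow the CCSR definition but are parameterized;
# the student records below are synthetic.

def on_track(credits_earned: float, core_semester_fs: int,
             min_credits: float = 5.0, max_core_fs: int = 1) -> bool:
    """True when a ninth grader is making adequate progress to graduation."""
    return credits_earned >= min_credits and core_semester_fs <= max_core_fs

students = [
    {"id": "A", "credits_earned": 6.0, "core_semester_fs": 0},
    {"id": "B", "credits_earned": 4.5, "core_semester_fs": 0},  # short on credits
    {"id": "C", "credits_earned": 5.5, "core_semester_fs": 2},  # too many core Fs
]

# Students flagged here would be routed to Tier I or Tier II interventions.
off_track = [s["id"] for s in students
             if not on_track(s["credits_earned"], s["core_semester_fs"])]
print(off_track)  # → ['B', 'C']
```

The value of the tool lies less in the calculation than in surfacing it on real-time reports that prompt a school response.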
A recent post-test analysis of the long-term outcomes of On-Track found a positive relationship between initial increases in ninth-grade on-track rates and improved student outcomes (Roderick, Kelly-Kemple, Johnson, & Beechum, 2014):
- From 2007-2013, CPS increased the share of on-track students from 57% to 82% (25 percentage points), meaning roughly 6,900 additional students moved from 9th to 10th grade with sufficient credits and without significant course failure.
- African-American males had the greatest gains in improved on-track rates, increasing 28.3 percentage points from 2005-2013, followed by Latino males (25.3 percentage points), and African-American females (21.1 percentage points).
- By 2013, nearly 90% of all high schools using the On-Track model had on-track rates of 70% or above, compared to 25% in 2005.
To test the tool’s replicability, researchers at the Institute of Education Sciences (Hartman, Wilkins, Gregory, Gould, & D’Souza, 2011) used On-Track in five school districts across Texas to measure the tool’s predictive value, i.e., does the tool predict which students graduate on time, and which do not?
The study found:
- On-Track demonstrated predictive value in identifying ninth-grade students on track to graduate, with rates ranging from 61% to 86% across the five Texas districts.
This study implies, at the very least, that On-Track data is or could be actionable. Its findings are limited, however, because the districts were not randomly selected and are not representative of all Texas districts.
CPS gives schools latitude in choosing Tier I or Tier II interventions to respond to On-Track data. The most commonly used intervention is the Network for College Success (NCS). NCS is in place in 17 high schools that serve approximately 20% of the district’s high-school student population. NCS offers four programs designed to strengthen schools’ capacity to respond to at-risk students:
- Instructional improvement. NCS coaches establish instructional leadership teams (ILT) that respond to student performance data and develop school-wide instructional goals.
- Language, literacy and leadership. The NCS coaches help teachers address gaps in adolescent literacy. The goal is for students to hone their reading and writing skills and engage in deeper classroom discussion.
- On-Track to graduation and college readiness. The NCS coaches help partner schools develop On-Track teams to assess student grades, attendance, and behavior.
- College enrollment and success. NCS works with school counselors to overcome common barriers that keep students from considering, applying to, and enrolling in college.
NCS’s partner schools reported an average On-Track to Graduation rate of 86%, which is not very different from the district’s rate of 84%. These figures have not been controlled for student-, teacher-, or school-level factors. NCS may indeed be efficacious; we simply cannot tell from the data.
As the research above shows, On-Track has demonstrated positive effects as a predictive tool for identifying academically at-risk youth. Beyond employing the On-Track measures, however, we do not yet know which interventions are most effective in response to On-Track data. “What is clear is that no matter how a school increases on-track rates in ninth grade, graduation rates improve three years later” (Roderick et al., 2014, p. 8).
The Role of 21st Century Community Learning Centers
The 21st Century Community Learning Centers (21st CCLC) initiative provides federal dollars for out-of-school time (OST) academic enrichment opportunities, principally for students attending high-poverty and/or low-performing schools. States receive funding based on their share of Title I funding and, from there, award grants to various entities offering academically enriching OST services (see below) (Devaney, Naftzger, Liu, & Sniegowski, 2016). In FY 2016, the program had roughly $1.1 billion in funding, with an average state award of $21.9 million.
21st CCLC funding supports an array of OST programming, including 21st CCLC-branded activities (Harvard Family Research Project, 2012). States receiving 21st CCLC funding can apply those funds to a wide array of afterschool programming including:
- Academic enrichment activities that can help students meet state and local achievement standards;
- Complementary services and activities, including: drug and violence prevention programs, career and technical programs, counseling programs, art, music, and recreation programs, STEM programs, and character education programs; and
- Literacy and related educational development services to the families of children participating in the program.
While the array of programming is broad, 21st CCLC funding primarily provides remediation and academic support services:
- An AIR evaluation of Texas’s 21st CCLC programming observed that most activities (73%) were designed to build skills in a specific academic content area (e.g., an observed small-group math tutoring session) (Devaney et al., 2015, p. 103).
- Analysis of West Virginia’s 21st CCLC programming (Hammer & White, 2015, p. 4) found the highest percentage of student referrals to programming were for “academic support (tutoring, remediation).”
- Florida’s multi-year 21st CCLC funded Read to Succeed principally works with “students in need of academic remediation” (Capital City Consultants, LLC, 2016, p. 1).
While academic supports drive the program, 21st CCLC funding is also used to provide non-academic services, such as after-school meals, life skills training (Erquiaga, 2017), and substance abuse prevention programming (Hammer & White, 2015).
Several state evaluations of 21st CCLC programming have shown evidence of positive impact on student attendance, behavior, and coursework:
- An evaluation of Washington state’s 21st CCLC programming showed statistically significant increases in cumulative GPA (with effect sizes of 0.20-0.46 SD) and course credit accumulation (with effect sizes of 0.10-0.14 SD) for high school students who participated in the 60-plus-day program, compared to non-participants (Naftzger, Vinson, Liu, Zhu, & Foley, 2014).
- Rhode Island’s 21st CCLC program participants reported 70% fewer absences and 72% fewer disciplinary referrals than non-participants (Vinson, Marchand, Sparr, & Moroney, 2013).
- High-school students in Texas’s 21st CCLC 60-plus-day programming were 97% more likely to be promoted to the next grade than non-participants in the same school (Devaney et al., 2015).
The evidence implies that 21st CCLC funding provides some positive impact. However, these estimates are correlational rather than causal. When more rigorous methods were used to evaluate 21st CCLC programming, results were mixed, depending on the outcome measured. James-Burdumy, Dynarski, and Deki (2008) applied both randomized control and quasi-experimental design (QED) methods to the same data (James-Burdumy et al., 2005) to test the program’s impact on student behavior. James-Burdumy and her colleagues employed the following design for the two components of the original study:
- Random assignment of 2,308 elementary students in 12 school districts; and
- A matched comparison design including 4,264 middle school students in 32 districts.
Through their analysis, James-Burdumy and her colleagues found that 21st CCLC programs had negative effects on student behavior. Compared to the control group, elementary students in the treatment group had statistically significantly higher rates of disciplinary referrals for behavior, parental contact regarding a child’s behavior, and suspensions.
Findings for middle school students were mixed, with no discernible difference between the treatment and control groups on most behavior indicators. However, middle school students who participated in 21st CCLC showed statistically significant increases in certain negative behaviors: they were 2.4 percentage points more likely to break things on purpose and 0.06 percentage points more likely to take illegal drugs (James-Burdumy et al., 2008).
At the same time, an analysis of the impact of 21st CCLC-funded programming on student achievement (Johnston-Gross, 2016) found positive, statistically significant effects. The study compared two 21st CCLC-funded cohorts (n=274 public schools) to non-funded cohorts (n=11,077 public schools) across two school years (2002-03 and 2003-04), employing difference-in-differences techniques to estimate the programming’s impact. Johnston-Gross’s findings revealed that schools receiving the program experienced a higher percentage of students meeting or exceeding state assessment standards on all tests. Specifically, schools that received the program showed 1.3% higher test scores in the first year and 2.1% higher test scores in the second year compared to schools without the intervention. Further, the gains were highest for middle schools receiving the intervention: 9% in both the first and second years. These findings are statistically significant and plausibly causal, given the methodology used (Johnston-Gross, 2016).
However, while these findings are positive, the author notes that:
“…returns diminish as the percentage of low income students increases. Some reasons for this include the instability that poverty may bring to a student’s life such as higher risk of moving schools and lower attendance as evidenced by increasing coefficients on % Mobility and % Chronic Truancy.”
Based on the evidence, the impacts of 21st CCLC programming are mixed. More research is needed to better understand when the program will and will not be effective.
The Limitations of Existing Research
The research on the models discussed above provides early indicators of the positive effects of community-school models on improving student achievement and preventing disengagement. However, the majority of studies evaluate program exposure with no control comparison.
Beyond the largely, if modestly, positive findings and the limitations of existing research, the community-school model faces challenges from other angles. First, for certain models (e.g., Big Picture Learning), a potential hurdle is the criticism faced by all student-centered learning approaches: some educators fear that moving students through an instructional program at their own pace lowers expectations for low-income and minority students (Banchero, 2014). Second, teacher surveys reveal mistrust of the community-schools approach as just another pedagogical fad (Richmond, n.d.). Finally, not all districts are equipped to support the model (e.g., with professional development on identifying signs of trauma) (Richmond, n.d.); in a world of finite resources, even promising programs often go under-funded (Manekin, 2016).
The programs and models reviewed above have real potential to improve student outcomes and to reengage students who have effectively written education off. Caution must be exercised in choosing a way forward, however: not all programming is equally effective, nor is all the research on community schools of high quality. School leaders, community organizations, and policymakers should consider not only thoughtful and intentional programmatic design but also long-term fidelity of implementation. In such decisions, they should focus on the salient characteristics of high-quality programs: a centralized coordinator, clearly outlined directives and goals, and the organization of existing external services.
Afterschool Alliance. (n.d.). 21st Century Community Learning Centers: Providing Local Afterschool and Summer Learning Programs for Families. Retrieved April 12, 2017, from http://www.afterschoolalliance.org/policy21stcclc.cfm
Allensworth, E. M. (2015). Dropout prevention: A previously intractable problem addressed through systems for monitoring and supporting students. In K. Bosworth (Ed.), Prevention Science in School Settings: Complex relationships and processes. (1st ed., pp. 353–370). New York, N.Y.: Springer International Publishing.
Allensworth, E. M., & Easton, J. Q. (2005). The On-Track Indicator as a Predictor of High School Graduation. University of Chicago: Consortium on Chicago School Research. Retrieved from http://consortium.uchicago.edu/publications/track-indicator-predictor-high-school-graduation
Allensworth, E. M., & Easton, J. Q. (2007). What Matters for Staying On-Track and Graduating in Chicago Public High Schools: A Close Look at Course Grades, Failures, and Attendance in the Freshman Year. Chicago, IL: University of Chicago Consortium on Chicago School Research. Retrieved from http://solaris.techlab360.org/sites/default/files/document_library/07%20What%20Matters%20Final.pdf
Anderson-Moore, K., Caal, S., Carney, R., Lippman, L., Li, W., Muenks, K., … Terzian, M. (2014). Making the Grade: Assessing the Evidence for Integrated Student Supports (No. 2014–7). Bethesda, Md.: Child Trends. Retrieved from https://www.childtrends.org/?publications=making-the-grade-assessing-the-evidence-for-integrated-student-supports
Angrist, J. D., & Pischke, J.-S. (2009). Mostly harmless econometrics: an empiricist’s companion. Princeton: Princeton University Press.
Balfanz, R. (2009). Putting Middle Grades Students on the Graduation Path: A Policy and Practice Brief. Center for Talent and Development: Johns Hopkins University.
Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle-grades schools: Early identification and effective interventions. Educational Psychologist, 42(4), 223–235.
Banchero, S. (2014, March 10). Shaking Up the Classroom. Wall Street Journal. New York, NY. Retrieved from https://www.wsj.com/news/articles/SB10001424052702304899704579391101344310812
Bowden, A. B., Belfield, C. R., Levin, H. R., Shand, R., Wang, A., & Morales, M. (2015). A benefit-cost analysis of City Connects. Center for Benefit-Cost Studies in Education: Teachers College, Columbia University.
Bowen, N. K., & Bowen, G. L. (1999). Effects of Crime and Violence in Neighborhoods and Schools on the School Behavior and Performance of Adolescents. Journal of Adolescent Research, 14(3), 319–342. https://doi.org/10.1177/0743558499143003
Capital City Consultants, LLC. (2016). 21st Century Community Learning Centers Read 2 Succeed Summative Evaluation 2015-2016. Tallahassee, FL: Florida Department of Education. Retrieved from http://www.r2succeed.org/default2.asp?active_page_id=1029
Castleman, B., & Littky, D. (2007, May). Learning to Love. Educational Leadership, 64(8).
Castrechini, S., & London, R. A. (2012). Positive Student Outcomes in Community Schools. Washington DC: Center for American Progress. Retrieved from https://www.americanprogress.org/wp-content/uploads/issues/2012/02/pdf/positive_student_outcomes.pdf
Cauley, K. M., & Jovanovich, D. (2006). Developing an Effective Transition Program for Students Entering Middle School or High School. The Clearing House, 80(1), 15–25.
City Connects. (2014). The Impact of City Connects: Progress Report 2014. Chestnut Hill, MA: Boston College Center for Optimized Student Support. Retrieved from http://www.bc.edu/content/dam/city-connects/Publications/CityConnects_ProgressReport_2014.pdf
Corrin, W., Sepanik, S., Gray, A., Fernandez, F., Briggs, A., & Wang, K. K. (2014). Laying Tracks to Graduation: The First Year of Implementing DIPLOMAS NOW. New York, N.Y.: MDRC.
Corrin, W., Sepanik, S., Rosen, R., & Shane, A. (2016). Addressing early warning indicators: Interim impact findings from the investing in innovation (i3) evaluation of Diplomas Now. New York, N.Y.: MDRC.
Devaney, E., Naftzger, N., Liu, F., Cristobal, S., Vinson, M., Hutson, M., … Hallman, S. (2015). Texas 21st Century Community Learning Centers 2012–13 and 2013–14 Combined Evaluation Report. Washington, DC: American Institutes for Research.
Devaney, E., Naftzger, N., Liu, F., & Sniegowski, S. (2016). Texas 21st Century Community Learning Centers 2014–15 Evaluation Report. Washington, DC: American Institutes for Research.
Erquiaga: Why Trump’s “America First” Budget Puts Children in Poverty Last. (2017, April 9). Retrieved April 13, 2017, from https://www.the74million.org/article/erquiaga-why-trumps-america-first-budget-puts-children-in-poverty-last
Hammer, P. C., & White, L. (2015). 21st Century Community Learning Centers: A Descriptive Evaluation for 2013-2014. Charleston, WV: West Virginia Department of Education, Division of Teaching and Learning, Office of Research.
Hartman, J., Wilkins, C., Gregory, L., Gould, L. F., & D’Souza, S. (2011). Applying an on‑track indicator for high school graduation: adapting the Consortium on Chicago School Research indicator for five Texas districts. Washington DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved from http://ies.ed.gov/ncee/edlabs
Harvard Family Research Project. (2012). 21st Century Community Learning Centers Stable Funding for Innovation and Continuous Improvement (Issue Brief No. 8) (pp. 1–8). Cambridge, MA: Harvard University Graduate School of Education. Retrieved from http://www.hfrp.org/publications-resources/publications-series/research-updates-highlights-from-the-out-of-school-time-database/research-update-8-21st-century-community-learning-centers-stable-funding-for-innovation-and-continuous-improvement
HOW IT WORKS – About Us – Big Picture Learning. (n.d.). Retrieved March 7, 2017, from http://www.bigpicture.org/apps/pages/index.jsp?uREC_ID=389353&type=d&pREC_ID=882356
James-Burdumy, S., Dynarski, M., & Deke, J. (2008). After-school program effects on behavior: Results from the 21st Century Community Learning Centers program national evaluation. Economic Inquiry, 46(1), 13–18. https://doi.org/10.1111/j.1465-7295.2007.00074.x
James-Burdumy, S., Dynarski, M., Moore, M., Deke, J., Mansfield, W., Pistorino, C., & Warner, E. (2005). When Schools Stay Open Late: The National Evaluation of the 21st Century Community Learning Centers Program Final Report. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
Johnston-Gross, M. (2016). Measuring the Impact of 21st Century Community Learning Centers (No. 19). Stevenson Center for Community and Economic Development: Illinois State University. Retrieved from http://ir.library.illinoisstate.edu/scced/19/
Kemple, J. J. (2004). Career Academies: Impacts on labor market outcomes and educational attainment. New York, NY: MDRC. Retrieved from http://www.mdrc.org/sites/default/files/full_49.pdf
Kemple, J. J., & Snipes, J. C. (2000). Career Academies: Impacts on students’ engagement and performance in high school. New York, NY: MDRC. Retrieved from http://www.mdrc.org/sites/default/files/Career_Academies_Impacts_on_Students.pdf
Kemple, J. J., & Willner, C. J. (2008). Career Academies: Long-term impacts on labor market outcomes, educational attainment and transitions to adulthood. New York, NY: MDRC. Retrieved from http://www.mdrc.org/sites/default/files/full_50.pdf
Levin, H., & Belfield, C. R. (2009). Some economic consequences of improving math performance. (Technical Report No. 1). Menlo Park, CA: SRI International, Center for Technology in Learning.
Littky, D., Diaz, N., Dolly, D., Hempel, C., Plant, C., Price, P., & Grabelle, S. (2004, May). Moment at the Met. Educational Leadership, 61(8).
Manekin, S. (2016). City Connects: Redesigning Student Support for Academic Success (The Abell Report No. 3). Baltimore, MD: Abell Foundation.
Marzano Research Laboratories. (2011). Appendix B: What is an Effect Size? Retrieved from https://soltreemrls3.s3-us-west-2.amazonaws.com/marzanoresearch.com/media/documents/pdf/AppendixB_DTLGO.pdf
Naftzger, N., Vinson, M., Liu, F., Zhu, B., & Foley, K. (2014). Washington 21st Century Community Learning Centers Program Evaluation: Year 2. Chicago, IL: American Institutes for Research.
OUR STORY – About Us – Big Picture Learning. (n.d.). Retrieved March 7, 2017, from http://www.bigpicture.org/apps/pages/index.jsp?uREC_ID=389353&type=d&pREC_ID=882353
Priest, N., Rudenstine, A., Weisstien, E., & Gerwin, C. (2012). Making mastery work: A closeup view of competency education. Quincy, MA: Nellie Mae Education Foundation. Retrieved from https://1.cdn.edl.io/0CCJxEP7wFihDHfkD3fCk5UxRvXe25U8H3BB8IYo78RmMs9Q.pdf
Richmond, E. (n.d.). Student-Centered Learning. Retrieved March 28, 2017, from http://www.ewa.org/student-centered-learning
Roderick, M., Kelly-Kemple, T., Johnson, D. W., & Beechum, N. O. (2014). Preventable Failure: Improvements in Long-Term Outcomes when High Schools Focused on the Ninth Grade Year. Chicago, IL: University of Chicago Consortium on Chicago School Research. Retrieved from https://consortium.uchicago.edu/sites/default/files/publications/On-Track%20Validation%20RS.pdf
Rosenbaum, P. R., & Rubin, D. B. (1983). The Central Role of the Propensity Score in Observational Studies for Causal Effects. Biometrika, 70(1), 41–55. https://doi.org/10.2307/2335942
Rosenkranz, T., de la Torre, M., Stevens, W. D., & Allensworth, E. M. (2014). Free to Fail or On-Track to College: Why Grades Drop When Students Enter High School and What Adults Can Do About It. Chicago, IL: University of Chicago Consortium on Chicago School Research. Retrieved from https://consortium.uchicago.edu/sites/default/files/publications/FoF%20Why%20Grades%20Drop.pdf
Rotermund, S. (2012). Big Picture Learning: High School Alumni Report. Berkeley, CA: MPR Associates. Retrieved from https://1.cdn.edl.io/vGIpPEdMJfGhhJdIeQYfWT5OJ1oR2YyyjCgWLjqrqpaG8rLt.pdf
Rothstein, R. (2010). How to Fix Schools (No. 286). Washington, DC: Economic Policy Institute. Retrieved from http://www.epi.org/publication/ib286/
Staker, H., Chan, E., Clayton, M., Hernandez, A., Horn, M. B., & Mackey, K. (2011). The Rise of K–12 Blended Learning: Profiles of emerging models. Innosight Institute. Retrieved from http://files.eric.ed.gov/fulltext/ED535181.pdf
Stuart, E. A., & Rubin, D. B. (2008). Best practices in quasi-experimental designs. Best practices in quantitative methods, 155-176.
U.S. Department of Education. (2015). Dropout Prevention intervention report: Career Academies. Washington, DC: U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse. Retrieved from https://ies.ed.gov/ncee/wwc/Docs/InterventionReports/wwc_careeracademies_092215.pdf
Vinson, M., Marchand, J., Sparr, M., & Moroney, D. (2013). Rhode Island 21st Century Community Learning Centers Program Evaluation: Evaluation Report 2011-12. Chicago, IL: American Institutes for Research.
Walsh, M. E., Lee-St. John, T., Raczek, A., Foley, C., & Madaus, G. (2014). Reducing High School Dropout through Elementary School Student Support. Chestnut Hill, MA: Center for Optimized Student Support, Lynch School of Education, Boston College.
Walsh, M. E., Madaus, G. F., Raczek, A. E., Dearing, E., Foley, C., An, C., … Beaton, A. (2014). A New Model for Student Support in High-Poverty Urban Elementary Schools: Effects on Elementary and Middle School Academic Outcomes. American Educational Research Journal, 51(4), 704–737. https://doi.org/10.3102/0002831214541669
Walsh, M., Gish, J. W., Foley, C., Theodorakakis, M., & Rene, K. (2016). Principles of Effective Practice for Integrated Student Support. Chestnut Hill, MA: Center for Optimized Student Support, Lynch School of Education, Boston College. Retrieved from http://www.bc.edu/content/dam/files/schools/lsoe/cityconnects/pdf/Policy%20Brief%20-%20Building%20Sustainable%20Interventions%20web.pdf
Washor, E. (2003). Innovative Pedagogy and School Facilities. Minneapolis, MN: DesignShare. Retrieved from http://www.designshare.com/Research/Washor/Pedagogy%20and%20Facilities.pdf
What is Big Picture Learning? – Bellevue Big Picture School. (n.d.). Retrieved from http://www.bsd405.org/bigpicture/about/what-is-big-picture-learning/
 Student progress measures include the following: “credit completion, grade retention, high school dropout and promoting power” (Anderson-Moore et al., 2014, p. 62).
 Child Trends conducted a meta-analysis of quasi-experimental (QED) and randomized control (RCT) designed evaluations of three ISS based interventions: Communities in Schools (CIS); Comer School Development Program (Comer SDP); and City Connects (CCNX).
 City Connects launched in 2002.
 Randomized control trials (RCTs) randomly assign one part of a group to receive an intervention or “treatment” (the treatment group), while the rest of the group is randomly assigned not to receive it (the control group). After a period in which the treatment group receives the intervention and the control group does not, the outcomes of the two groups are compared, and the difference between them is attributed to the intervention or treatment. RCTs provide the highest level of assurance that the estimated relationship between a treatment and an outcome is causal, and that the results are not due to some other unobserved characteristic, because random assignment ensures that there is no systematic difference between the two groups. QEDs use observational data, where no randomization took place, but try to re-create randomization as best as possible. QED methods therefore provide a plausibly causal relationship between a treatment and an outcome (i.e., we have more assurance that the relationship is causal than if the observational data were used without QED methods, but less assurance than if the data came from an RCT). For more information, see Stuart and Rubin (2008) or the What Works Clearinghouse webinar.
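The logic of random assignment can be illustrated with a toy simulation. All numbers below (sample size, score scale) are invented for illustration and do not come from any study cited here; the point is only that random assignment leaves pre-existing characteristics nearly balanced across groups.

```python
import random
import statistics

# Hypothetical illustration: random assignment balances background traits.
random.seed(42)

# Simulate baseline scores for 2,000 students (mean 500, SD 50).
students = [random.gauss(500, 50) for _ in range(2000)]

# Randomly split into a treatment group and a control group.
random.shuffle(students)
treatment, control = students[:1000], students[1000:]

# The baseline gap between groups is small relative to the SD of 50, so a
# later outcome difference can plausibly be attributed to the treatment.
gap = abs(statistics.mean(treatment) - statistics.mean(control))
print(f"Baseline gap between groups: {gap:.1f} points")
```

With observational data, by contrast, nothing guarantees this baseline balance, which is why QED methods must reconstruct it.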
 Propensity scoring is a QED method that is used to create a control group from observational data. The control group is selected so that its members are observationally similar to the treatment group. Researchers do this by estimating the probability that a student in the treatment group would be assigned to the treatment, given their observable characteristics. Researchers then estimate this probability for students who did not receive the treatment, and “match” the student from the treatment group with student(s) with similar probabilities who did not receive the treatment. For more information, see Stuart and Rubin (2008) or Rosenbaum and Rubin (1983).
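A minimal sketch of the matching step described above, assuming propensity scores have already been estimated (the students and scores below are hypothetical; in practice the scores would come from a model such as logistic regression on observable characteristics):

```python
# Hypothetical propensity scores: student -> probability of treatment.
treated = {"s1": 0.72, "s2": 0.55, "s3": 0.31}
untreated = {"u1": 0.70, "u2": 0.50, "u3": 0.33, "u4": 0.90}

def match(treated, untreated):
    """Match each treated student to the untreated student with the
    closest propensity score (nearest neighbor, with replacement)."""
    return {
        t: min(untreated, key=lambda u: abs(untreated[u] - p))
        for t, p in treated.items()
    }

print(match(treated, untreated))
# {'s1': 'u1', 's2': 'u2', 's3': 'u3'}
```

The matched untreated students then serve as the constructed control group.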
 Massachusetts Comprehensive Assessment score (MCAS).
 “This study followed students’ longitudinal data records in a large urban high-poverty district from their initial enrollment in a City Connects elementary school, continuing after they left the intervention into middle and high school (N=2,265). A comparison group of students who attended school in the same district at the same time but never attended a City Connects school was also followed (N=19,979)” (M. E. Walsh, Lee-St. John, Raczek, Foley, & Madaus, 2014).
 “Depending on what share of the community partner services are considered to be above and beyond the baseline level, the total cost estimate range from $1,540 to $9,320 per student” (Bowden et al., 2015, p. 3).
 An analysis of the distribution of core costs for City Connects in Boston found that schools bore 10.1% of costs, in terms of staff, management, and some supplies. City Connects bore 89.5% of costs, primarily through central program staff and SCCs. The remaining 0.4% was borne by parents (i.e., time off work, transportation) during parental involvement activities (Bowden et al., 2015, p. 27).
 The program produced a societal benefit of $13,850 per student at a cost of $4,570. Subtracting the costs, the net value is $9,280, or roughly $3 in benefits per $1 of cost. Bowden and colleagues determined societal benefits based on “lifetime social impacts” (e.g., future labor productivity, higher rates of employment, less dependence on the social safety net, and lower incarceration rates) (Levin & Belfield, 2009).
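The net-value arithmetic from Bowden et al. (2015) can be restated as a short sketch:

```python
# Per-student figures reported by Bowden et al. (2015).
benefit_per_student = 13_850   # estimated lifetime societal benefit ($)
cost_per_student = 4_570       # program cost per student ($)

net_value = benefit_per_student - cost_per_student
benefit_cost_ratio = benefit_per_student / cost_per_student

print(f"Net value per student: ${net_value:,}")   # $9,280
print(f"Benefit-cost ratio: {benefit_cost_ratio:.1f}")  # ~3.0, i.e., about $3 per $1
```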
 Participating schools’ student populations were at least 80 percent eligible for free or reduced-price lunch and 83 percent black and Hispanic.
 Note that early warning indicators here are defined as better than 85% attendance, less than 3 days of suspension or expulsion, and passing both ELA and math courses. There was no statistically significant difference between the treatment and control groups when a more rigorous definition of early warning indicators was used (that is, better than 90% attendance, no suspensions or expulsions, and passing all core courses of ELA, math, social studies, and science). The authors of the study state that this more rigorous definition of early warning indicators is “suggestive of a more stable educational trajectory” (p. iii).
 All effect sizes are significant at a 10% level or higher.
 Study meets WWC group design standards without reservation.
 Schools were selected in each of the following states: MD, PA, FL (2), TX, and CA (3).
 Note, however, that nearly all (96%) of Career Academies students completed secondary education within eight years of their expected graduation date, compared to 94% of the control group (Kemple & Willner, 2008). While not statistically significant, the effect size was 0.27, which is considered substantively important based on What Works Clearinghouse calculations (see the What Works Clearinghouse Procedures and Standards Handbook, Section IV: Reporting on Findings, for additional information). Further, the What Works Clearinghouse rated the program’s effects on completing school as potentially positive within its Dropout Prevention topic area.
 Specifically, 21% of Career Academies students versus 32% of non-Career Academies students dropped out of high school.
 Specifically, 40% of Career Academies students versus 26% of non-Career Academies students increased course completion towards graduation.
 Specifically, 32% of Career Academies students versus 16% of non-Career Academies students completed academic core courses.
 Specifically, 58% of Career Academies students versus 38% of non-Career Academies completed 3 or more vocational courses.
 Specifically, 51% of Career Academies students versus 35% of non-Career Academies students applied to college.
 The authors urge caution when making inferences about causal relationships of the students who are at highest risk due to external factors not tested (i.e. stronger support from teachers and peers).
 (“HOW IT WORKS – About Us – Big Picture Learning,” n.d.): http://www.bigpicture.org/apps/pages/index.jsp?uREC_ID=389353&type=d&pREC_ID=882356
 (“OUR STORY – About Us – Big Picture Learning,” n.d.): http://www.bigpicture.org/apps/pages/index.jsp?uREC_ID=389353&type=d&pREC_ID=882353
 (“What is Big Picture Learning?” n.d.): http://www.bsd405.org/bigpicture/about/what-is-big-picture-learning/
 MPR Associates designed a web-based survey that was sent to all alumni of the three participating high schools. The survey’s overall response rate was 46%. Response rates of at least 70% are considered necessary for results to be generalizable, and only one of the schools surveyed met that threshold; thus the results may be subject to response-rate bias (Rotermund, 2012, p. 6).
 CPS considers students on-track if they have accumulated five full course credits (the number needed for promotion to tenth grade per CPS policy) and have no more than one F (that is, one-half of a full credit) in a core subject (English, math, science, or social studies) in a given semester (Allensworth & Easton, 2005).
 This study is a descriptive analysis of On-Track post-implementation. Since On-Track is system-wide and no control group is utilized, the outcome data should be treated as correlational with – rather than as causally linked to – the tool and subsequent interventions.
 Examples of grantees include LEAs, nonprofit and for-profit organizations, and institutions of higher education.
 List of potential programming compiled from http://www.afterschoolalliance.org/policy21stcclc.cfm (Afterschool Alliance, n.d.)
 Washington State 21st CCLC activity clusters are 65.8% academic (45.2% Academic Enrichment; 10.3% Homework Help; 10.3% Tutoring) vs. 34.2% non-academic (7.7% Recreational; 26.5% Variety). Variety activities include drug/violence prevention, counseling, or character education; promotion of family literacy; mentoring; service learning activities; life skills; and nutrition (Naftzger, Vinson, Liu, Zhu, & Foley, 2014, p. 15).
 There was also a statistically significant positive increase in the percentage of credits earned for the 30-plus-day program in both years of the study, but a statistically significant decrease in cumulative GPA for participants in the 30-plus-day program during both years of the study.
 Rhode Island 21st CCLC activity clusters are 38% academic (34% Academic Enrichment; 4% Homework Help) vs. 62% non-academic (32% Recreational; 30% Variety). Variety activities include drug/violence prevention, counseling, or character education; promotion of family literacy; mentoring; service learning activities; life skills; and nutrition (Vinson, Marchand, Sparr, & Moroney, 2013, p. 17).
 “The activities observed were primarily either academic enrichment (73 percent) characterized by an intentional effort to build youth skills in a specific academic content area or nonacademic enrichment (26 percent)… which generally supported youth development” (Devaney et al., 2015, p. 103).
 Specifically, 22% of students in the treatment group reported being disciplined for behavior compared with 17% of students in the control group. This is a statistically significant effect size of 0.16 SD.
 Specifically, 28% of students in the treatment group had teachers who reported that they had to call parents about a child’s behavior “two or more times” compared with 23% of students in the control group. This is a statistically significant effect size of 0.12 SD.
 Specifically, 12% of students in the treatment group were suspended in the most recent school year compared with 8% of students in the control group. This is a statistically significant effect size of 0.16 SD.
 Full sample data consisted of schools at the elementary, middle and high school levels.
 Difference-in-differences is a QED statistical technique that approximates experimental research using observational study data. Researchers compare the change in outcomes over time for a treatment group with the change over the same period for a control group; the impact estimate is the difference between those two average changes. The method is often used in public policy analysis to estimate impact and effectiveness (Angrist & Pischke, 2009, pp. 169–174).
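As a sketch, assuming hypothetical average pass rates for funded and comparison schools (none of these numbers come from Johnston-Gross, 2016), the difference-in-differences estimate is the treated group's change minus the comparison group's change:

```python
# Hypothetical mean percentages of students meeting state standards,
# before and after the program; invented purely for illustration.
funded_pre, funded_post = 58.0, 62.4        # 21st CCLC-funded schools
unfunded_pre, unfunded_post = 60.0, 61.1    # comparison schools

# Impact = (change in treated group) - (change in comparison group),
# which nets out trends that affect both groups alike.
did_estimate = (funded_post - funded_pre) - (unfunded_post - unfunded_pre)
print(f"Estimated program impact: {did_estimate:.1f} percentage points")
```

Note how the 1.1-point gain in the comparison group is subtracted out, so the estimate reflects only the change attributable to the program, under the assumption that both groups would otherwise have followed parallel trends.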