

Evidence of Positive Impact on Afterschool and Summer Programs in Palm Beach County:
A 10-year Validation Study of Prime Time’s Integrated Quality Improvement System

LISA M. LINDEMAN, PH.D.
CHARLES SMITH, PH.D.
STEPHEN C. PECK, PH.D.
SUZETTE L. HARVEY

AUGUST 2019


Evidence of Positive Impact on Afterschool and Summer Programs in Palm Beach County:
A 10-year Validation Study of Prime Time’s Integrated Quality Improvement System

LISA M. LINDEMAN, PH.D.
CHARLES SMITH, PH.D.
STEPHEN C. PECK, PH.D.
SUZETTE L. HARVEY

AUGUST 2019

PRIME TIME PALM BEACH COUNTY
2300 HIGH RIDGE ROAD, SUITE 330
BOYNTON BEACH, FL 33426
561.732.8066
WWW.PRIMETIMEPBC.ORG

COPYRIGHT © 2019 PRIME TIME PALM BEACH COUNTY. ALL RIGHTS RESERVED.


CONTENTS

About Prime Time Palm Beach County
Summary of Findings
Raising Program Quality
Connecting Services to Quality
PRACTITIONER SPOTLIGHT
Engaging Programs
PRACTITIONER SPOTLIGHT
PRACTITIONER SPOTLIGHT
Professionalizing the Field
Retaining Skilled Practitioners
Helping Children and Youth Succeed
PRACTITIONER SPOTLIGHT
Conclusions
Recommendations
References
Appendix


Afterschool and summer programs, also known as out-of-school time (OST) programs, provide children and youth with a crucial environment in which to learn and develop. When supportive and engaging, OST programs contribute to cognitive, social, and emotional growth as well as improved academic performance. In a high-quality OST program, children are more likely to develop personal skills needed to succeed in school and in life.

During the past decade, research has demonstrated that high quality is critical for OST programs to achieve positive outcomes (Durlak et al., 2010, 2011; McCombs, Whitaker, & Yoo, 2017). Youth experience fewer or no benefits when attending lower-quality programs, but high-quality programs contribute to youth development, well-being, and academic success. For that reason, improving program quality through coaching, training, and related supports is Prime Time’s mission. Prime Time is a nonprofit organization that helps children and youth succeed by strengthening and expanding the quality of afterschool and summer programs. This report examines data demonstrating Prime Time’s success in fulfilling this mission.

“The first thing I did when I became an afterschool director was the career pathway, I went back to school. Just the education part that Prime Time offers to help you advance your career is amazing! The networking events and the networking we do at trainings has been incredible. The people that I meet through these events and then use these connections to bring opportunities to my kids like the zoo, fishing on a charter boat in the middle of the ocean, Kennedy Space Center… they would have never of done that if it wasn’t for Prime Time.”
Frank Verney, City of West Palm Beach Coleman Park Community Center


ABOUT PRIME TIME

In the mid-1990s, key stakeholders recognized that the Palm Beach County afterschool field needed to organize, strengthen, and improve so that it could better serve its families. As a result, the Out-of-School Time Consortium was formed in 1996. Initial partners and funders included the Children’s Services Council of Palm Beach County (which remains Prime Time’s primary funder), the School District of Palm Beach County, the Palm Beach County Department of Parks and Recreation, the U.S. Department of Education, The Mary and Robert Pew Public Education Fund, and The John D. and Catherine T. MacArthur Foundation.

The Consortium’s initial purpose was to share resources and enhance afterschool and summer programs that served elementary and middle school-age children and youth. In time, the Consortium created a Coordinating Council to investigate critical issues and devise a plan for creating the first non-governmental afterschool intermediary in Florida: Prime Time Palm Beach County. Prime Time was established in 2000 by a group of local stakeholders dedicated to improving out-of-school time (OST) programs. The strategies and services adopted to fulfill this objective were guided by research and best practices in positive youth development. Specifically, tools and frameworks created by the HighScope Educational Research Foundation (Smith & Hohman, 2005) and the David P. Weikart Center for Youth Program Quality (Smith et al., 2012) provided a template for building effective services. In partnership with the community, Prime Time drafted specific standards for program quality, which formed the basis for program quality assessments over the next decade.

In 2007, the Palm Beach County Quality Improvement System (QIS) was launched. The QIS represents an integrated system of services, including quality coaching, training, career counseling, networking, and expanded learning opportunities (ELOs) for children and youth offered by organizations throughout the county.

Community partnerships continue to be a critical part of Prime Time’s work. The only nonprofit, independent OST intermediary in Florida, Prime Time is one of 20 partner organizations that form a national coalition of similar intermediaries, Every Hour Counts. Leaders at Prime Time have also been integral to the work of Birth to 22: United for Brighter Futures, a county-wide initiative that “supports the healthy growth, development and education of our children and youth prenatally through young adulthood, so that they can graduate from high school and succeed in life” (http://pbcbirthto22.com/).

In 2007, 63 OST programs joined the QIS. One decade later, in 2018, participation rose to 149 programs. In addition, Prime Time provides services to more than 200 OST programs each year, out of approximately 300 eligible programs in Palm Beach County. Programs in the QIS receive a more comprehensive set of services centered around quality coaching, which is unavailable to those outside of the QIS. As of the 2018-2019 QIS cycle, 74 programs were school-based (directly linked to an elementary or middle school), 37 were community-based, and 11 were municipality programs. Most programs in the QIS serve elementary school-age children, and 16 primarily serve youth in middle school.

The volume and variety of programs have led to expansions in services since 2007. Prime Time began with just three staff members. Ten years later, more than 35 staff make Prime Time’s mission possible.


Prime Time’s Approach

Five elements of Prime Time’s approach have helped to make the organization successful (see Figure 1). These elements include a suite of services that work together synergistically to support all aspects of program quality, the use of data to drive improvement, relationship-building with programs, the creation of a low-stakes environment for motivating change, and ongoing connections with community partners.

Prime Time’s comprehensive services address program quality at the system, site, and staff levels. For example, whereas front-line staff learn instructional practices to implement with youth, program managers learn how to best support their staff, and networking events help to build bridges between programs that create a web of community support and information-sharing. Tiered levels of support ensure that programs newer to the system or lower in quality receive more intensive coaching supports, whereas programs firmly rooted in high quality learn to maintain that level through self-assessment and self-driven improvement planning.

The foundation for Prime Time’s work is a web of positive relationships between Prime Time staff and programs. Without the trust and rapport afforded by these relationships, programs would be less likely to communicate their needs and challenges to coaches, trainers, or colleagues in the field. It is this open and honest communication that allows programs to identify areas for improvement and utilize available resources and knowledge to make those improvements.

Adding to this foundation is a reliance on rich data for driving improvement. Observations by external assessors yield in-depth information about the strengths and limitations of each program. This information helps to show the way to higher quality, but it is through trusting relationships with Prime Time staff that this data is presented and integrated into the improvement process.

Community partnerships in Palm Beach County have helped put OST program quality into a broader context that addresses the success and well-being of children from dawn to dusk and from birth to adulthood. For example, through Prime Time’s work in the Birth to 22 initiative, and in partnership with the School District of Palm Beach County, efforts to improve the experiences of children in afterschool programs are aligned with corresponding shifts in the school day and connected with other community interventions.

Figure 1. Key elements of Prime Time’s approach: comprehensive interconnected supports, data-driven improvement, tiered levels of support, low-stakes incentives and trust, and long-lasting community partnerships.


Pathways to Outcomes

The path from Prime Time’s services to positive outcomes begins with the motivation and engagement of program staff and succeeds only when these services are attended and implemented. The central outcome that is integral to Prime Time’s mission is program quality improvement. Social, emotional, and academic benefits for children and youth are indirect outcomes of Prime Time’s work, expected to occur as a result of higher quality. Figure 2 illustrates the causal pathways from services to outcomes in a simple logic model. To reach higher quality, the logic model shows that the following intermediate goals are essential:

• Engage programs through relationship- and trust-building between Prime Time staff (particularly, quality advisors) and program staff.
• Professionalize the field by building a skilled OST workforce through trainings, career advising, and professional learning communities.
• Help programs retain skilled practitioners through financial incentives and supports that help to improve workplace culture.

In this report, we examine trends in quality improvement as an outcome of years of exposure to core services (e.g., quality coaching, improvement planning, and self-assessment) and to the spectrum of Prime Time’s services (i.e., trainings, career advising, and networking opportunities). Subsequent sections will describe Prime Time’s progress in achieving the three goals above and explore how they have led to improved program quality.

Figure 2. Logic model showing the connection between Prime Time’s services and outcomes.


SUMMARY OF FINDINGS

Prime Time collects extensive data on program quality each year. Since 2007, external assessors have measured the quality of every OST program participating in the Palm Beach County Quality Improvement System (QIS). This has enabled Prime Time to chart the growth of quality for each program across more than a decade.

In recent years, some OST programs have participated in assessing the social and emotional learning (SEL) skills of children and youth enrolled in their program. Results of these assessments yielded mixed findings but highlighted the importance of staff development in noticing and supporting the growth of social and emotional skills in youth at their program.

• OST programs receiving Prime Time’s services improved from year to year. The number of years in the QIS (i.e., years of exposure to services) significantly predicts program quality, with more years associated with higher quality.
• Quality increased most in the first year after programs joined the QIS. After the first year, programs tended to maintain their higher levels of quality. Programs that were fully engaged (participating on time in core components of the QIS) improved more in the first year compared to programs that were not fully engaged.
• Prime Time’s range of integrated services (e.g., trainings, networking events) is key to meaningful improvement for OST programs. Almost all OST programs (91%) that fully utilized Prime Time’s services improved in quality, and more than half improved dramatically (e.g., from moderate to exemplary quality). In contrast, among programs that participated only in core services (quality coaching), 29% improved, and only 14% improved dramatically.
• Prime Time’s trainings and networking opportunities are especially critical for programs that start out with low quality. Between 2013 and 2018, among programs starting out with low quality, 50% improved with moderate participation in Prime Time’s services. However, of those that did not make use of Prime Time’s services, none improved.
• Staff turnover was high at most QIS programs. On average, 32% of staff either started or ended their jobs each year. Of these staff, about half started or left in the middle of the school year (between October and April).
• OST practitioners enter the field out of a desire to work with children. However, they are likely to leave the field as a result of financial concerns. Low wages and limited hours contributed to high turnover rates.
• Wage incentives are associated with staff longevity. The majority of practitioners who received wage incentives continued working at the same program in the following year.
• SEL skills improved more at higher-quality programs in 2015-2016. Specifically, decision-making skills, personal responsibility, social awareness, and self-management were significantly associated with quality improvement. In 2017-2018, goal-striving mastery improved more for children in programs that offered more opportunities for planning, reflecting, and making choices. Children who began the year with strong SEL skills were more likely to maintain their high level of skills if they attended a high-quality program compared to children in lower-quality programs.


RAISING PROGRAM QUALITY

Measuring Quality

The quality improvement cycle begins with rigorous measurement that provides an in-depth view of a program’s physical, social, and psychological environment. This view reveals areas for growth, which form the basis of the feedback that programs receive from their quality advisor. In line with Prime Time’s low-stakes accountability approach, the quality-assessment report is provided to programs in a coaching context by quality advisors who have built a foundation of trust and rapport with directors and staff.

Prime Time measures program quality using the Palm Beach County Program Quality Assessment (PBC-PQA). This validated tool consists of 106 items that form 30 scales in eight domains of quality. These domains correspond to the Palm Beach County Quality Standards for Afterschool. External assessors, contracted by Prime Time and trained to score programs with high reliability, visit each program each year and conduct three separate, hour-long observations using the PBC-PQA. This report focuses on Form A of the PBC-PQA. Form A comprises four domains reflecting the observed instructional practices of front-line staff, whereas Form B, completed through interviews and documentation with the program director, primarily measures how the program is organized and managed.

Each year, OST programs in the QIS receive comprehensive feedback to support their efforts in creating a safe, supportive, and engaging environment for children and youth. Specifically, programs receive a detailed quality-assessment report that informs their annual improvement plan.

• Programs in Palm Beach County excel at providing a safe and supportive environment. The most common area identified for improvement is youth engagement. Specifically, programs must strive to provide children and youth with more frequent opportunities to plan, reflect, and make choices.
• Most improvement occurs in the first year after joining the QIS. After this initial growth period, programs tend to maintain their elevated scores.
• Programs that begin with lower quality show greater improvement over time.


The Typical Program Environment Today

In a typical OST program in the Palm Beach County QIS, children and youth experience social support, respect, and encouragement. However, opportunities for collaboration, planning, and decision-making are less common. Most improvement in quality occurs when programs increase the number of opportunities for youth to experience teamwork, mentorship, planning, reflection, and choice.

Across all programs working with Prime Time, children and youth experienced physical and psychological safety, which includes respect and inclusion regardless of religion, ethnicity, gender, class, sexual orientation, ability, or appearance. This is the foundation for building higher quality and reflects the importance of equity in a diverse county. Programs that cannot provide a safe environment may fail to meet licensing standards and do not qualify to participate in the QIS. For this reason, average scores for Domain I on the PBC-PQA (“Safe Environment”) were near the maximum of 5.0 each year.

Most programs also provided a supportive environment in which youth are welcomed, encouraged, and supported as they build new skills. These practices were measured by PBC-PQA Domain II (“Supportive Environment”), and scores for this domain were also high for most programs. In 2017-2018, the lowest score for this domain was 4.6 out of 5.0. A supportive environment meets basic child and youth social and emotional needs for warmth and connection.

The Threshold for High Quality

Scores on each item of the PBC-PQA range from 1 to 5. In the first domain of quality, “Safe Environment,” programs are given a 1 or 5 for each item. In the other domains, programs are given a 1, 3, or 5 for each item. In general, these scores indicate how frequently practitioners implement a high-quality instructional practice (see Figure 3).

Prime Time considers scores of 3.4 or above on Form A as indicators of satisfactory quality. Programs that achieve an overall average score of 4.1 or above have met a high standard of quality and, after two consecutive years with a 4.1 or above, may move to the “maintenance” level. Programs begin their journey in the QIS at entry level. After one year in the system, a program moves from entry level to intermediate level, shifting the coaching and supports from all staff to program directors, in order to allow them to develop maintenance-level skills.

Instructional Total Scores

Because the first domain of the tool, “Safe Environment,” typically scores very high for all programs, another measure used to capture overall quality is the Instructional Total Score (ITS). The ITS is the average of Domains II, III, and IV. When the overall Form A score is 4.1, which is the standard threshold for high quality in Palm Beach County, the ITS is typically 3.8 to 3.85.

Figure 3. PBC-PQA Form A item response scale and high-quality thresholds.
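To make the scoring arithmetic concrete, the short sketch below shows how an overall Form A score and an Instructional Total Score relate to the thresholds described above. It is an illustration only: the function names, constant names, and example domain scores are ours, not part of the PBC-PQA.

```python
# Minimal sketch of how Form A summary scores relate to the thresholds described above.
# Function names and the example scores are illustrative, not part of the PBC-PQA tool.

HIGH_QUALITY_THRESHOLD = 4.1     # overall Form A score for "high quality"
SATISFACTORY_THRESHOLD = 3.4     # overall Form A score for "satisfactory quality"

def overall_form_a(domain_i, domain_ii, domain_iii, domain_iv):
    """Overall Form A score: the average of all four domain scores."""
    return (domain_i + domain_ii + domain_iii + domain_iv) / 4

def instructional_total_score(domain_ii, domain_iii, domain_iv):
    """ITS: the average of Domains II-IV, excluding the near-ceiling Safe Environment domain."""
    return (domain_ii + domain_iii + domain_iv) / 3

# Hypothetical program sitting right at the high-quality threshold.
overall = overall_form_a(4.9, 4.5, 3.7, 3.3)        # 4.10
its = instructional_total_score(4.5, 3.7, 3.3)      # about 3.83
print(f"overall={overall:.2f}, ITS={its:.2f}, "
      f"satisfactory={overall >= SATISFACTORY_THRESHOLD}, high quality={overall >= HIGH_QUALITY_THRESHOLD}")
```

With these hypothetical scores, the overall Form A score is 4.10 and the ITS is roughly 3.83, consistent with the 3.8 to 3.85 range noted above.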


Where programs varied most was in providing ample opportunities for teamwork, facilitation and mentorship, partnerships with adults, planning, and decision-making. These practices were measured by PBC-PQA Domains III (“Interaction”) and IV (“Engagement”). A high-quality program in the QIS gives children and youth more opportunities to participate in cooperative group activities, lead or mentor their peers, work closely with adults on projects, plan, reflect, and make choices. These elements meet basic social and emotional needs for purpose, autonomy, and a sense of agency. A child able to collaborate with peers and contribute to the direction of activities experiences the environment as more predictable, stable, and empowering than a child without these opportunities.

Not surprisingly, the social and emotional skills most influenced by high quality in QIS programs were those directly connected to the experiences of planning and making choices: decision-making skills, responsibility, initiative, and goal-setting.

Figure 4 displays the average overall PBC-PQA scores for all programs in 2017-2018. Averages for each of the four domains on the PBC-PQA are also displayed. Each year, programs generally received high scores on Domain I (“Safe Environment”). In contrast, Domain IV (“Engagement”) presented the most opportunities for improvement.

Among the four domains on the PBC-PQA, programs were most often encouraged to improve in Domains III and IV (“Interaction” and “Engagement”). Scales in these domains are often selected as an area of focus on the improvement plans that programs generate each year. In Figure 5, average scores for each scale in Domain III (“Interaction”) reveal which practices occur least often (where each PBC-PQA item is implicitly prefaced by: “Staff support to…”). Cooperative groups and mentorship (scales N and O) were common areas for improvement, whereas support for belonging and positive relationships were already quite prevalent.

Figure 4. Average domain scores (and overall score) on the PBC-PQA for 2017-2018.
Figure 5. Average scores for each scale in Domain III in 2017-2018.


Giving Youth Opportunities to Plan and Reflect

Planning and reflection are important for youth development. However, programs typically scored lowest on these items (see Figure 6). Because these scales are typically the most common areas for improvement, Prime Time offers a Youth Work Method (YWM) series workshop, “Planning and Reflection,” numerous times each year that specifically addresses this need. Programs staffed by practitioners who have attended this training scored higher on all scales of Domain IV. Having all staff attend Planning and Reflection at least once in their history added an average of 0.13 points to each Domain IV scale score compared to programs without this training1 (see Figure 7).

Percent of Programs Achieving High Quality

More than half of the programs currently in the QIS surpass the threshold for high quality, which Prime Time defines as an overall Form A score of 4.1 or above (or an ITS of approximately 3.85). In 2017-2018, 147 OST programs participated in the QIS. Nearly all programs achieved satisfactory quality, and more than half succeeded in creating a high-quality environment for youth. Specifically, 144 programs (98%) received a score at or above 3.4, while 82 (56%) received a score at or above 4.1.

Of the 147 programs in the QIS in 2017-2018, 142 were assessed in the previous year, and 81 (or 57%) improved their overall score. Expectations for improvement primarily target programs scoring below a 4.1 overall. In 2017-2018, 101 programs (71%) either maintained a score at or above 4.1 or improved their score.

In each QIS cycle, as programs mature in the system (and a few programs join), a larger portion achieve and maintain high quality. Figure 8 shows the percent of programs achieving or maintaining a score at or above 4.1 on Form A for every QIS cycle since 2013. This suggests that the QIS itself improves as programs develop stronger relationships with Prime Time and participate in services that have grown in frequency and depth.

1 Results were obtained using growth curve analysis, with year in QIS (exposure year), PBC-PQA tool version, and average participation level of all staff in Planning and Reflection entered as predictors. All predictors were statistically significant, with training at p = 0.009 (β = 0.13). Participation in Planning and Reflection is also a proxy indicator for participation in other YWM series workshops. However, Planning and Reflection has been attended considerably more often than any other training and is specifically designed to impart instructional practices measured by Domain IV.

Figure 6. Average scores for each scale in Domain IV in 2017-2018.
Figure 7. Average scores on Domain IV comparing high and low-to-moderate participation in Planning and Reflection for programs between baseline and year three, assessed using the new or older version of the PBC-PQA. Note: High participation is indicated by all program staff having attended the training at least once at any time in the past.
Figure 8. Percent of programs achieving an overall score at or above 4.1 each year and percent maintaining that score.


Trends in Quality Improvement

The most noticeable trend in program quality over time is the increase that occurs soon after joining the QIS, followed by the stability of those higher scores for many years (see Figure 9). Programs made the greatest strides in quality during their first year working with Prime Time, particularly in the area of youth engagement. After one or two years, programs typically maintained their higher scores.

Analysis comparing growth during the first year of services with growth in subsequent years supports the conclusion that, after an initial burst of improvement in the first year, programs generally show slow and steady growth and maintain their scores.2 (See the appendix for visual illustrations of trends in growth.)

These trends raise the following question: What distinguishes programs that continue to improve after the first year from those whose growth slows? This question is addressed in subsequent sections that examine the role of Prime Time’s services and the retention of skilled staff in fueling continued improvement.

2 Growth was examined using growth curve analysis (revealing both linear and quadratic trends as significant predictors) and piecewise growth curve modeling with the knot (change point) set after year one and a random intercept term for programs. Piecewise models revealed that significant improvement occurs in the first year (β = 0.47, p < .0001) and then, after the first year, additional exposure to the QIS does not result in significant change in either direction (β = 0.04, p = 0.38).

Figure 9. Average domain scores on the PBC-PQA at baseline (red), after one year, after two years, and after three or more years (green).
Figure 10. Quality improvement in Domains III and IV after six to 10 years of services from Prime Time (with baseline score adjustments reflecting revisions to the tool).
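For readers interested in reproducing this kind of analysis, the following is a minimal sketch of a piecewise growth model of the sort described in footnote 2, written in Python with statsmodels. The data file, column names, and model details are assumptions made for illustration; the study’s actual models may differ.

```python
# A minimal sketch of a piecewise growth model with a knot after year one and a random
# intercept per program. Column names ('program_id', 'year_in_qis', 'form_a') and the
# input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pqa_scores_long.csv")   # one row per program per QIS year (hypothetical)

# Split exposure into two slopes: growth during the first year vs. growth afterward.
df["slope_year1"] = df["year_in_qis"].clip(upper=1)          # rises 0 -> 1, then flat
df["slope_later"] = (df["year_in_qis"] - 1).clip(lower=0)    # 0 until year one, then +1 per year

# Random intercept for each program; fixed effects for the two growth pieces.
model = smf.mixedlm("form_a ~ slope_year1 + slope_later", data=df, groups=df["program_id"])
result = model.fit()
print(result.summary())
```

In this coding, the coefficient on the first slope captures the initial burst of improvement, while the coefficient on the second slope captures any change after year one.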


Quality After Ten Years with Prime Time

Since Prime Time began, 190 OST programs have participated in the QIS. New programs join each year, and other programs leave the system for a year or more. More than 100 programs have participated for at least five years, and 44 have been with Prime Time for a decade. What does quality look like after a decade of support?

In the areas of quality in which most programs need to improve, “Interaction” and “Engagement,” instructional practices improved considerably after six to 10 years with Prime Time. Programs used best practices more often, and with more children and youth, compared to their first year. Figure 10 shows the change between baseline measures of quality (blue) and measures after six to 10 years (green). Because the tool was revised before these programs reached their sixth year, baseline scores were adjusted to reflect scores they may have received using the current version of the tool.3

To see overall improvements, as measured on the same version of the PBC-PQA, Figure 11 shows changes for programs between baseline and up to five years, grouping programs based on which version of the PBC-PQA was used for both their baseline and most recent assessment (specifically, their most recent assessment on the same version of the tool). The average scores for each domain on the PBC-PQA (as well as the overall score), given the number of years programs have been in the QIS, are shown in Table A1, in the appendix.

Since Prime Time began measuring program quality, the assessment tool has been modified. In 2012, revisions to the tool led to a more challenging assessment, and scores dipped the following year, reflecting the increased difficulty. See “Changes in the Measurement Tool” in the appendix for more information.

3 The new version of the PBC-PQA reduced Domain III scores by 0.18 points, on average, and Domain IV scores by 0.32 points, on average. Baseline scores in Figure 10 were adjusted by -0.18 points for both domains to provide an estimate of actual change.

Figure 11. Change in Domain III and IV scores from baseline to most recent year for programs assessed on the older or new versions of the PBC-PQA.


CONNECTING SERVICES TO QUALITY

Prime Time’s services are key to quality improvement, particularly when employed together as a complete intervention. Services offered by different departments are designed to function in synergy, resulting in greater benefits as a whole than each service could achieve alone. Reflecting on the connection between services, Johnsonley Saint Pierre, assistant director at For the Children Teen Center, said, “Prime Time has helped me with everything. No matter what I need, each department I go to has someone who I can talk to. Someone I know by first name and knows me.”

After joining the QIS, programs vary in the extent to which they participate in QIS workshops, professional development trainings, and networking events. For programs that joined the QIS after 2012 (when the PBC-PQA tool was revised), participation in services during the first three years was strong for 22% of programs (representing high fidelity of implementation of the QIS) and moderate for 64% of programs (see Figure 12).

The combined effect of services was examined using a pattern-centered approach by QTurn, a research consulting group led by Charles Smith (formerly CEO of the Weikart Center). This method involved identifying groups of programs similar in instructional quality (i.e., creating “instructional quality profiles”) and, in a separate analysis, grouping programs based on participation in services.4

4 Specifically, QTurn used Ward’s method cluster analysis (on squared Euclidean distances) followed by k-means cluster relocation analysis to create two separate sets of program profiles: the first set of profiles was based on PBC-PQA scores (Domains II, III, and IV), and the second set of profiles, for all of the same programs, was based on patterns of participation in Prime Time’s services (where participation rates were standardized using quasi-absolute scaling).

Figure 12. Participation in Prime Time’s QIS workshops, professional development trainings, and networking events.
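As a rough illustration of the two-stage clustering described in footnote 4, the sketch below pairs Ward’s hierarchical clustering with a k-means relocation step. The input file, column names, and number of profiles are assumptions; QTurn’s actual procedure, including the quasi-absolute scaling of participation rates, is not reproduced here.

```python
# Illustrative two-stage profiling: Ward's hierarchical clustering to seed initial
# profiles, followed by k-means relocation. File, column names, and k=3 are assumptions.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

quality = pd.read_csv("pqa_domains.csv")[["domain_ii", "domain_iii", "domain_iv"]]  # hypothetical

# Stage 1: Ward's method (scipy's 'ward' linkage operates on Euclidean distances).
tree = linkage(quality, method="ward")
initial_labels = fcluster(tree, t=3, criterion="maxclust")

# Stage 2: k-means relocation, seeded with the centroids of the Ward clusters.
centroids = quality.groupby(initial_labels).mean().to_numpy()
kmeans = KMeans(n_clusters=3, init=centroids, n_init=1).fit(quality)
quality["profile"] = kmeans.labels_
print(quality["profile"].value_counts())
```

The same procedure could then be repeated on service-participation variables to build the second set of profiles, after which programs can be cross-classified by quality profile and participation profile.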


By examining how programs move from one instructional quality profile group to another over time (e.g., from a low-quality group into a high-quality group), it is possible to see an impact of services that other statistical methods may not detect (Smith et al., 2019; Smith et al., 2016).

Although programs joined in different calendar years, exposure to the QIS (and the spectrum of Prime Time’s services) was measured in terms of years in the QIS (from baseline to year three). Participation in four main services was included:

1. Workshops delivered by the Quality Improvement team (and quality advisors) addressing program-level strategies for achieving high quality (e.g., self-assessment).
2. Youth Work Method workshop series.
3. Professional development trainings other than the YWM series, such as Progressive Afterschool Practitioner.
4. Networking events.

Results of the analysis are shown in Figure 13. Almost all programs with strong participation in Prime Time’s services in their first three years (indicating full implementation of the QIS intervention) moved to a higher instructional quality profile group, and 63% improved dramatically, moving up two levels in instructional quality. In contrast, for programs that participated weakly or not at all, less than a third moved to a higher instructional quality group.

Among programs starting off with low instructional quality, strong participation in Prime Time’s services moved 33% to a higher instructional quality group. However, for low-quality programs with low participation in Prime Time’s services, none improved. Strong participation in Prime Time’s full spectrum of services, which includes attending a variety of trainings and networking events, is important for meaningful improvement in instructional quality.

Figure 13. Percent of programs that shifted to a higher quality group based on levels of service use (across the four main services examined).


Recommended Levels of Participation

“Weak” participation in Prime Time’s services is the point below which no benefits are expected. Strong participation is the minimum level at which participation matches the design of the QIS and positive outcomes for programs are likely. In the first three years after joining the QIS, strong participation corresponds to the following minimum recommendations for front-line staff (see Figure 14):

• Four Youth Work Method series workshops.
• Two trainings developed by Prime Time’s Professional Development team (in particular, Progressive Afterschool Practitioner).
• One to two Quality Improvement (QI) workshops (such as Planning with Data).
• One to two major networking events (such as the Annual Symposium).

At the same time, higher quality requires strong participation by program leaders (i.e., directors, assistant directors, program managers), particularly in trainings that support the growth of instructional practices. In the first three years, it is recommended that program leaders attend four Youth Work Method series workshops alongside front-line staff (see Figure 14). Prior research in other communities has shown that the participation of leadership in these trainings contributes to seeing benefits in the program (Smith et al., 2012). When leaders are included, front-line staff are further supported in implementing what they’ve learned. In order to move their program forward, in addition to the YWM series, program leaders should also attend:

• One to two professional development trainings (e.g., Progressive Afterschool Director).
• Three to four quality improvement workshops.
• Eight networking events.

The importance of QI workshops and networking events for program leaders reflects the power of these services to address overarching, program-level needs and connect leaders across programs to form a professional learning community. In Figure 14, for example, strong participation of front-line staff corresponds to four or more Youth Work Method trainings for each staff member (on average) during the three-year period. Weak participation corresponds to less than one QI workshop, on average, for all staff during the three years.

To put these recommendations in context, many practitioners have attended dozens of Prime Time trainings and workshops during their careers as OST practitioners, and a few who’ve been with Prime Time since the beginning have attended more than 100 trainings and events. Practitioners often return to the same trainings one or two years later to refresh their skills, and networking events have become opportunities for some to form long-lasting relationships with colleagues from other programs.

Figure 14. Weak vs. strong levels of participation in Prime Time’s services (shown as average number of events attended per staff member) for front-line staff and program leaders during the first three years in the QIS.


PRACTITIONER SPOTLIGHT

Monica Herring, afterschool practitioner at Achievement Centers for Children and Families Village Academy, shared her experiences with Prime Time in a 2019 interview. After more than 80 professional development trainings, 30 Prime Time events, one 40-Hour School-Age Certification, and one Director Credential, Monica Herring continues to commit herself to connecting with children at her program. She began working as an afterschool practitioner in the OST field before quality afterschool was a priority.

“When I first began taking courses, they would be geared toward early learning, and not necessarily toward afterschool. Prime Time came along and began educating, equipping, and empowering those who work in youth development. The trainings they offered gave us more structure, gave us more accountability and more of an opportunity to enhance our learning,” said Monica.

Prime Time trainings have really made an impact on Monica and how she gives the children in her program opportunities to make decisions on their own: “Making sure that the children have a voice and a choice… that they get an opportunity to see their dreams, their goals and then I create an environment where they don’t feel limited on how they can express themselves.”

“The work changed from a babysitting service to now afterschool has a role.”


ENGAGING PROGRAMS

Participation in Prime Time’s services is voluntary. Whether or not programs participate in trainings, career consultations, or networking events depends on the motivation and personal investment of program leaders and front-line staff. For this reason, engaging programs is essential for the success of Prime Time’s mission.

When programs join the QIS, quality advisors begin by building trust and a sense of rapport with program leaders. Programs learn in their first year that Prime Time’s approach is characterized by a focus on collegial support rather than compliance. As program leaders shift from the perception of being audited for compliance to an understanding of how Prime Time offers support, true improvement can begin.

In addition to setting the tone for honest improvement, this relationship influences whether programs attend professional development trainings, workshops, and networking events. Toward that end, quality advisors often act as a liaison between programs and other departments within Prime Time (e.g., Professional Development and Community Partnerships), which strengthens the combined effect of Prime Time’s services on quality.

One powerful, though unwritten, role of quality advisors is to provide a listening ear when practitioners are coping with various challenges, either professional or personal. The moral support of a coach or colleague may make it easier for program leaders to be fully present in the workplace, in turn giving their staff the support they need to be present for children and youth.

PRACTITIONER SPOTLIGHT

When asked about experiences with Prime Time, Kendra Williams (pictured right), director of the Milagro Center, wrote, “One of the memories that comes to mind is when I met with Patrick Freeland (who was then a quality advisor and is now Prime Time’s quality improvement manager).... I remember how frustrated I was at the time with balancing professional relationships with staff and their responsibilities. Patrick was understanding and said things that renewed my sense of empowerment.”


Program engagement has been logged by quality advisors each quarter since 2013 (see Figure 15). Each year, more than 70% of programs were considered “fully engaged” for the full year. Given the intentional efforts to build positive relationships with programs, it is not surprising that the vast majority of programs in the QIS were fully engaged in Prime Time’s core services every quarter. The most common reasons programs were not marked “fully engaged” were incomplete self-assessments, progress checks, or improvement plans. Although it occurred rarely, programs were considered disengaged when quality advisors had difficulty reaching program staff, leadership had left the program, or the program had closed.

Program engagement is significantly associated with the rate of quality improvement in the first year. Fully engaged programs improved more in the first year compared to programs that were not fully engaged throughout the year. Program engagement contributes to a 0.5-point increase in Instructional Total Scores (ITS) in the first year compared to non-engaged programs.5

Participation in workshops offered by the Quality Improvement team, such as Self-Assessment I and II, which contribute to being “fully engaged,” is also associated with higher quality after the first year. Figure 16 shows that program quality is higher after two years in the QIS when participation in Quality Improvement workshops is strong.

PRACTITIONER SPOTLIGHT

A program director, who asked to remain anonymous, shared that he was hesitant and nervous when he first began working with Prime Time. His quality advisor was persistent, however, and explained how Prime Time’s services could help. He received his associate’s degree with the support of Prime Time scholarships. He credits his quality advisor with helping him to stay on the right track and become a director, adding that he learned how to be a role model for his staff and encourage them to utilize Prime Time’s resources. “Without Prime Time, the quality of my afterschool program would not be the same,” he said. “Prime Time helps create the huge and needed difference between babysitting and an afterschool learning environment for kids.”

5 In a piecewise growth model, program engagement significantly predicts ITS in the first year (β = 0.55, p = .003).

Figure 15. Program engagement by QIS cycle.
Figure 16. Changes in average Instructional Total Scores for programs at varying levels of participation in QIS workshops between joining the QIS (year “0”) and the second year.


PROFESSIONALIZING THE FIELD

A large part of Prime Time’s work is centered on professionalizing the field: building a skilled, stable workforce in which critical skills and talents are recognized as integral to the work. As programs build positive relationships with quality coaches and other Prime Time staff, they receive recommendations for trainings, workshops, and events designed to improve instructional practices in areas identified for improvement.

OST practitioners in Palm Beach County consist primarily of young women completing college coursework. Most stay in the field for fewer than five years (with front-line staff staying fewer years), and programs typically begin each school year with many staff who are new to the field. Because the workforce is in constant change, with almost a third of staff entering or leaving their programs each year, giving practitioners the skills and confidence to use best practices in their work with youth is vital to raising quality. To accomplish this task, Prime Time offers professional development opportunities throughout each year.

Data-Driven Services

To effectively professionalize the field, the design and availability of professional development services are informed by local needs and in-depth knowledge of the workforce. Since 2014, Prime Time has collected information about the local workforce through the Out-of-School Time (OST) Registry, which captures a wealth of information about the experience, training, and employment history of OST staff throughout the county. The data have shown that:

• Half of all practitioners were under the age of 27, and the vast majority were women (82%).
• Almost two-thirds of practitioners had an associate’s degree, whereas one-third had a bachelor’s degree.
• The typical afterschool counselor had been employed at their program for 2.1 to 3.2 years, whereas the average director had been at the same program for six to 9.1 years.6

The Registry also supports the career advancement of practitioners. Practitioners are able to use the Registry to organize their resumes and transcripts, which they may choose to share with future employers. At the same time, it allows Prime Time to explore the positive outcomes of professional development for programs and children. As of spring 2019, more than 2,000 individuals had joined the OST Registry.7

6 The range is based on the average length of employment for previous vs. current positions. Because end dates were underreported, the estimated average length of employment for current positions (3.2 and 9.1 years, respectively) may be an overestimate.

7 Not all Registry members currently work in programs that participate in the Quality Improvement System (QIS), and some Registry members are no longer active. This report summarizes information about all Registry members, both active and inactive. Of all active contacts in Prime Time’s database, 61% are Registry members.


Most practitioners in the Registry were counselors or other front-line staff (see Figure 17). There were 118 current directors or executive directors in the Registry who worked in QIS programs. Of all current QIS program directors, 81% were in the Registry as of spring 2019.

Half of all practitioners in the Registry were under the age of 27.8 The youngest practitioner in the Registry was 16, and the oldest was 88. Many practitioners were in their early to mid-20s (see Figure A8, in the appendix). The average age of practitioners also differed depending on their position or job title. The average counselor was 29, whereas the average director was 43 (see Table A2, in the appendix).

Most OST practitioners were young women. Of those in the OST Registry, 82% were women. The percent of female practitioners, however, varied somewhat by position. (See Figure A9, in the appendix, for the percent of men and women for each job title.)

Most practitioners were African American, Caucasian, or Hispanic. Figure 18 shows the percent for each racial/ethnic classification. Of note, 9% of practitioners were classified as “Other,” “Multi-Racial/Ethnic,” or unknown.

8 This is the median. The mean age is 31.

Figure 17. Job titles of individuals in the OST Registry.
Figure 18. Race/ethnicity of practitioners.


Pursuing Their Education

Many practitioners were working toward an associate’s or bachelor’s degree while working part-time in the OST field. The vast majority of practitioners (97%) had a high school diploma (see Figure 19).9 Figure 19 provides an overview of the current education levels of OST practitioners. In addition, many practitioners had obtained a college degree. Specifically, 61% of practitioners had an associate’s degree, and 20% had a credential of some kind. Table A3, in the appendix, shows the number and percentages of practitioners with degrees or certificates.10

Practitioners also earned certificates and credentials designed to improve instructional practices. Among practitioners who had submitted this information in the Registry, one in five had earned the 40-hour Afterschool Certification, and one in 10 had earned their School-Age Professional Certificate (SAPC). Table A4, in the appendix, shows the percent of practitioners who have completed the most common certificates and credentials.

Trainings and Events

Over the past 10 years, participation in professional development trainings has been consistently high. Figure 20 illustrates how attendance has grown from year to year, particularly for the most popular and needed trainings. (A list of the most well-attended trainings or events is shown in Table A5, in the appendix.)

9 Of the 2,008 practitioners in the Registry, 1,412 (or 70%) have provided education records. Importantly, practitioners without a high school diploma may be less likely to provide an education record.

10 Percentages for each category do not total 100%, as some practitioners have more than one degree or, for example, both a degree and a certificate.

Figure 19. Percent of practitioners with a diploma, certificate, credential, or degree.
Figure 20. Training attendance by year.


Professional development trainings significantly contributed to quality improvement, although the effect of training was stronger where staff turnover was low. Cumulative staff experience in the Youth Work Method series trainings, or in all professional development trainings, significantly predicted program quality between the first and second year.11 Importantly, programs with high levels of staff training crossed the quality threshold for maintenance-level status sooner than programs with below-average levels. By achieving maintenance status, programs take a more active role in assessing and managing their own improvement process, and Prime Time’s quality advisors are able to direct more attention to programs that have yet to improve.

As shown in Figure 21, the impact of trainings becomes visible by the second year (between the second and third assessments). In the first year, the cumulative training of program staff may be too low to see an impact. In addition, many programs were assessed in the fall, before staff had attended trainings.

Further supporting the power of trainings to advance programs to the next level, programs that achieved maintenance status were distinguished by their highly trained staff. Average participation in trainings (measured cumulatively across all preceding years) was substantially higher at maintenance-level programs compared to intermediate- or entry-level programs, regardless of how long the program had been in the QIS. Figure 22 shows the average training of staff at each QIS level (entry, intermediate, and maintenance) and each dosage level (years two through 10).

11 Based on a regression analysis, with year two ITS as the dependent variable and year one ITS as a covariate, cumulative participation in all trainings (or YWM series workshops) was associated with higher scores in year two (β = 0.27, p = .009).

Figure 21. Changes in Form A scores on the PBC-PQA by average staff training levels.
Figure 22. Average professional development training experience at entry-, intermediate-, and maintenance-level programs, split by year in QIS (from 2 to 10).
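Footnote 11 describes a regression of year-two ITS on cumulative staff training, controlling for year-one ITS. A minimal sketch of that kind of model is shown below; the file name and column names are hypothetical, and the actual analysis may have included additional covariates.

```python
# A minimal sketch of a lagged-covariate regression like the one in footnote 11:
# year-two ITS regressed on cumulative training, controlling for year-one ITS.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("program_year_summary.csv")   # one row per program (hypothetical)

model = smf.ols("its_year2 ~ its_year1 + cumulative_training", data=df).fit()
print(model.params["cumulative_training"])     # effect of training, holding year-one ITS constant
```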


RETAINING SKILLED PRACTITIONERS

The enrichment, care, and support that children receive in OST programs come primarily from part-time staff who receive low wages, work a second job, and leave the field within a couple of years, often for financial reasons. Retaining skilled practitioners is critical for achieving both program- and youth-level outcomes.

Each year, approximately 25% of all staff at QIS sites were new to their program,12 and at least 14% left the program that year.13 Some turnover corresponded naturally with the beginning or end of the school year. However, 15% of staff started or left in the middle of the school year, between October and April.

How Long Staff Stay at Their Programs

The average length of employment varied by position. Figure 23 shows the average number of years practitioners had been employed at their program, based on their current job title.14 Longevity in the field is associated with changes in position. Program directors generally remained employed at the same program far longer than front-line staff. In contrast, afterschool counselors stayed at their program for about two years. If they moved on to become certified counselors, they stayed an additional year or two. In the field, staff retention may be tied to opportunities for advancement.

Turnover and Quality Improvement

High turnover poses a challenge to raising program quality. Staff turnover influenced the rate at which programs improved.15

12 This is the average percent across four years (2013-2017) of new hires each year, according to data for staff participating in the OST Registry.

13 This percent is likely to be much higher, as job end dates are underreported, and practitioners who work at a program for less than a year may not join the Registry to begin with.

14 Duration of employment was calculated based on the practitioner’s start date at their current program, although they may have begun their employment with a different job title. For example, a director employed at a program for eight years may have worked as an assistant director for four of those years. Current positions reflect those records for which no end date has been provided to Prime Time. Because end dates are underreported, averages for previous positions may provide a better indication of how long practitioners tend to stay at the same program.

15 The results were based on a linear growth model, with percent of staff turnover as the key predictor variable (β = 0.28, p = 0.047), controlling for year in the QIS.


The vast majority of programs improved the most in their first year. After the first year, improvement slowed. However, programs with low staff turnover showed continued growth compared to programs with high turnover. Turnover is defined here as the percent of staff starting or ending their job in the given QIS cycle. High turnover is defined as 25% or more of staff starting or ending their job in the QIS cycle. Table 1 shows changes in quality from year one to year two.16

Turnover does not necessarily have a negative impact in all cases. In the first year, when programs joined the QIS, the addition of new staff was associated with greater improvement, whereas staff departures were associated with less improvement. Figure 24 shows that the most improvement between the baseline and first-year assessments occurred for programs with new staff starting but not leaving.

The Effects of Quality on Turnover

Although staff turnover may impede quality improvement, program quality may also influence whether or not staff stay at their program. In other words, the influence runs both ways. As programs retain staff and improve in quality, they create a more positive work environment where staff are more likely to stay.

Participation in services (fidelity of implementation of the QIS) was associated with reduced turnover across three years of participation in the QIS, according to pattern-centered analysis by QTurn (Smith, Peck, Lindeman, & Smith, 2019). Programs with low participation in Prime Time’s services and high turnover did not experience improvements in quality over the three years.

Table 1. Staff turnover and changes in Instructional Total Scores after the first year.

Turnover | Number of Programs | Average Year One ITS | Average Year Two ITS | Change
Low to Moderate | 86 | 3.76 | 3.86 | 0.10
High | 43 | 3.70 | 3.68 | -0.02

16 Years in which programs were assessed using different versions of the PBC-PQA are not included.

Figure 23. Average number of years practitioners stayed in their program.
Figure 24. Changes in Instructional Total Scores (ITS) between the baseline assessment and first year for programs with staff turnover in the first year (positions starting vs. ending).


Turnover and Training

Creating a skilled OST workforce requires that practitioners participate in various professional development (PD) opportunities and then stay in the field. Because many practitioners leave the field within a few years, Prime Time offers numerous training opportunities each year (see Figure 25). In the past decade, as more programs have joined the QIS, Prime Time has offered more trainings (with about three per week in 2017-2018). This high frequency of training opportunities may compensate for the impact of turnover across many years. However, turnover remains an obstacle.

The influence of PD trainings on quality was cumulative from one year to the next. This influence can be seen after the first year, when practitioners had attained enough training to significantly improve their instructional practices. However, staff turnover interrupted this process, and the benefits of training disappeared when staff left.

When programs retained skilled practitioners, the impact of PD training was seen after the first year of growth. Figure 26 shows that when staff turnover was low (under 25%), training was associated with significant improvement. In contrast, programs with high turnover (25% or above) were less likely to show the benefits of training.17 This means that frequent PD trainings each year cannot fully mitigate the impact of turnover in the field. Instead, frequent trainings may help programs with new staff retain their same level of quality. Table 2 shows the scores and sample sizes corresponding to Figure 26.18

Table 2. Average year one and year two ITS by staff turnover and professional development training.

Training | Turnover | Average Year One ITS | Average Year Two ITS | Change | Number of Programs
No | Low to Moderate | 3.71 | 3.60 | -0.11 | 20
No | High | 3.40 | 3.53 | 0.13 | 6
Yes | Low to Moderate | 3.77 | 3.94 | 0.17 | 66
Yes | High | 3.74 | 3.70 | -0.04 | 37

17 In a linear regression analysis, with change in ITS from year one to year two as the outcome variable and turnover, training, and the interaction between turnover and training as predictors, training is associated with positive change, or improvement in quality (β = 0.27, p = .009), but training combined with high turnover is associated with less change (β = -0.47, p = .025).

18 Of note, the six programs with high turnover and no training also began year one with the lowest average ITS, which yielded greater room for improvement.

Figure 25. Number of professional development trainings offered during each QIS cycle.
Figure 26. Changes in Instructional Total Scores from year one to year two by staff turnover and professional development training.
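The moderation effect in footnote 17, in which high turnover offsets the gains from training, can be expressed as a regression with an interaction term. The sketch below is illustrative only; the file name, column names, and the 0/1 coding of the turnover indicator are assumptions.

```python
# A minimal sketch of a moderation model like the one in footnote 17: change in ITS
# regressed on training, turnover, and their interaction. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("program_change_scores.csv")            # hypothetical input
df["its_change"] = df["its_year2"] - df["its_year1"]

# 'training * high_turnover' expands to both main effects plus their interaction term.
model = smf.ols("its_change ~ training * high_turnover", data=df).fit()
print(model.params)   # a negative interaction suggests turnover erodes the gains from training
```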


Reasons for Turnover

A 2018 Prime Time survey of OST program staff found that financial considerations influenced how much longer practitioners anticipated staying in the field. When asked what drives them to do their best each day, practitioners rated the desire to help children and youth as the strongest motivator and pay and incentives as the weakest. However, when asked what challenges them to stay motivated, pay and incentives were rated as the biggest challenge (see Figure 27). The more strongly respondents rated "rate of pay" and "lack of financial rewards for hard work" as challenges to staying motivated, the less likely they were to expect to stay four or more years at their program. When asked how much longer they anticipated staying at their program or in the field, those who ranked financial concerns as more challenging to their motivation than other concerns estimated staying fewer years than those who ranked financial concerns as less challenging.19

Among these respondents, more than two-thirds (71%) of front-line staff, 85% of directors, and 83% of assistant directors indicated that they need a higher income in order to meet their basic needs. The average wage for front-line staff in QIS programs was $12.76 per hour. In Palm Beach County, one adult with no children must earn $13.11/hour21 and work full time to meet basic needs for food and housing without needing public assistance. For a single mother (one adult with one child), the living wage is $27.33/hour.22

Financial Incentives

While frequent trainings do soften the negative impact of turnover, financial interventions can help programs retain skilled staff. Prime Time offers wage incentives, through ACHIEVE OST, and scholarships for college coursework in youth development. In combination with frequent trainings, these strategies help to create a skilled workforce that remains stable from year to year.

Figure 27. Challenges to staying motivated.20

19. Based on a stepwise regression analysis including all OST staff, with anticipated longevity in the field as the dependent variable and the following predictors: number of years in one's current program, PQA domain scores for 2016-2017, and all Motivation and Challenge subscales. R² = .15, p = .005.

20. These results were based on survey scales, each representing multiple items on the survey. Scales were developed using the expertise of quality advisors and principal component analysis.

21. See http://livingwage.mit.edu/counties/12099.

22. A living wage is the minimum income needed to meet basic subsistence needs. According to the Living Wage Calculator User's Guide (Nadeau, 2018), "the living wage is the minimum income standard that, if met, draws a very fine line between the financial independence of the working poor and the need to seek out public assistance or suffer consistent and severe housing and food insecurity" (p. 2).
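The gap between the average front-line wage and the county living wage can be made concrete with a quick calculation. The sketch below is a simple worked example using the hourly figures cited above; the 2,080-hour full-time year is a standard assumption, not a number from this report.

```python
# Figures cited above: average front-line wage in QIS programs and Palm Beach County living wages.
avg_frontline_wage = 12.76      # $/hour
living_wage_single = 13.11      # $/hour, one adult with no children
living_wage_parent = 27.33      # $/hour, one adult with one child

FULL_TIME_HOURS = 2080          # assumption: 40 hours/week x 52 weeks

def annual_shortfall(wage: float, living_wage: float, hours: int = FULL_TIME_HOURS) -> float:
    """Annual gap between earnings at `wage` and the living-wage benchmark."""
    return (living_wage - wage) * hours

print(f"Single adult shortfall:  ${annual_shortfall(avg_frontline_wage, living_wage_single):,.0f} per year")
print(f"Single parent shortfall: ${annual_shortfall(avg_frontline_wage, living_wage_parent):,.0f} per year")
# Single adult shortfall:  $728 per year
# Single parent shortfall: $30,306 per year
```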


ACHIEVE OST Incentive Award Program

Practitioners at QIS programs are eligible to receive ACHIEVE OST, a tiered incentive award, if they earn less than $17.50/hour. They must also remain employed at the same program for at least six months. Those who are eligible must work at least 15 hours per week with children and youth and complete a certain number of training hours or education levels from a regionally accredited college, as outlined by the ACHIEVE OST Award Pathway.

ACHIEVE OST is designed to increase the stability of the OST workforce, improve program quality by reducing staff turnover, and encourage the continued professional development of practitioners. The vast majority of practitioners who received an ACHIEVE OST award in a given year continued to work for the same program in the following year (see Figure 28).

Figure 28. Number of practitioners who continued working at, or left, their program the year after receiving an ACHIEVE OST award.

Figure 29. Number of ACHIEVE OST awards by QIS cycle.

Table 3. Total ACHIEVE OST awards by QIS cycle.

QIS Cycle   Total Awarded   Number of Awardees   Average Award Amount
2015-2016   $99,638         149                  $669
2016-2017   $177,888        235                  $757
2017-2018   $228,712        282                  $811
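The eligibility rules above can be summarized as a simple check. The sketch below is a hypothetical illustration of those criteria in code; the field names, the Pathway-requirements flag, and the function itself are assumptions for demonstration and do not represent Prime Time's actual award system (the full requirements are defined by the ACHIEVE OST Award Pathway).

```python
from dataclasses import dataclass

@dataclass
class Practitioner:
    hourly_wage: float
    months_at_program: int
    hours_per_week_with_youth: float
    pathway_requirements_met: bool  # training hours or coursework per the ACHIEVE OST Award Pathway

def is_achieve_eligible(p: Practitioner) -> bool:
    """Rough eligibility screen based on the criteria described above."""
    return (
        p.hourly_wage < 17.50
        and p.months_at_program >= 6
        and p.hours_per_week_with_youth >= 15
        and p.pathway_requirements_met
    )

# Example: a counselor earning $12.76/hour, employed 8 months, 20 hours/week with youth.
counselor = Practitioner(12.76, 8, 20, pathway_requirements_met=True)
print(is_achieve_eligible(counselor))  # True
```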


The number of ACHIEVE OST awards has steadily increased since the program was launched in 2015. Prior to 2015, Prime Time offered wage incentives through the WAGE$ program. Table 3 shows the amount awarded and the number of award recipients since 2015.

Of the more than 2,000 active OST practitioners in QIS programs who qualify each year, only 8-12% received a wage incentive, an average of one or two practitioners per program. Figure 29 shows the percent of practitioners who received ACHIEVE OST awards each year. Because turnover poses a barrier to quality improvement, expanding this service is recommended.

Scholarships

In addition to ACHIEVE OST, Prime Time offers scholarships for practitioners to support continued professional development. Practitioners can apply for scholarships to pursue certificate or degree programs or to attend national or state conferences, workshops, and seminars relevant to the OST field. Conferences enable practitioners to network with colleagues and share best practices. Practitioners who receive scholarships typically receive multiple awards per year, coinciding with coursework or conference attendance. Figure 30 shows both the total number of scholarships awarded each fiscal year (blue) and the number of recipients (green).

Scholarships can make all the difference for practitioners struggling to stay in the field. For Monica Herring (highlighted in the "Practitioner Spotlight" on p. 19), staying motivated to further advance herself in the field wasn't always easy. Monica experienced personal challenges while working in afterschool but utilized financial incentives to continue pursuing her goals. "I lost both parents while working in afterschool," she said, "so going through the process of being able to still go to school while trying to tend to them was hard. It was really comforting to know, okay I can apply for a scholarship, and then, at least that part I won't have to worry about."

Figure 30. Number of scholarships and recipients by fiscal year.

"The more you study, the more degrees you get, you'll get more money (ACHIEVE OST Incentive) and then you'll want to keep moving up and want to stay in the field." – Laurent Alvarez Gomez, director, Starlight Cove Elementary Afterschool Program


HELPING CHILDREN AND YOUTH SUCCEED

In the past decade, Prime Time has successfully strengthened and expanded quality in OST programs throughout the county. This mission is reinforced by research evidence demonstrating that high program quality contributes to positive outcomes for children and youth. In the past few years, Prime Time found evidence that high-quality programs in the Palm Beach County QIS promote positive outcomes for children and youth.

From 2015 through 2018, Prime Time increased its focus on SEL skill growth by inviting OST practitioners to assess the SEL skills of children and youth. Practitioners completed the 72-item Devereux Student Strengths Assessment (DESSA) in the fall and spring of 2015-2016 and 2016-2017. In 2017-2018, practitioners completed the 14-item Staff Rating of Youth Behavior (SRYB, short form) developed by the Weikart Center. Each practitioner assessed approximately 15 children, and two to three practitioners participated at each participating program.

Data Limitations

Between 480 and 730 matching assessments were obtained for children and youth each year (with the fewest in 2017-2018). Participation was limited, with only 2-3% of eligible practitioners (between 30 and 50) taking part each year. Fewer than 20% of programs (22 to 29) completed both pre- and post-test assessments. Most participating programs (in particular, those that participated in both the pre- and post-test phases) were more established in the QIS (members for more than six years) and of higher quality. Low participation in SEL assessments, under-representation of lower-quality sites, and, in particular, the lack of participation by newer programs make the influence of program quality on youth outcomes difficult to detect. However, modest findings suggest that high-quality instructional practices do play an important role in fueling positive growth in SEL skills.


Findings

Improvements in self-management, social awareness, decision-making, and personal responsibility skills between the fall of 2015 and spring of 2016 were more likely at programs that improved in quality during that time.23 Improvements were modest but statistically significant. A one-point increase in Instructional Total Scores (ITS) between the current and subsequent year was associated with an average increase in ratings on these scales between the fall and spring of 0.27 (on a scale from 1 to 5). Improvements in SEL skills also occurred for children assessed in 2016-2017; however, ITS scores did not significantly predict improvement over that time.

In 2017-2018, improvements in goal-striving mastery, a scale on the SRYB, were associated with higher Domain IV scores in the preceding year.24

At the highest-quality programs participating in SEL assessments, children were more likely to maintain strong social and emotional skills. According to pattern-centered analysis by QTurn, 71% of children sustained strong skills at high-quality programs compared to 55% at lower-quality programs (see Figure 31 and the supplemental technical report by Smith, Peck, Lindeman, & Smith, 2019).

Children and youth who began the year with poorly developed social and emotional skills tended to improve throughout the year at any of the programs participating in SEL assessments.25

Skills Targeted by Higher Quality

Because participating programs were of higher quality than non-participating programs, the "low" quality group in these comparisons is actually of moderate quality. Truly low-quality programs are rare in the QIS because programs improve substantially in just their first year. The main difference between moderate- and high-quality programs lies in scores on Domain IV, "Engagement." Not surprisingly, the primary skills associated with higher quality (higher Domain IV scores) are those associated with planning and reflection: decision-making, personal responsibility, and setting and pursuing goals. This aligns with findings from Prime Time's SEL assessments. Improvements in these skills were more likely at programs excelling in providing children and youth with opportunities to plan, reflect, and make choices.

Figure 31. Percent of Low-, Moderate-, and High-Risk Youth with SEL Skill Growth by Practice Quality.

23. Results were obtained using multilevel regression analysis, with practitioners at level two. The key predictor variable was change in ITS scores from the current to subsequent year.

24. According to a multilevel regression analysis (β = 0.56, p = 0.02).

25. Changes in scores for children rated low on pre-test assessments may also reflect, in part, a tendency for extreme ratings on a pre-test to be followed by post-test ratings closer to average (cf. regression to the mean).
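Footnotes 23 and 24 describe multilevel regressions with practitioners as the grouping level. The sketch below is a minimal, hypothetical illustration of that kind of random-intercept model using statsmodels' MixedLM formula interface; the simulated data, column names, and effect sizes are assumptions for demonstration only, not Prime Time's data or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_practitioners, children_per_practitioner = 40, 15

# Simulated child-level records nested within practitioners: each row has the
# program's change in ITS and the child's fall-to-spring change in SEL ratings.
rows = []
for prac_id in range(n_practitioners):
    its_change = rng.normal(0.2, 0.3)   # program-level quality change
    prac_effect = rng.normal(0, 0.15)   # practitioner-level random effect
    for _ in range(children_per_practitioner):
        sel_change = 0.27 * its_change + prac_effect + rng.normal(0, 0.4)
        rows.append({"practitioner": prac_id, "its_change": its_change, "sel_change": sel_change})
df = pd.DataFrame(rows)

# Random-intercept model: children (level 1) nested within practitioners (level 2).
model = smf.mixedlm("sel_change ~ its_change", data=df, groups=df["practitioner"]).fit()
print(model.summary())
```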


Staff Turnover and Children's Social Development

Perhaps one of the most influential factors in children's social and emotional development in OST programs is staff longevity, or how long staff remain employed at the same program. In a program with high turnover, children and youth may struggle to form stable relationships with practitioners, and the practitioners they do interact with are less likely to be trained and educated in youth development. Children rated by practitioners who reported that they knew the child well appeared to experience greater improvements in skills.26

In 2016-2017, practitioners rated the social awareness of children at their program as part of the SEL assessments. Social awareness is one of eight scales on the DESSA. At programs with low staff turnover, social awareness improved significantly, whereas programs with high turnover saw no improvement. Although this finding appears to link program quality with social awareness, closer examination reveals that new staff may rate social and emotional skills differently than staff who are not new to the program (or children may demonstrate fewer skills in the presence of new staff). Baseline (fall, pre-test) ratings of children's social awareness were significantly lower at programs with many new staff. By the spring, ratings of social awareness by staff who were new in the fall were nearly identical to the ratings by staff who had worked at the program in the previous year.

Social and Emotional Strengths of Practitioners

Adding to the above finding, Prime Time found that staff who reported feeling bored, tired, or upset as they completed SEL assessments rated children more negatively. For example, children rated by practitioners who reported feeling bored during assessments appeared to decline in skills by a full point on average (p = 0.0004). This apparent decline is likely due to the influence of boredom on practitioners' perceptions rather than actual changes in children's skills.

The mood of practitioners may influence how they perceive or remember the skills of their children. Another possibility is that recalling behaviors that reflect a need for stronger skills contributes to the mood of the staff as they complete the ratings. In either case, the perceived social and emotional development of children is connected to the social and emotional state of practitioners.

These findings highlight the importance of staff development for improving the SEL skills of children and youth. Strengthening the ability of practitioners to accurately perceive SEL skills and respond positively to children in need of further development is integral to program quality improvement. Although SEL assessments may yield powerful data for gauging Prime Time's indirect impact on children and youth, an important part of the assessment process is how practitioners learn to carefully notice and oversee the social and emotional development of children in their care.

Toward that end, Prime Time piloted a training series in 2019 designed to strengthen the social and emotional skills of OST practitioners, and quality advisors have begun to integrate strategies around SEL and SEL measurement into their coaching work. By integrating SEL measurement into the way practitioners approach youth work, future assessments may boost the success of SEL trainings and interventions and, as a bonus, further demonstrate the role of OST program quality in helping children and youth succeed in school and in life.

26. Results were obtained using a multilevel regression analysis. Practitioners indicated how well they knew each child, regardless of the length of time since first meeting the child. Responses were associated with an average increase of 0.18 in SEL skills over the course of the year (p = 0.02).


PRACTITIONER SPOTLIGHT

Jane Winters, site director at Beacon Cove Intermediate Afterschool Program, shared her experiences with Prime Time in a 2018 interview.

On motivation and passion for youth work:

"It doesn't matter what's going on in my personal life or in the universe when I get here, I just love being here. This is my second home, this is my passion, being with those kids every day getting hugs, getting love, and giving it back, there's nothing more important to me than that. My nickname is mom. I sign all my staff's birthday cards, love Jane aka Mom. I think it starts there.

"It's not just a babysitting service here; we don't just send the kids out to play. We enrich, we teach, we help them grow new skills, we spend a lot of time with the kids, sometimes more than their own families do. We took opportunities where we used to just play, and now we play with purpose."

The importance of staff camaraderie:

"We are an extended family amongst our staff in and of themselves, so that gives us a sense of camaraderie, it's not unusual for us to sit down and just have lunch and sit and talk, and that makes it a lot better for the kids, I think because they see that relationship amongst the staff, it gives them a sense of comfort, and I don't think they would have it if was just a bunch of people who really didn't get along."

Prime Time's role in supporting improvement:

"It has been, for me personally, I want to say 90% of the things I have learned that have made this program as exceptionally great as it is, I have gotten that foundation from Prime Time. If you're struggling, if you need help, if you want to provide opportunities for your kids that are more than just helping them with homework and then letting them go run wild outside, then go to the trainings, go to the meetings, get involved in the community and go all in."

"I've had two amazing mentors in my life, and they both came from Prime Time… Prime Time events for me are like coming home to family. I truly have found friendships and mentorships amongst the staff at Prime Time, and when you can bring that sense of coming home back to your program, I think that really helps a lot of personal growth."

Learning through professional development trainings:

"When you go to a training at Prime Time, and you can't wait to get back to your staff to introduce the ideas that you learned or the games that you played, just so you can get the same giggles that you had at the training, that's a very big part of it. I have been a part of Prime Time since we were dealing with the Weikart Center, I want to say it was 2005, when we initially were introduced to Prime Time."

"I think Prime Time has been a catalyst in helping this program grow in leaps and bounds over the years because it gave us the foundation and the knowledge through trainings, through events, through community things that we do as a result of the expanded learning opportunities (ELOs), all of those things have given us more of a foundation."


CONCLUSIONS

Building positive relationships with programs is one of the foundations of Prime Time's success. The trust and rapport that programs experience with Prime Time staff fuel high rates of program engagement and strong participation in a spectrum of integrated services that are completely voluntary yet critical for long-term growth. Quality advisors at Prime Time, often the first point of contact when programs join the system, help to connect program leaders and their staff with other departments, opening doors to professional development training opportunities, career supports, and networking events. As a result, participation in the QIS, and in the full variety of QIS services, leads to significant improvements in program quality.

After their first year in the QIS, programs tend to maintain their higher levels of quality. However, opportunities for further advancement remain. In particular, programs are encouraged to create more engaging environments in which youth can exercise responsibility, take initiative, make decisions, and set and achieve goals. These skills all involve children and youth taking a more active role in their personal and social experiences and development.

A key to improvement after the first year is developing and retaining skilled staff. One of the biggest challenges to raising program quality is staff turnover. The loss of a program director, for example, effectively sets a program back to "entry" level and can result in a dramatic dip in quality. While Prime Time provides professional development trainings nearly every two days during the school year, high turnover erases some of the gains that programs experience by participating. In addition, children and youth who need stable relationships with caring adults lose out when nearly one in three staff enter or leave the program each year. Prime Time's financial incentives are designed to help address this challenge. Together with other strategies to support OST practitioners as they grow and thrive as professionals, these incentives have helped Prime Time strengthen the OST field, creating more enriching environments for tens of thousands of children.

RECOMMENDATIONS

• Identify programs with low to moderate participation in services, or high turnover, and offer focused, intensified strategies around the following: relationship-building, program engagement, incentives, and encouragement to participate in a variety of trainings and events.

• Increase funding for wage incentives. At present, wage incentives reach 8-12% of staff (fewer than two practitioners per program, on average). A suggested three-year goal is to reach at least 25% of staff.

• Increase the number of contact hours between quality advisors and programs. Given the powerful role that relationships play in motivating improvement, ensure that Prime Time staff spend sufficient time engaged in personal, authentic communication with programs, particularly those that underutilize services.

• Strengthen the integration of Prime Time's services by expanding the trust and rapport that currently exist between programs and quality advisors to encompass staff from other departments within Prime Time. Support the role of quality advisors as liaisons rather than gatekeepers.

• Gauge program buy-in, motivation, and trust at regular intervals, and use these data for decision-making. In a system that is voluntary, richer information about the status of partnerships between Prime Time and programs may help to strengthen trust, fuel motivation, and inform customized strategies for keeping programs engaged and skilled staff in the field.

• Increase participation in SEL assessments, and build strategies for integrating SEL measurement into how practitioners approach youth work.


REFERENCES

Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. (2011). The impact of enhancing students' social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82, 405-432.

Durlak, J. A., Weissberg, R. P., & Pachan, M. (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology, 45, 294-309.

McCombs, J. S., Whitaker, A. A., & Yoo, P. Y. (2017). The value of out-of-school time programs. Santa Monica, CA: RAND. www.rand.org/pubs/perspectives/PE267.html

Nadeau, C. A. (2018). Living wage calculator: User's guide / technical notes - 2018 update. Cambridge, MA: Massachusetts Institute of Technology, Department of Urban Studies and Planning.

Smith, C., Akiva, T., Sugar, S. A., Lo, Y. J., Frank, K. A., Peck, S. C., Cortina, K., & Devaney, T. (2012). Continuous quality improvement in afterschool settings: Impact findings from the Youth Program Quality Intervention study. Washington, DC: Forum for Youth Investment.

Smith, C., & Hohmann, C. (2005). Full findings from the Youth PQA validation study. Ypsilanti, MI: High/Scope Educational Research Foundation.

Smith, C., McGovern, G., Peck, S. C., Larson, R. W., Hillaker, B., & Roy, L. (2016). Preparing youth to thrive: Methodology and findings from the social and emotional learning challenge. Washington, DC: Forum for Youth Investment.

Smith, C., Peck, S. C., Lindeman, L. M., & Smith, L. (2019). Supplemental impact evaluation for Palm Beach County quality improvement system (QIS) using fully pattern-centered analytics. Ypsilanti, MI: QTurn.

Smith, C., Peck, S. C., Roy, L., & Smith, L. (2019, April). Measure once, cut twice: Using data for continuous improvement and impact evaluation in education programs. In D. Darling (Chair), Digging deeper: Using mixed-methods research to assess promising practices in out-of-school time. Symposium conducted at the annual meeting of the American Educational Research Association, Toronto, ON.


APPENDIX

Instructional Quality Improvement Since Baseline

Instructional quality improvement since each program began its work with Prime Time can be visualized as moving forward or backward from a starting line. Changes since each program's first assessment are shown in Figures A1-A3. If a program demonstrated higher quality in the most recent assessment compared to the first assessment, the program is represented to the right of zero on the horizontal axis. The center line is a minimal threshold.

Figure A1 shows that many programs provided more opportunities for children and youth to plan, reflect, and make choices between their first year with Prime Time and their latest year. Every point to the right of zero represents a program that improved youth engagement. Programs to the left represent those that have struggled in this area. Of note, most are older programs that joined the QIS before 2012 and began with high scores. The next section offers an explanation for this finding.

Improvements in how programs support social interactions show a similar trend (see Figure A2). When programs scored low on their initial assessments, they were more likely to improve. This makes sense: programs beginning their journey with high quality have less room for measurable improvement.

For reference, Figure A3 shows changes in scores for Domains I and II between the first and most recent assessments. For these domains, improvement was generally not needed. Scores began high and either remained high or improved somewhat.

Table A1 shows the average scores for each domain on the PQA (as well as the overall score) given the number of years programs had been in the QIS. New programs, or those with zero years in the QIS, received a baseline assessment. Baseline assessments are shown in the first row. These reflect program quality before receiving one year of coaching, training, and other supports from Prime Time.

Figure A1. Changes in PBC-PQA Domain IV (Engagement) scores between the first (baseline) assessment and most recent assessment. Note: "Year Joined" refers to the first year of each QIS cycle (e.g., 2007 is 2007-2008).

Figure A2. Changes in PBC-PQA Domain III (Interaction) scores between the first (baseline) assessment and most recent assessment.

Figure A3. Changes in PBC-PQA Domains I (Safe Environment) and II (Supportive Environment) between the first (baseline) assessment and most recent assessment.


Growth Curve Analysis

Figure A4. Spaghetti plot showing Form A scores for all programs that joined the QIS after 2012 (when the PBC-PQA was revised). The trendline illustrates a quadratic trend, with the most improvement occurring in the first year.

Figure A5. Spaghetti plot showing Domain IV scores for the same programs. The quadratic trend is similar to Form A scores, but there is greater variation in the Domain IV scores (apparent by reference to the y-axis).

Table A1. PBC-PQA Form A domain scores (and overall Form A score) for programs at each year in the QIS.

Year in QIS    N of Programs   Overall   I      II     III    IV
0 (baseline)   190             3.76      4.89   4.21   3.42   2.50
1              174             4.01      4.94   4.39   3.71   2.94
2              156             4.07      4.96   4.44   3.81   3.05
3              128             4.00      4.97   4.36   3.68   3.03
4              119             4.08      4.97   4.43   3.87   3.06
5              115             4.10      4.99   4.45   3.85   3.10
6              106             4.08      4.98   4.39   3.92   3.06
7              100             4.10      4.97   4.39   3.91   3.14
8              74              4.14      4.97   4.44   3.93   3.28
9              63              4.16      4.99   4.47   3.95   3.32
10             44              4.19      4.97   4.45   4.02   3.36
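The quadratic trendline shown in Figures A4 and A5 can be reproduced with a simple polynomial fit. The sketch below is a minimal, hypothetical illustration using simulated program trajectories and statsmodels; the data-generating values are assumptions for demonstration only, not the actual PBC-PQA scores.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated Form A trajectories: fast first-year growth that levels off (a quadratic shape).
rows = []
for program in range(60):
    start = rng.normal(3.7, 0.3)
    for year in range(6):
        score = start + 0.30 * year - 0.04 * year**2 + rng.normal(0, 0.15)
        rows.append({"program": program, "year": year, "form_a": score})
df = pd.DataFrame(rows)

# Quadratic trend across all programs; a negative year**2 coefficient indicates
# that improvement is steepest in the first year and then slows.
trend = smf.ols("form_a ~ year + I(year**2)", data=df).fit()
print(trend.params)
```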


Changes in the Measurement Tool

Figure A6 shows changes in the quality of youth engagement across time for groups of programs based on when they joined the QIS (their baseline year). Programs that joined before 2013-2014 were assessed using an older version of the tool, which typically led to higher scores. (Similar trends are apparent for the other domains of the tool.)

Program quality, as it was measured, appeared to drop in 2013-2014. However, the modifications to the tool explain much of this shift. When the new and older versions of the PBC-PQA are examined separately, slightly different trends for the past ten years are revealed. Figure A7 illustrates change over time in program quality using the PBC-PQA ITS. The blue line represents average scores on the older version of the tool. Only programs that participated in the QIS before 2013 received an assessment on the older tool. The dark red line represents average scores on the new version of the tool for programs that were initially assessed using the older tool.

Practitioner Demographics

The following figures and tables provide a more comprehensive look at the characteristics of practitioners at QIS programs as of spring 2019.

Ninety percent of practitioners identified English as their primary language, and 12% identified English as their second language.27 Spanish was the primary language identified by 4% of practitioners, and 14% identified Spanish as their secondary language. For all other languages indicated in the Registry (i.e., Creole, French, Portuguese, Kanjobal, and "Other"), 2% of practitioners identified one as their primary language.

Figure A6. Average scores on Domain IV (Youth Engagement) by year assessed and year joined. Note: "Year Joined" and "Year Assessed" refer to the first year of each QIS cycle (e.g., 2007 is 2007-2008).

Figure A7. Instructional Total Scores (ITS) for the new and older versions of the PBC-PQA.

Figure A8. Age range of practitioners.

27. The high percentage of primary English speakers may reflect sampling bias as it relates to Registry membership.


Table A2. Average age by position.

Job Title                        Number of Practitioners   Average Age
Counselor in Training            27                        21.1
Counselor                        831                       28.6
Certified Counselor              138                       31.0
Activity Leader                  121                       33.7
Floater                          10                        34.9
Program Coordinator              64                        36.6
Assistant Director               54                        36.6
Other                            96                        36.7
Academic Advisor                 8                         40.6
Director or Executive Director   142                       43.0

Figure A9. Percent of men and women by position.

Table A3. Percent of practitioners by degree, certificate, or credential.

Table A4. Percent of practitioners with common certificates or credentials related to youth development.

Certificate or Credential                          Count   Percent
40-Hour Child Care Certification                   311     22%
40-Hour Afterschool Certification                  304     22%
YDCCC                                              188     13%
FL Staff Credential                                176     13%
AS-Human Services (Youth Development)              136     10%
Director Credential (Foundational)                 130     9%
SAPC                                               127     9%
Staff Credential (School-Age)                      87      6%
FCCPC                                              51      4%
PBC-AEC                                            44      3%
Director Credential (Level II/Advanced)            43      3%
BAS-Supervise/Manage (Youth Development)           36      3%
Director Credential Renewal (Level II/Advanced)    29      2%
CDA                                                27      2%
ECPC                                               15      1%
Director Credential Renewal (Foundational)         8       0.6%
Teacher's Certificate                              6       0.4%
Staff Credential Renewal                           1       0.1%


Table A5. Percent of practitioners who have attended the most popular trainings or events.

Training or Event                           Count   Percent of All Trainings   Percent of Practitioners
Professional Development Presentation       2910    9%                         50%
May Networking Event                        1995    6%                         34%
Planning and Reflection                     1876    6%                         32%
The Arc Workshop                            1855    6%                         32%
January Networking Event                    1579    5%                         27%
Academic Literacy Enrichment Initiative     1538    5%                         26%
Cooperative Learning                        1393    4%                         24%
STEAM Initiative Training                   1330    4%                         23%
Ask-Listen-Encourage                        1043    3%                         18%
Youth Voice                                 1020    3%                         18%
September Networking Event                  921     3%                         16%
Structure and Clear Limits                  919     3%                         16%
Reframing Conflict                          914     3%                         16%
Active Learning                             908     3%                         16%
Self Assessment I                           802     2%                         14%

Table A6. Improvement and maintenance in 2013-2017.

                                         2013    2014    2015    2016    2017
Total Number Programs                    121     123     143     146     147
Programs with Two or More Years Scores   115     115     120     136     142
Number At or Above 3.4                   112     115     139     142     144
Percent At or Above 3.4                  93%     94%     97%     97%     98%
Number At or Above 4.1                   40      47      70      76      82
Percent At or Above 4.1                  33%     38%     49%     52%     56%
Number Maintaining (At or Above 4.1)     30      26      31      46      52
Percent Maintaining                      26%     23%     26%     34%     37%
Number Improved                          36      74      70      81      81
Percent Improved                         31%     64%     58%     60%     57%
Average Improvement                      0.24    0.34    0.38    0.32    0.34
Number Improved Since Baseline           76      81      93      107     116
Percent Improved Since Baseline          66%     70%     78%     79%     82%
Average Improvement Since Baseline       0.37    0.45    0.48    0.48    0.51
Number Improved or Maintained            54      84      86      100     101
Percent Improved or Maintained           47%     73%     72%     74%     71%
Average Improvement (Under 4.1)          0.29    0.38    0.40    0.37    0.39


Table A7. Improvement and maintenance in 2007-2012 (involving the older version of the PBC-PQA).

                                         2007    2008    2009    2010    2011    2012
Total Number Programs                    63      89      91      114     115     117
Programs with Two or More Years Scores   0       60      81      84      111     111
Number At or Above 3.4                   57      86      88      108     112     117
Percent At or Above 3.4                  91%     97%     97%     95%     97%     100%
Number At or Above 4.1                   8       22      37      35      59      64
Percent At or Above 4.1                  13%     25%     41%     31%     51%     55%
Number Maintaining (At or Above 4.1)     0       4       11      21      27      41
Percent Maintaining                      -       7%      14%     25%     24%     37%
Number Improved                          0       42      54      45      77      61
Percent Improved                         -       70%     67%     54%     69%     55%
Average Improvement                      NA      0.37    0.31    0.23    0.34    0.28
Number Improved Since Baseline           0       42      62      64      88      96
Percent Improved Since Baseline          -       70%     77%     77%     79%     87%
Average Improvement Since Baseline       NA      0.37    0.41    0.39    0.48    0.46
Number Improved or Maintained            0       43      57      54      85      78
Percent Improved or Maintained           -       72%     70%     64%     77%     70%
Average Improvement (Under 4.1)          NA      0.38    0.33    0.26    0.36    0.34

Table A8. Total number of programs assessed for each year of their participation in the QIS.

Year in QIS    Assessed with Older Version   Assessed with New Version (Programs that Joined After 2012)   Assessed with New Version (Programs that Joined Before 2013)   Total
0 (baseline)   140                           50                                                             -                                                              190
1              126                           48                                                             -                                                              174
2              117                           39                                                             -                                                              156
3              83                            45                                                             -                                                              128
4              74                            45                                                             -                                                              119
5              49                            66                                                             -                                                              115
6              -                             -                                                              106                                                            106
7              -                             -                                                              100                                                            100
8              -                             -                                                              74                                                             74
9              -                             -                                                              63                                                             63
10             -                             -                                                              44                                                             44
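Tables A6 and A7 summarize year-over-year quality using the 3.4 and 4.1 thresholds. The sketch below is a minimal, hypothetical illustration of how summary rows like these could be computed from a long table of program scores; the column names, the tiny example data, and the exact metric definitions (which may differ from those used in this report) are assumptions for demonstration only.

```python
import pandas as pd

# Hypothetical long-format scores: one row per program per QIS cycle (Form A overall score).
scores = pd.DataFrame({
    "program": ["A", "A", "B", "B", "C", "C"],
    "year":    [2016, 2017, 2016, 2017, 2016, 2017],
    "form_a":  [3.9, 4.2, 4.2, 4.15, 3.3, 3.6],
})

def summarize(year: int, df: pd.DataFrame) -> dict:
    """Summary metrics for one year, loosely mirroring the row labels in Tables A6 and A7."""
    current = df[df["year"] == year].set_index("program")["form_a"]
    previous = df[df["year"] == year - 1].set_index("program")["form_a"]
    change = (current - previous).dropna()  # programs with scores in both years
    return {
        "Total Number Programs": len(current),
        "Percent At or Above 3.4": f"{(current >= 3.4).mean():.0%}",
        "Percent At or Above 4.1": f"{(current >= 4.1).mean():.0%}",
        "Percent Maintaining (At or Above 4.1)": f"{((current >= 4.1) & (previous >= 4.1)).mean():.0%}",
        "Percent Improved": f"{(change > 0).mean():.0%}",
        # Assumption: average improvement computed over programs that improved.
        "Average Improvement": round(change[change > 0].mean(), 2),
    }

print(summarize(2017, scores))
```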


2300 High Ridge Road, Suite 330
Boynton Beach, FL 33426
561-732-8066 ph
561-732-8094 fax
www.primetimepbc.org