
Guiding Disconnected Youth to Academic Success

Program: Community Education Pathways to Success (CEPS).

Organization: Youth Development Institute (YDI); unidentified community-based organizations at 10 New York City sites.

Types of Evaluation: This is an evaluation of the CEPS model, not of the 10 program sites. The evaluation methodology included reviews of site documentation, participant focus groups, participant surveys, on-site observations, and interviews with CEPS instructors, directors, counselors, trainers and advocates. Participants’ academic skills were tested using the Test of Adult Basic Education (TABE).

Sample Size: 443 students total.

Evaluation Period: Three cohorts were included in the evaluation. Three sites began during the 2005-06 school year (Cohort 1), three during the 2006-07 year (Cohort 2) and four during the 2007-08 year (Cohort 3).

Cost: Each of the 10 CEPS sites receives $35,000 a year in funding from YDI. Most of the cost of the CEPS program is paid by the sites themselves. YDI declined to provide the cost of the evaluation.

Funded By: Youth Development Institute.

Contact: Patricia B. Campbell, Campbell-Kibler Associates, (978) 448-5402, www.campbell-kibler.com.


Community Education Pathways to Success (CEPS) is a citywide effort initiated in 2005 by the New York City-based Youth Development Institute (YDI) in partnership with community-based organizations (CBOs) and America’s Choice, a national for-profit leader in the educational standards movement.

The CEPS model focuses on improving the academic skills of out-of-school 16- to 24-year-olds who are ineligible for GED preparation due to low literacy and math proficiency, and on strengthening the capacity of CBOs to deliver high-quality developmental, support and educational services to youth.

Core Components

At the core of the CEPS model are structured components that include:

High-quality literacy and math instruction: The CEPS sites use America’s Choice Ramp-Up, a yearlong, school-based literacy curriculum for struggling students that provides instructors with detailed lesson plans, assignments, key concepts and classroom rituals.

Communication and collaboration: Staff members meet regularly to discuss students and to develop student support strategies.

A “Primary Person” approach: Each youth is assigned to an adult staff member who acts as a “go to” person for guidance, support and/or referrals.

Collection and use of data: Student progress is assessed periodically using the Test of Adult Basic Education (TABE). TABE scores are reported as grade equivalents of the student’s demonstrated skills. All students take the TABE upon entry into the program; the test is then administered at approximately four-month intervals during the academic year. In addition, student demographic and personal data are gathered and regularly reviewed to determine students’ counseling and social service needs.

Training and technical assistance: YDI provides training and technical assistance to CEPS teams on America’s Choice curricula and youth development concepts through on-site coaching (3½ days per month), regular meetings with CBO staff members and cross-site networking opportunities.

Across these components, the CEPS model incorporates seven “core” youth development practices: safety, caring adult relationships, high expectations, youth voice and participation, mastery and competence, engaging and meaningful activities, and continuity.

Findings

During the 2007-08 year, 18 percent of CEPS students entered GED programs, with a range among sites of 5 percent to 45 percent. The average retention rate (defined as being in the program long enough to take the TABE more than once) among the sites was 54 percent, ranging from 29 percent to 95 percent. The average literacy gain was 1.5 grade equivalents, while the math gain averaged 0.9 grade equivalent.

Such numbers should be viewed as interconnected when defining success, said Patricia B. Campbell, president of Campbell-Kibler Associates, who conducted the evaluation with colleague Jennifer Weisman. “Limiting success to gains in literacy scores, and not looking at how long it takes to make the gain, or what the retention rates are, does everyone a disservice,” she said.

At the same time, when using both retention and grade-equivalent gains as indicators of success, it’s important to remember that the two can pull against each other: as a program retains more students, the scores of those additional students may pull down the overall average gain.

“If you’re starting with 40 kids, and you finish with 10 kids, those 10 kids are probably the ones who are doing OK,” Campbell said. “If you end with 35 kids, those extra 25 kids are probably not the stars.”
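Campbell’s point can be made concrete with a quick back-of-the-envelope calculation. The sketch below uses invented gain figures, not CEPS data, to show how a program that retains more students can report a lower average gain than one that keeps only its strongest students.

```python
# Hypothetical illustration of how retention can depress average gains.
# The grade-equivalent gain values below are invented for this sketch;
# they are not drawn from the CEPS evaluation.

def average_gain(gains):
    """Mean grade-equivalent gain among students retained long enough to be retested."""
    return sum(gains) / len(gains)

# Scenario A: 40 students enroll, only 10 stay -- likely the strongest students.
strong_stayers = [1.8, 1.6, 1.5, 1.7, 1.4, 1.9, 1.5, 1.6, 1.8, 1.7]

# Scenario B: 35 of the 40 stay -- the extra 25 include more struggling students.
weaker_stayers = strong_stayers + [0.6, 0.8, 0.5, 0.7, 0.9] * 5

print(f"Low retention (10 of 40):  avg gain = {average_gain(strong_stayers):.2f}")
print(f"High retention (35 of 40): avg gain = {average_gain(weaker_stayers):.2f}")
# The high-retention program serves far more youth yet reports a lower average
# gain -- which is why Campbell argues retention and score gains must be read together.
```

Under these made-up numbers, the low-retention program shows an average gain of about 1.65 grade equivalents while the high-retention program shows about 0.97, even though the second program helped many more students improve.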

Lessons

The key lessons are to be found in tying site activities to outcomes, Campbell notes in her evaluation report. Those include:

Implementing all the pieces. When programs pick and choose which of the model’s components to implement and which to discard, the integrity of the model is compromised. “They shouldn’t expect to get the same results as when you implement the whole model,” Campbell said of those programs that strayed.

Implementation of the model varied widely across the 10 sites. For example, although the sites averaged 11¼ hours of literacy instruction per week, individual sites offered between 4½ and 24 hours per week. The sites averaged only 6.3 hours of math instruction, with individual sites offering between three and 16 hours per week.

At some sites, one staff member’s full-time job was to act as the primary contact person for all students. At other sites, several staff members each acted as the primary person for smaller groups of students. And while all of the sites used Ramp-Up, only two used America’s Choice Math Navigator.

As the sites implemented more of the model’s components, their results improved. For example, while all sites had informal communication about youths (e.g., staff members discussing particular students over lunch), the sites that moved to the more formalized communication laid out in the model saw increases in retention rates. Campbell said that was due to “the quality of the information exchanged, for one, because you’re better prepared, but also the intentionality of covering the students, so it’s much harder for a student to slip through the cracks.”

Experience makes a difference. “There really is a learning curve,” Campbell said. “The work you do the first year, you shouldn’t expect to be as good as it’s going to get. The idea of trying to do it all without learning from what you’ve done the previous year can get you into trouble.”

The four sites with the lowest student retention rates at the time of the final evaluation were the sites that began last. Over time, the six older sites all showed increases in retention rates.

Curriculum and instructors matter. The evaluation report contains several examples of how instructor knowledge affected organization, program flexibility and adaptability within the context of the model, and of how staff members’ buy-in to the CEPS model affected student enthusiasm and retention.

There is a measurable impact from having a structured curriculum supported by training and technical assistance specific to that curriculum, Campbell said. She noted that in seven of the eight sites, the youths’ gains in literacy scores were greater than their gains in math scores. She attributed that to the fact that all the sites used the structured literacy curriculum (Ramp-Up), with its strong technical assistance, while only two used the math curriculum.

“If you have all of the [model’s] supports … you certainly get gains, but in most cases, the gains are not as good as if you have all of the supports and you have the curriculum,” Campbell said.
