Evaluation as a Field-Building Strategy


In 1992, the Robert Bowne Foundation (RBF) began offering its grantees formal support in the development and use of program evaluation as a way to strengthen and enhance their afterschool programs.  From the start, the RBF viewed evaluation as a tool for transforming the field of afterschool education.

In the late 1990s, for example, RBF’s ambitiously titled Re-Imagining the Afterschool Program Initiative sought to make program evaluation an integral part of organizational capacity, thereby:

  • Strengthening the education and literacy components of grantee agencies’ afterschool programs;
  • Stabilizing the funding base of each agency as well as enhancing its organizational capacity; and
  • Encouraging each agency to conduct regular self-assessments and program planning efforts, supported by staff development trainings and other resources.

It was RBF Executive Director Dianne Kangisser who grasped that involving an evaluator in a program helped in program planning and development for the following year.  She became particularly interested in Participatory Evaluation, recognizing that program planning and program improvement are best served when program staff are involved in both the design and implementation of evaluation efforts.

The idea of continual assessment — and thereby, continual learning from experience — is at the core of the Foundation’s philosophy, practice, and approach to grantees.  Such a stance, bolstered by the aim of transforming the field of afterschool education itself, has meant a continuously evolving conceptualization and practice of program evaluation, as well as ongoing utilization of findings.

In 1998, the Foundation sponsored a Participatory Evaluation Institute with the aim of expanding the pool of program evaluators who were both attuned to small non-profit organizations and supportive of a participatory approach to program evaluation.  The Summer Institute, led by Anita Baker, provided attendees with advanced participatory evaluation training. After the Institute concluded, those participants working with RBF grantees continued meeting as the “Participatory Evaluation Group,” with the aim of boosting the understanding and capacity of professional evaluators to work with, analyze, and support the development of community-based OST programs. Anne Lawrence and Kim Sabo Flores were among the professional evaluators participating in the training program.

Foundation staff quickly recognized, however, that any benefits of the participatory evaluation process relied on the program staff who actually implemented it. As a result, the Foundation created a Participatory Evaluation Institute to build the evaluation skills of grantees’ OST program staff.  The Institute ran from 2001 to 2007.

A 2007 report titled Robert Bowne Foundation: Evaluation of the Participatory Evaluation Institute stressed the relationship between the approach to evaluation and the Bowne vision:

“[The Robert Bowne Foundation] chose Participatory Evaluation (PE) as their primary method for building their grantees’ evaluation capacities because it seemed like an ideal fit with RBF’s overall philosophy of ongoing organizational learning and improvement …

It is a highly useful approach because it:

  • Reflects practitioners’ underlying assumptions and questions about their programs;
  • Contributes to practitioners’ knowledge about and pleasure in their work;
  • Supports the process of integrating evaluation strategies into everyday practice; and
  • Promotes ongoing learning.”

Over years of experimentation and reflection, the RBF developed a deep, field-based recognition of the importance of evaluation both for program improvement and for accountability. Moreover, in formulating and utilizing evaluation processes, grantee organizations develop grounded theories of change, ask relevant and pressing questions, alter programs based on evaluation results, heighten program staffs’ consciousness of and connection to the mission of their organizations, and deepen staff discussions of their work both among themselves and with funders.

To further the goal of building the field, the RBF also aimed to engage the funding community in deepening its own understanding of the role of evaluation in program development and improvement. The RBF offered a five-month intensive training institute for funders of afterschool and youth programs, covering topics such as evaluation planning, data collection, and data analysis methodologies. The RBF also organized a Funder Conference in June 2004 and then initiated a follow-up Funders Study Group on Participatory Evaluation.

Still, many challenges remained, including: gaps between funders’ expectations of an evaluation, on the one hand, and the usefulness of findings for program and organizational staffs, on the other; lack of time or resources for analyzing and making meaning from the collected data; and insufficient funding for OST programs both to operate quality programs and to evaluate them. Thus, for example, many RBF grantees were successful at collecting a great deal of data and, based on those data, having good discussions; they were struggling, however, to make such information and insights available in a timely and practical fashion so programs could actually use them in making next-step decisions.

As early as 1998, Dianne Kangisser, then executive director of the RBF, was exploring the idea of “cross-site indicators” and of identifying a valid youth development evaluation tool that could be used across the field.

Anne Lawrence continued the Foundation’s interest in program evaluation, engaging consultant Kim Sabo Flores to work with the Foundation over the years both to develop its own evaluation system and to provide training for interested grantees. In 2014, Anne collaborated with Kim Sabo Flores, by then of Algorhythm, Inc., to work with program staff in developing a research-based tool, employed by programs themselves, for evaluating OST programs. The collaboration is another instance of the Foundation’s efforts to develop the field of OST education while supporting grantees in building their own capacity and sustainability.

During an 18-month period in 2014–15, a cohort of eight RBF grantee organizations worked with Sabo Flores and Algorhythm, Inc., to develop a common tool for measuring participation outcomes of enrollees in youth development programs. The tool can now be used by OST programs across the country.  In addition, the effort has had another immediate impact: the commitment of many RBF participants to advocate with both funders and policy makers for program evaluation and for program practices that promote positive youth development.

In 2015, as one of its final activities before closing, the RBF co-sponsored a conference, Measuring What Matters, with Youth Inc. and Algorhythm for funders and others, in which organizations that participated in RBF’s Evaluation Capacity Building Initiative shared their experiences in developing and implementing a cross-site youth development evaluation system.
