History & Evolution of Robert Bowne Foundation Support for Evaluation Professional Development


The Robert Bowne Foundation’s (RBF) support for promising programs has always included a commitment to help improve grantees’ organizations. This has meant developing organizations’ capacity to reflect on program quality as well as to evaluate a program’s impact on participating youth.

The Foundation’s stance led to a generous range of supports, including in-depth site visits, specialized technical assistance, an “open door” policy, networking opportunities that linked programs to one another, and the establishment of reflective practice: learning to reflect on and assess the quality and effectiveness of one’s own program.

RBF-supported consultants worked to engage practitioners in discovering:

  • How evaluation can support program design and delivery;
  • Ways to engage and train program staff in developing and implementing a program-specific evaluation;
  • Methods of collecting and reviewing evaluation data in order to identify program strengths and weaknesses; and
  • How to create program improvement plans based on such a program-specific analysis.

In the late 1990s, RBF first offered formal professional development training focused on program evaluation. RBF program consultants, hired to work with grantees, met for several years as the Participatory Evaluation Group. Together, they learned about data collection, studied research, and shared information about what they were doing at their sites. The group of 17 consultants (including Anne Lawrence and Kim Sabo Flores) discovered that involving program staff in the evaluation process made programs more likely to “own” the evaluation, value the data they were collecting, learn continuously about their programs, and improve them.

In turn, RBF offered grantee staff members the opportunity to take part in a Participatory Evaluation Institute. Each Institute aimed to build the capacity of out-of-school time (OST) programs by providing staff with intensive evaluation training as well as on-site, one-on-one consultations.

The training aimed to engage and enable program staff members to:

  • Better understand the uses and practices of evaluation;
  • Integrate evaluation and reflective practice into their work;
  • Use the evaluation process and its findings to improve programs; and
  • More accurately articulate their program models.

Between 2001 and 2007, more than 70 individuals from 42 organizations – including executive directors and program staff members – participated in these institutes. The institutes ranged in length from an introductory seminar consisting of four half-day sessions to an 18-month basic training seminar followed by the formation and work of an Advanced Study Group.

According to a 2007 evaluation of the Participatory Evaluation Institutes, 42 programs completed evaluation plans, and nearly half of those (18 programs) completed full evaluation reports. Across the initiative, all participants could name specific areas of personal as well as program growth, and more than 70 percent also reported agency growth as a result of Institute participation.

Moreover, based on their experiences with professional development, along with findings from Kim Sabo Flores’ study of quality out-of-school time programs, RBF staff and board members came to a critical insight: they became convinced that the strength and depth of an organization’s capacity is directly linked to its being a “learning organization.” That is, an organization where practitioners have the opportunity and capacity to collect and examine data, reflect on their work, share successes and challenges, and make changes based on what they have learned.

Suzanne Marten, Executive Director of the Center for Educational Options, described the uses of evaluation:

“Evaluations can be used for reporting out about programs…Evaluations can also be used internally to find how staff, youth, parents think, and to inform everyday decisions about the program.”

In its final two years, RBF supported a cohort of grantee organizations, each with an established level of evaluation expertise, to form a learning community. Longtime RBF consultant Kim Sabo Flores then worked with the group, facilitating a two-phase Evaluation Capacity Building Initiative aimed at addressing multiple goals, including:

  • Supporting eight participating agencies to improve evaluation practices;
  • Developing the expertise of individuals to lead evaluation efforts within their agencies;
  • Exploring the feasibility of creating evaluation tools to be used across programs; and
  • Offering the opportunity to participate in building and piloting these tools.

Kim Sabo Flores designed group activities meant to create a learning community and to engage participants in developing a common language, communicating complex information, and building collegial relationships. Moreover, Lawrence and Sabo Flores created opportunities for administrators as well as staff to participate in the learning community, as administrative buy-in is crucial to the successful implementation of any evaluation instrument developed through the project.

