History and Evolution of Robert Bowne Foundation Support for Professional Development Evaluation

The Robert Bowne Foundation’s (RBF) support for promising youth literacy programs has always included a commitment to helping grantees improve their programs. Over time, such support has meant enriching programs as well as strengthening staff members’ ability to work effectively with participants. Even more, it has meant developing the capacity of grantee organizations both to reflect on program quality and to formally evaluate a program’s impact on participating youth.

For RBF grantees, this stance has translated into a generous range of supports: in-depth site visits, specialized technical assistance, an “open door” foundation policy, and the practice of linking programs to one another. In addition, RBF created and made available to all grantees a range of professional development offerings, from building and enriching youth reading practices to creating in-program libraries for participating youth, and from cross-program sharing of innovative practices to the establishment of reflective practice -- that is, learning to reflect on and assess the quality and effectiveness of one's own program.

RBF-supported consultants worked to engage practitioners in discovering:

  • How evaluation can support program design and delivery;
  • Ways to engage and train program staff in developing and implementing a program-specific evaluation;
  • Methods of collecting and reviewing evaluation data in order to identify program strengths and weaknesses; and
  • How to create program improvement plans based on such a program-specific analysis.

It was in the late 1990s that the RBF first offered formal professional development training focused on program evaluation. RBF program consultants, hired to work with specific grantees, met together for several years as the foundation’s Participatory Evaluation Group. Together, they learned about data collection, studied research, and shared information about what they were doing and learning at their sites. The group of 17 consultants -- including Anne Lawrence and Kim Sabo Flores -- discovered that involving program staff in the evaluation process made programs more likely to “own” the evaluation, value the data they were collecting, and continuously learn about and improve their programs.

In turn, the RBF offered grantee staff members the opportunity to take part in a Participatory Evaluation Institute. Each Institute aimed to build the capacity of out-of-school time (OST) programs by providing participating program staff with intensive evaluation training as well as on-site, one-on-one consultations.

The training explicitly aimed to engage program staff members and enable them to:

  • Better understand the uses and practices of evaluation;
  • Integrate evaluation and reflective practice into their work;
  • Use the evaluation process and its findings to improve programs; and
  • More accurately articulate their program models.

Between 2001 and 2007, more than 70 individuals from 42 organizations -- including executive directors and program staff members -- participated in such institutes. The institutes themselves ranged in length from an introductory seminar consisting of four half-day sessions to an 18-month basic training seminar followed by the formation and work of an Advanced Study Group.

According to a 2007 evaluation of the Participatory Evaluation Institutes [PDF4_Summer_2007_Evaluation_Institute], 42 programs completed evaluation plans and nearly half of those (18 programs) completed full evaluation reports. Across the initiative, all participants could name specific areas of personal as well as program growth, and more than 70 percent also reported agency growth as a result of Institute participation.

Moreover, based on their experiences with professional development, along with findings from Kim Sabo Flores’ study of quality out-of-school time programs [PDF9_Research-Report_2010_Dynamic_Framework], RBF staff and board members came to a critical insight: the strength and depth of an organization’s capacity is directly linked to its being a “learning organization” -- that is, an organization where practitioners have the opportunity and capacity to collect and examine data, reflect on their work, share successes and challenges, and make changes based on what they have learned.

In 2006, the RBF began offering networking meetings where OST program staff could talk with each other, sharing experiences and insights. Within this context, Lawrence considered ways to think and talk about evaluation. Ultimately, she decided to introduce what is possible to do within a program and to share how ongoing evaluation data can inform decisions such as choosing next steps with program youth or staff -- while also cautioning participants that learning about evaluation takes more than one workshop. In a January 2012 networking meeting, co-facilitator Suzanne Marten described the uses of evaluation:

Evaluations can be used for reporting out about programs... Evaluations can also be used internally to find how staff, youth, parents think, and to inform everyday decisions about the program.

In its final two years, RBF supported a cohort of grantee organizations, each with an established level of evaluation expertise, to form a learning community. Longtime RBF consultant Kim Sabo Flores then worked with the group, facilitating a two-phase Evaluation Capacity Building Initiative [PDF10_Snapshot_Evaluation_PD_Examples] aimed at addressing multiple goals, including:

  • Supporting eight participating agencies to improve evaluation practices;
  • Developing the expertise of individuals to lead evaluation efforts within their agencies;
  • Exploring the feasibility of creating evaluation tools to be used across programs; and
  • Offering the opportunity to participate in building and piloting these tools.

During the February 2014 orientation for the project, Anne Lawrence introduced the concept of a “learning community” as the key strategy for achieving such ambitious goals:

We believe that just by bringing people together and sharing experiences a lot can happen. Getting a chance to talk sometimes solves problems.

In addition, during a 2016 discussion of learning communities, Lawrence cited Melody Schneider’s article “Building a Community of Learners,” from an early-1990s issue of Literacy Assistance Update:

A community is a place where people come together for common reasons, a place where you feel safe and take risks. Power should be shared and knowledge valued. Dialogue, respect, and high levels of comfort and security are vital.

To create such a learning community, Kim Sabo Flores designed group activities meant to engage participants in developing a common language, communicating complex information, and building collegial relationships. Moreover, Lawrence and Sabo Flores created opportunities for directors -- as well as staff -- to participate in the learning community. They considered administrative buy-in crucial to winning recognition in the youth development field for any evaluation instrument developed through the project.

  • PowerPoint presentation to the Board: Robert Bowne Foundation: Evaluation of the Participatory Evaluation Institute, July 2007 [PDF4_Summer_2007_Evaluation_Institute]
  • 2010 research report: A Dynamic Framework for Understanding the Complex Work of Quality Out-of-School Time Programs, Kim Sabo Flores, Ph.D. [PDF9_Research-Report_2010_Dynamic_Framework]
  • Snapshot: Evaluation Capacity-Building Examples: Professional Development Sessions [PDF10_Snapshot_Evaluation_PD_Examples]
  • Participants in the Evaluation Capacity Building Project describe successes, challenges, and the impact on their programs at the Measuring What Matters Conference, September 30, 2015. The conference, hosted by the Robert Bowne Foundation and Youth, INC, celebrated grantees’ work to integrate evaluation into their organizations in partnership with Algorhythm.io. [ML9_EVC_MeasureWhatMatters_093016]