Have you ever been to a conference or attended a webinar and left impressed with the program or strategy you just heard? Walked away saying “I want one of those for my community.” I know I did years ago.
Recently I heard this effect described as “plopping” — bringing new programs back and “plopping” them into your community, whether they are a good fit or not. Before launching a new program or strategy, consider the points below to help define, and act on, “what works.”
The following considerations can be beneficial for leaders who care about making an impact on the population or community of interest and need to practice good stewardship of the limited resources available. These ideas are especially important for decision-making bodies in a community that allocate funding or prioritize investments.
- Community indicator data: Do you know who the members of your community are? Age? Family status? Special needs? Where community members live, learn and work? How well are community members doing academically, emotionally, physically, civically? Is the information available at the level (e.g. ZIP code, county, census tract) needed for the targeted community?
- Community experiences and expectations: It is important to know the community that is the focus of the strategy or intervention. This includes bringing community members with lived experience into the planning and decision-making process early and often. A good way to recognize their investment in the process is to refer to them as “context experts” — in balance with those who see themselves as content experts — and to provide them with resources that enable them to participate fully and frequently, including support for gathering information and representing others from their community. Most of the other participants in this process are typically able to take part as part of their paid jobs.
- Research and science: Based on the issues or needs you’ve identified, do you have a handle on what the latest research says is important? In the past decade, there has been significant progress in understanding the continuing growth and development of the brain through the early 20s; these insights shed light on what works for older adolescents.
- Evidence-based program lists: A number of sources from the federal government, higher education and national not-for-profits provide information on programs that meet different levels of impact evidence. Usually the highest level comprises programs that have been evaluated through randomized controlled trials. Some of these lists also include information about practices (as opposed to programs), such as the U.S. Department of Education’s What Works Clearinghouse practice guides for education interventions. It is useful to remember that these lists reflect varying levels of evidence and that the outcomes a program impacts may be quite specific. Here are two sites that contain registries:
- Youth.gov: a set of registries from federal agencies that lists evidence-based programs and disseminates information about their level of effectiveness;
- Blueprints for Healthy Youth Development: a registry whose mission is to promote interventions that work.
- Return on investment and cost-benefit analysis: Judicious use of resources is important. In part, the focus of evidence-based decision making is to promote the efficient use of our limited resources and investments in strategies and interventions. One very good resource for cost-benefit analysis is the Washington State Institute for Public Policy, which has a mandate from state government to identify evidence-based policies and draw conclusions about what works. It covers a number of areas, including juvenile justice, health care, public health and prevention, and more.
- Implementation science: Yes, there is a science to implementation. The National Implementation Research Network has very useful tools such as the Hexagon Tool, which helps leaders ask some relevant questions about a potential new strategy or intervention before they use it. This helps avoid problems later (from my experience: a community job market not having enough qualified staff — social workers in one case or alcoholism counselors in another case — to implement the desired program). And it helps confront the issue of sustainability before you implement. Too often communities wait until partway through a grant to think about how to keep the effort going now that they know it works.
Look at programs + kernels
Using evidence to make decisions has value when done well. Evidence can include data, statistics, performance improvement, evaluation, social and behavioral sciences, community experiences and more.
There is no single template for this kind of decision-making. Two key ingredients for success: first, having discussions and ultimately building consensus on the criteria for evidence; and second, engaging critical stakeholders, including context experts.
No one program, however strong its evidence, will cleanly solve complex social problems. Complex problems require a comprehensive set of strategies, interventions, programs and practices. Many communities have faced challenges in trying to take evidence-based programs to scale, for a number of reasons. Evidence is also useful for improving existing programs or for discontinuing funding for programs that demonstrate no impact over time.
Before defaulting to “plopping” down a new evidence-based program in your community, take a step back and consider program components, practices and policies. Evidence-based programs are often an integration of services or programs (e.g. group therapy, social skills training) combined with some practices or “kernels” — active ingredients of existing programs. Evidence-based programs package these with protocols, standards, training and dosage so they can be repeated with fidelity.
There have now been efforts to unpack these evidence-based programs. One, the Standardized Program Evaluation Protocol (SPEP™), is a validated, data-driven rating scheme for determining how well an existing program or service matches the research evidence on the effectiveness of that particular type of intervention for reducing recidivism among juvenile offenders. Some of the generic program types identified in SPEP include mentoring, individual counseling and cognitive-behavioral therapy. With proper assessment of recidivism risk in juvenile justice, these more generic program types have proven successful when they have the right characteristics of program philosophy, program type, dose and quality of services. Many communities already have these programs in place.
And there are the “kernels,” the active ingredients in evidence-based programs and beyond. Looking at evidence-based social emotional learning (SEL) programs, Stephanie Jones and colleagues at the Harvard Graduate School of Education distilled the active kernels from 25 top SEL programs. These kernels (such as the “turtle technique”) can be applied across different settings — in classrooms, for example, but also on the playground and at home — to maximize their impact. Notably, these kernels also align with the practices that youth workers in exemplary youth programs identify as critical to impacting youth and maintaining program quality.
And teens express, in their own words, the importance of similar practices in their lives. Youth Communication provides powerful, teen-written stories and professional development to help educators and youth workers engage struggling youth.
When making decisions about actions that will work, it is important to include a policy focus, including decisions about how resources are allocated. It is not just government, at all levels, that makes policies and statutes, but also funders, school boards, businesses and organizations. Changing policies is often key to making more sweeping system changes.
Back to the beginning: As is often the case, there is no silver bullet or best mousetrap. That said, we now know more and have more lists, tools and methods to guide our decision-making. We are getting better at making sure context experts are at the table throughout the process, but there is still a long way to go on this point.
So be comfortable with a collaborative, transparent, deliberative process. You have permission not to pick just one program, or just the one that looks best. There are good programs that might not fit your community, your timing or your needs. Stay the course through implementation — if it works, you are more likely to sustain it.
Larry Pasti is a connector of ideas, leaders and places: He joined The Forum for Youth Investment in October 2008, having retired after many years at the New York State Office of Children and Family Services, where he was bureau director.