Precision Engineering

Creating sound bites that not only are memorable but also communicate a message is a passion of mine. Every now and then I hit one out of the park. “Problem-free isn’t fully prepared” is still in use after more than 15 years. I think “bringing precision to our passion” may be the next one to stick.

I’m excited. The paradigm shift associated with this new slogan may be as important as the initial idea that the focus of youth policies and programs must be on preparation, even for vulnerable youth.

Passion and precision are sometimes seen as opposites. The call for passion and precision is a call for us to blend them. Think head/heart; capacity/motivation. Our actions improve when they are informed by both.

Changing the odds for young people requires us to be not only passionate in our commitment to providing high-quality supports and opportunities to all youth, but also precise in measuring how well we and they are doing, so that we can make real-time system adjustments. It’s not enough to care. We have to calculate, compare and count.

It is also not enough to simply develop metrics for our own programs. Young people don’t grow up in programs; they grow up in communities. We must develop community-level dashboards whose clear measurements compel community leaders to think differently, so that together they can act differently.

Don’t we have these dashboards? Like Kids Count, and scores of report cards? They are simply thermometers, not advanced diagnostic tools. We need integrated quality improvement systems that not only store data, but parse it at levels normally found in evaluations. In order to achieve our goals, we need more than gross annual numbers on youth participation and youth outcomes. We need to be able to look for patterns across populations, sort by neighborhood, create profiles and test hypotheses. We need all the data to be linked to individual children.

Is this possible? Absolutely. Cities like Chicago and Providence, R.I., with help from the Wallace Funds, use program participation tracking software that stores basic information about youth (such as age, gender and address) and youth organizations (including size, type, location and program offerings). This lets the cities generate profiles broken down by youth behavior (what programs they attend, which activities they engage in), demographics (the attendance patterns of high schoolers or of youths in one catchment area) and community supports (how much art instruction is available, in what neighborhoods, for what age groups).
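To make that concrete, here is a minimal sketch, in Python with pandas, of the kind of profile queries such participation-tracking systems make possible. The table layouts, column names and numbers are illustrative assumptions for this example, not the schema of any software the cities actually use.

```python
import pandas as pd

# Hypothetical enrollment records: one row per youth per program.
enrollments = pd.DataFrame({
    "youth_id":          [1, 1, 2, 3, 3],
    "age":               [15, 15, 12, 16, 16],
    "neighborhood":      ["Southside", "Southside", "Northside", "Westside", "Westside"],
    "program_id":        ["A", "B", "A", "C", "B"],
    "sessions_attended": [20, 5, 18, 30, 12],
})

# Hypothetical program records: what each program offers and where.
programs = pd.DataFrame({
    "program_id":   ["A", "B", "C"],
    "activity":     ["art", "sports", "art"],
    "neighborhood": ["Southside", "Northside", "Westside"],
})

# Link youth records to program records on a shared program_id.
linked = enrollments.merge(programs, on="program_id",
                           suffixes=("_youth", "_program"))

# Behavior profile: which activities youth from each neighborhood attend.
attendance_profile = (
    linked.groupby(["neighborhood_youth", "activity"])["sessions_attended"]
          .sum()
          .unstack(fill_value=0)
)

# Supply profile: how much art instruction reaches which age groups,
# in which neighborhoods.
linked["age_band"] = pd.cut(linked["age"], bins=[10, 13, 18],
                            labels=["11-13", "14-18"])
art_reach = (
    linked[linked["activity"] == "art"]
          .groupby(["neighborhood_program", "age_band"], observed=True)["youth_id"]
          .nunique()
)

print(attendance_profile)
print(art_reach)
```

Run against real linked records rather than these invented rows, the same few lines would answer the questions above: attendance patterns by catchment area, and where art instruction is, and is not, reaching particular age groups.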

Imagine the questions we could ask if we added to this the typical youth data about school attendance, reading scores and summer employment, along with information about public resources, like funding levels.
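As one hedged illustration of those questions, the sketch below assumes the tracker’s per-youth participation totals could be joined to school records on a shared youth_id; every table, column and number here is invented for the example.

```python
import pandas as pd

# Hypothetical per-youth totals exported from the participation tracker.
participation = pd.DataFrame({
    "youth_id":       [1, 2, 3, 4],
    "neighborhood":   ["Southside", "Northside", "Westside", "Southside"],
    "total_sessions": [25, 18, 42, 3],
})

# Hypothetical school records for the same youth.
school = pd.DataFrame({
    "youth_id":          [1, 2, 3, 4],
    "school_attendance": [0.92, 0.78, 0.85, 0.70],  # share of days attended
    "reading_score":     [640, 510, 580, 495],
})

# Link the two data sources at the level of the individual child.
combined = participation.merge(school, on="youth_id")

# One question the linked data can answer: do heavy after-school participants
# show different reading scores than light participants, by neighborhood?
combined["heavy_participant"] = combined["total_sessions"] >= 20
profile = (
    combined.groupby(["neighborhood", "heavy_participant"])["reading_score"]
            .mean()
            .unstack()
)
print(profile)
```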

Isn’t this prohibitively expensive? Not really. There are up-front costs for developing the system and installing the software in the places and programs where kids spend their time. But these are outweighed by the long-term benefits of improving our ability to match program offerings with participation patterns in order to increase overall reach.

My early discussions with business leaders suggest that these are the kinds of investments they would be willing to make if they were convinced that community leaders would use the data to change the ways they do business.

Isn’t this risky? Sure. Programs that know they are not doing well – because of low attendance, high turnover, poor quality or high expenses – will get identified. But because this is a performance system, not a witch hunt, cities will recognize high-quality programs and support program improvement. If recent research findings are borne out, this exposure will strengthen, not damage, the case for greater investment in youth development.

Meta-analyses conducted by Joe Durlak and Roger Weissberg in 2006 found that, other things being equal, young people participating in high-quality community programs had higher academic and social-emotional skills than did those in a control group. Those participating in low-quality programs, however, showed no gains. Quality was related not to program content, but to program composition. Successful programs, according to these researchers, are “SAFE” – sequenced, active, focused and explicit – about the skills they want to produce.

Consider this scenario: Two families encourage their children to attend school and participate in an after-school youth program. Both youths participate at equal rates. But one attends a high-performing school and a high-quality youth program, while the other attends a low-performing school and a low-quality youth program. There we have equal youth effort, but unequal youth outcomes.

We know which kids get the short end of the stick year after year, falling further behind. Technology makes it possible to calculate the pitiful returns they and their families are getting on their personal investments in schools and programs that literally aren’t worth their time.

How can we not want this data? How could it not impel us to greater outrage and action, to both precision and passion?
